1 00:00:05,080 --> 00:00:08,559 Speaker 1: We know so much about biology nowadays, so why is 2 00:00:08,600 --> 00:00:12,920 Speaker 1: it currently so difficult to figure out how to make 3 00:00:13,000 --> 00:00:16,680 Speaker 1: a person live longer? I mean, we know how to 4 00:00:16,720 --> 00:00:19,840 Speaker 1: do this in worms and often in mice, but why 5 00:00:19,920 --> 00:00:24,919 Speaker 1: is the study of extending lifespans so hard in humans? 6 00:00:25,600 --> 00:00:28,400 Speaker 1: Why are there so few companies studying this? And what 7 00:00:28,560 --> 00:00:32,360 Speaker 1: is the future of longevity research and what would it 8 00:00:32,400 --> 00:00:39,720 Speaker 1: be like to live a much longer life? Welcome to 9 00:00:39,760 --> 00:00:43,199 Speaker 1: Inner Cosmos with me David Eagleman. I'm a neuroscientist and 10 00:00:43,240 --> 00:00:46,400 Speaker 1: an author at Stanford, and in these episodes we sail 11 00:00:46,560 --> 00:00:50,040 Speaker 1: deeply into our three pound universe to uncover some of 12 00:00:50,080 --> 00:01:04,480 Speaker 1: the most surprising aspects of our lives. Today's episode reaches 13 00:01:04,520 --> 00:01:09,520 Speaker 1: beyond neurobiology to our biology more generally, and specifically about 14 00:01:09,560 --> 00:01:13,200 Speaker 1: whether we could live a lot longer if we just 15 00:01:13,360 --> 00:01:17,280 Speaker 1: understood which of the billions of tiny molecular signals in 16 00:01:17,319 --> 00:01:21,560 Speaker 1: a cell mattered for the aging process, and which ones 17 00:01:21,600 --> 00:01:24,639 Speaker 1: we could grab a hold of and tweak, and how 18 00:01:24,800 --> 00:01:28,279 Speaker 1: the whole network might shift in a way that keeps 19 00:01:28,360 --> 00:01:33,479 Speaker 1: everything young and optimized.
It's often said that the only 20 00:01:33,520 --> 00:01:38,039 Speaker 1: two certainties in life are death and taxes, but it 21 00:01:38,120 --> 00:01:41,360 Speaker 1: turns out the third certainty is that people will put 22 00:01:41,400 --> 00:01:44,800 Speaker 1: in a lot of work to avoid those two things. 23 00:01:45,360 --> 00:01:49,520 Speaker 1: So today's episode is about the endeavor of living longer. 24 00:01:49,640 --> 00:01:54,200 Speaker 1: This is not about immortality, in other words, never dying. Instead, 25 00:01:54,200 --> 00:02:00,840 Speaker 1: this is about longevity, increasing your lifespan. So to introduce 26 00:02:00,840 --> 00:02:03,800 Speaker 1: today's topic, I'm going to read a short story that 27 00:02:03,840 --> 00:02:07,880 Speaker 1: I wrote some years ago and originally read on BBC Radio. 28 00:02:08,120 --> 00:02:10,640 Speaker 1: I have been asked to speak here at the funeral 29 00:02:11,360 --> 00:02:14,520 Speaker 1: of a one hundred and twenty two year old. As 30 00:02:14,560 --> 00:02:17,000 Speaker 1: many of you know, I have been asked to speak 31 00:02:17,080 --> 00:02:20,680 Speaker 1: not only because of my expertise in the history of aging, 32 00:02:21,200 --> 00:02:25,400 Speaker 1: but also because she is a distant relative. She is 33 00:02:25,480 --> 00:02:31,520 Speaker 1: my great great great great granddaughter. Her death is tragic 34 00:02:31,600 --> 00:02:34,760 Speaker 1: to us, not only because she has died so young, 35 00:02:35,360 --> 00:02:38,359 Speaker 1: but because she was haunted her whole life by an 36 00:02:38,400 --> 00:02:42,359 Speaker 1: incurable blood disorder and was well aware that she would 37 00:02:42,400 --> 00:02:45,639 Speaker 1: not have a long life. 
I have been asked by 38 00:02:45,639 --> 00:02:49,359 Speaker 1: her parents to perhaps ease the tragedy of her young 39 00:02:49,440 --> 00:02:53,680 Speaker 1: death by thinking back to the time when a life 40 00:02:53,680 --> 00:02:56,799 Speaker 1: span of one hundred and twenty two years was not 41 00:02:56,919 --> 00:03:02,000 Speaker 1: considered short but unusually long. At that time, not so 42 00:03:02,160 --> 00:03:06,200 Speaker 1: long ago, it was not typical for people to earn 43 00:03:06,320 --> 00:03:11,000 Speaker 1: multiple PhDs or to be citizens of multiple countries in 44 00:03:11,080 --> 00:03:16,480 Speaker 1: their brief twinklings of a lifetime. They tended to mature faster, 45 00:03:17,120 --> 00:03:20,359 Speaker 1: moving out of the house in their teens or twenties, 46 00:03:20,800 --> 00:03:24,920 Speaker 1: having families of their own by their thirties, retiring in 47 00:03:25,000 --> 00:03:29,640 Speaker 1: their sixties, and dying shortly thereafter, with only enough time 48 00:03:30,120 --> 00:03:34,480 Speaker 1: for one career and one family. All things ran on 49 00:03:34,720 --> 00:03:40,600 Speaker 1: an accelerated schedule. In each generation, they had to relearn politics. 50 00:03:40,640 --> 00:03:45,560 Speaker 1: They had not experienced the patterns before. Everything seemed new, 51 00:03:45,880 --> 00:03:51,320 Speaker 1: every message of progress seemed inspired. Of course, this had 52 00:03:51,360 --> 00:03:56,200 Speaker 1: one advantage. People did not burn with vendettas that belonged 53 00:03:56,320 --> 00:04:01,440 Speaker 1: to distant times. Instead, they died quick, and their children 54 00:04:01,720 --> 00:04:07,280 Speaker 1: held different attitudes. Politics could take sharper turns. 
We all 55 00:04:07,360 --> 00:04:10,880 Speaker 1: know what it's like to spot a forgotten lover a 56 00:04:10,960 --> 00:04:15,120 Speaker 1: century later, but imagine what it was like to never 57 00:04:15,200 --> 00:04:20,920 Speaker 1: achieve any meaningful temporal distance from your past choices. Everything 58 00:04:21,000 --> 00:04:25,560 Speaker 1: you've done is there at your heels to haunt you. Obviously, 59 00:04:25,960 --> 00:04:30,880 Speaker 1: our mistakes make life educational, but as we know, getting 60 00:04:31,000 --> 00:04:35,760 Speaker 1: second chances allows us to endure it. People with rapid 61 00:04:35,839 --> 00:04:40,359 Speaker 1: life spans felt quite close to their recent ancestors because 62 00:04:40,400 --> 00:04:44,280 Speaker 1: they only had four of them alive at best. We 63 00:04:44,440 --> 00:04:48,800 Speaker 1: typically have five hundred and twelve great great great great 64 00:04:48,839 --> 00:04:54,240 Speaker 1: great grandparents alive, enough to fill a good sized lecture hall. Therefore, 65 00:04:54,680 --> 00:05:00,440 Speaker 1: relationships sink to genetic interest rather than emotional salience based 66 00:05:00,480 --> 00:05:04,240 Speaker 1: on scarcity. For those with only a handful of decades, 67 00:05:04,600 --> 00:05:08,719 Speaker 1: the idea of staying with one partner throughout life was 68 00:05:08,800 --> 00:05:12,800 Speaker 1: something to be devoutly hoped for, and people would declare 69 00:05:12,839 --> 00:05:16,279 Speaker 1: that as part of their wedding vows. In modern times, 70 00:05:16,640 --> 00:05:20,000 Speaker 1: we know of several couples who have celebrated three hundred 71 00:05:20,080 --> 00:05:24,080 Speaker 1: year anniversaries, but it's obviously much more common to enjoy 72 00:05:24,200 --> 00:05:27,960 Speaker 1: dozens of marriages. 
As we mourn this loss today, we 73 00:05:28,000 --> 00:05:32,200 Speaker 1: should remember that those with a scarcity of years had 74 00:05:32,240 --> 00:05:37,040 Speaker 1: no shortage of one thing: novelty. For the rest of us, 75 00:05:37,560 --> 00:05:42,320 Speaker 1: time speeds up as we grow older. At my age, 76 00:05:42,360 --> 00:05:47,839 Speaker 1: decades pass like summers. We have seen life's patterns many times. 77 00:05:48,240 --> 00:05:52,480 Speaker 1: We have traveled, we have married, we have wrangled, made peace, 78 00:05:53,080 --> 00:05:59,560 Speaker 1: failed friends, impressed strangers, seen new groups rise and dwindle, 79 00:06:00,160 --> 00:06:04,240 Speaker 1: sought new narratives about our abilities and our purpose. But 80 00:06:04,320 --> 00:06:08,440 Speaker 1: in the end, as things become less new, we write 81 00:06:08,480 --> 00:06:16,320 Speaker 1: down fewer memories, unique experiences become increasingly scarce, and so 82 00:06:16,920 --> 00:06:21,000 Speaker 1: as we mourn this life cut short here today, we 83 00:06:21,120 --> 00:06:26,919 Speaker 1: can be thankful that for her, at least, everything remained novel. 84 00:06:27,800 --> 00:06:31,760 Speaker 1: She did not drown in the ocean of time. Every 85 00:06:31,839 --> 00:06:36,640 Speaker 1: day she continued to find new seashells at the edges 86 00:06:37,080 --> 00:06:41,039 Speaker 1: of its low lying shores. Okay, so that was my 87 00:06:41,160 --> 00:06:45,120 Speaker 1: short story about the social changes that would follow if 88 00:06:45,160 --> 00:06:48,440 Speaker 1: we were to succeed at longevity science. How would that 89 00:06:48,560 --> 00:06:54,000 Speaker 1: change our lives? Our decisions, our traditions, how we spend 90 00:06:54,040 --> 00:06:59,600 Speaker 1: our time? But today I want to ask, is longevity possible? 91 00:07:00,160 --> 00:07:03,200 Speaker 1: Where are we in terms of the science?
When you 92 00:07:03,240 --> 00:07:07,159 Speaker 1: look at life expectancy at birth, let's say two hundred 93 00:07:07,240 --> 00:07:10,720 Speaker 1: years ago, you find that it was about twenty nine 94 00:07:10,840 --> 00:07:13,840 Speaker 1: years old. At the beginning of the nineteenth century, even 95 00:07:13,840 --> 00:07:17,000 Speaker 1: the healthiest countries in the world had a life expectancy 96 00:07:17,280 --> 00:07:21,880 Speaker 1: no longer than forty years. Now in the twenty first century, 97 00:07:21,960 --> 00:07:25,960 Speaker 1: the world average is seventy three years. That's a huge 98 00:07:26,000 --> 00:07:29,600 Speaker 1: difference in just two hundred years. Is it due to 99 00:07:29,920 --> 00:07:35,240 Speaker 1: miraculous medical advances? Well, not exactly. Largely it's due to 100 00:07:35,640 --> 00:07:39,880 Speaker 1: simple advances like being able to control diarrhea and vomiting 101 00:07:40,360 --> 00:07:45,200 Speaker 1: and having antibiotics, and also to simple public health 102 00:07:45,200 --> 00:07:50,640 Speaker 1: initiatives like sanitation and clean water and managing sewage. These 103 00:07:50,680 --> 00:07:54,040 Speaker 1: are the steps that have had massive impact on the 104 00:07:54,320 --> 00:07:59,640 Speaker 1: expected lifespan. So we've seen astonishing progress. But keep in 105 00:07:59,680 --> 00:08:03,679 Speaker 1: mind what we're talking about there is the average life span. 106 00:08:04,200 --> 00:08:08,120 Speaker 1: We prevented people from dying young, and that cranked the 107 00:08:08,240 --> 00:08:12,560 Speaker 1: average up and up. But the longest life spans, which 108 00:08:12,560 --> 00:08:17,880 Speaker 1: are influenced by biological aging processes, they haven't really changed. 109 00:08:18,120 --> 00:08:22,440 Speaker 1: People aren't generally growing older than they used to.
In other words, a thousand 110 00:08:22,560 --> 00:08:25,320 Speaker 1: years ago you could have lived until eighty five, and 111 00:08:25,400 --> 00:08:29,640 Speaker 1: now you can live until eighty five. Despite all our advances, 112 00:08:30,240 --> 00:08:33,640 Speaker 1: there seems to be an upper limit biologically in the 113 00:08:33,679 --> 00:08:37,120 Speaker 1: low one hundreds. So at the moment we haven't yet 114 00:08:37,200 --> 00:08:42,400 Speaker 1: found ways to significantly slow or reverse aging itself. There 115 00:08:42,400 --> 00:08:45,960 Speaker 1: seems to be a biological ceiling to human life, and 116 00:08:46,000 --> 00:08:50,120 Speaker 1: this is presumably set by genetics and properties of our cells 117 00:08:50,679 --> 00:08:54,760 Speaker 1: that limit how long the human body can sustain itself. 118 00:08:54,880 --> 00:08:58,480 Speaker 1: This is despite all the improvements in health and medical care. 119 00:08:59,080 --> 00:09:02,280 Speaker 1: But on the flip side, we're living in a time 120 00:09:02,320 --> 00:09:05,600 Speaker 1: where there are a lot of studies with worms, with mice, 121 00:09:05,640 --> 00:09:09,800 Speaker 1: occasionally with dogs or monkeys, that seem to indicate there 122 00:09:09,840 --> 00:09:14,640 Speaker 1: may be molecular solutions or at least helpers to the problem. 123 00:09:14,880 --> 00:09:17,880 Speaker 1: If you follow the field or simply scan the news headlines, 124 00:09:17,920 --> 00:09:23,040 Speaker 1: you'll see articles about drugs like rapamycin, or chemicals like resveratrol, 125 00:09:23,520 --> 00:09:27,760 Speaker 1: or approaches like caloric restriction, and what you'll see is 126 00:09:27,800 --> 00:09:32,560 Speaker 1: that in animal studies these seem to increase lifespan. So 127 00:09:32,640 --> 00:09:35,760 Speaker 1: the question is should you be taking these drugs and 128 00:09:35,840 --> 00:09:39,319 Speaker 1: changing your diet? Do the studies translate over to humans?
129 00:09:39,960 --> 00:09:42,200 Speaker 1: And what dose should you go for or how many 130 00:09:42,520 --> 00:09:44,160 Speaker 1: calories should you restrict? 131 00:09:45,000 --> 00:09:45,600 Speaker 1: Well, these are 132 00:09:45,559 --> 00:09:49,040 Speaker 1: questions that short news articles tend to skate over the 133 00:09:49,080 --> 00:09:52,520 Speaker 1: complexity of, and so to really understand the big picture, 134 00:09:53,000 --> 00:09:56,720 Speaker 1: we need to talk to experts in the field of longevity, 135 00:09:57,320 --> 00:09:59,960 Speaker 1: and that brings me to Martin Borch Jensen, a researcher 136 00:10:00,600 --> 00:10:03,560 Speaker 1: who takes the position that if we can really understand 137 00:10:03,640 --> 00:10:08,640 Speaker 1: the biological mechanisms of aging, that gives us our highest 138 00:10:08,800 --> 00:10:13,680 Speaker 1: leverage for alleviating human disease. Martin got his PhD at 139 00:10:13,679 --> 00:10:17,160 Speaker 1: the University of Copenhagen in aging research, then did a 140 00:10:17,200 --> 00:10:21,720 Speaker 1: postdoc at the Buck Institute before becoming a biotech entrepreneur. 141 00:10:22,040 --> 00:10:25,320 Speaker 1: He became the co founder and chief scientific officer at 142 00:10:25,480 --> 00:10:29,080 Speaker 1: Gordian Biotechnology, which is a company that puts the study 143 00:10:29,080 --> 00:10:33,360 Speaker 1: of longevity at the center. So here's my interview with Martin. 144 00:10:38,080 --> 00:10:41,559 Speaker 1: So what is aging biologically? 145 00:10:42,520 --> 00:10:45,880 Speaker 2: That is an unanswered question by the field of aging biology, right, 146 00:10:45,920 --> 00:10:47,840 Speaker 2: So I think that's one of the core things here, 147 00:10:47,920 --> 00:10:51,880 Speaker 2: that nobody can tell you, Like, here is exactly how 148 00:10:51,920 --> 00:10:55,320 Speaker 2: this works.
You could sort of compare it to like 149 00:10:55,360 --> 00:10:59,640 Speaker 2: what is society or civilization, what is the economy. It's 150 00:10:59,640 --> 00:11:01,920 Speaker 2: sort of the complex thing where we know a lot 151 00:11:01,960 --> 00:11:07,160 Speaker 2: of components and we know some key factors. But let's 152 00:11:08,280 --> 00:11:11,480 Speaker 2: try to do our best here and say aging is 153 00:11:11,600 --> 00:11:14,960 Speaker 2: some set of changes that happen in your body biologically 154 00:11:15,520 --> 00:11:20,320 Speaker 2: that worsen your capacity for your body to restore itself 155 00:11:20,360 --> 00:11:24,959 Speaker 2: to homeostasis is the technical term, but basically, like your 156 00:11:24,960 --> 00:11:27,480 Speaker 2: body is in a certain state, and then the essence 157 00:11:27,559 --> 00:11:30,120 Speaker 2: of life is that when something gets it out of 158 00:11:30,160 --> 00:11:32,959 Speaker 2: that state, it sort of restores itself. Right, So if 159 00:11:33,000 --> 00:11:35,080 Speaker 2: you drink too much, you have a hangover, your liver 160 00:11:35,160 --> 00:11:37,079 Speaker 2: is going to work and so forth, and then you're 161 00:11:37,080 --> 00:11:39,160 Speaker 2: going to be fine again. If you get influenza, you 162 00:11:39,160 --> 00:11:42,080 Speaker 2: have an immune system that's designed to kill all of 163 00:11:42,120 --> 00:11:44,240 Speaker 2: those viruses and then you're going to be fine again. 164 00:11:44,320 --> 00:11:47,120 Speaker 2: And so that's sort of the return to homeostasis. 
And 165 00:11:47,160 --> 00:11:49,240 Speaker 2: I would say aging is a set of changes that 166 00:11:49,320 --> 00:11:54,280 Speaker 2: happen that lower your ability to like return to homeostasis, 167 00:11:54,360 --> 00:11:57,880 Speaker 2: and then over time that combines with everything that you're 168 00:11:57,920 --> 00:12:01,080 Speaker 2: going through in life to lead to, you know, your 169 00:12:01,200 --> 00:12:04,280 Speaker 2: organs failing in different ways and you get weaker, and 170 00:12:04,320 --> 00:12:06,960 Speaker 2: you get slower and like all of those kinds of things. 171 00:12:07,280 --> 00:12:10,360 Speaker 2: So a set of processes that lead to all of 172 00:12:10,400 --> 00:12:11,120 Speaker 2: the bad things. 173 00:12:11,400 --> 00:12:13,040 Speaker 1: And I heard you say that if we 174 00:12:13,040 --> 00:12:16,080 Speaker 1: were all twenty eight years old, we wouldn't have all 175 00:12:16,120 --> 00:12:20,199 Speaker 1: these departments in academic research institutes and hospitals and so on, 176 00:12:20,840 --> 00:12:24,120 Speaker 1: because we would just need to address a few things. 177 00:12:24,760 --> 00:12:27,480 Speaker 1: I think you mentioned testicular cancer and a few other things, 178 00:12:27,520 --> 00:12:32,000 Speaker 1: but we wouldn't need giant cancer departments, MD Anderson and 179 00:12:32,040 --> 00:12:34,760 Speaker 1: so on, because these are all things that come along 180 00:12:34,920 --> 00:12:36,160 Speaker 1: with aging.
181 00:12:36,679 --> 00:12:39,160 Speaker 2: Yeah, Like if you just look at the incidence rate 182 00:12:39,440 --> 00:12:43,320 Speaker 2: of a lot of different diseases, you know, heart failure, 183 00:12:43,640 --> 00:12:46,000 Speaker 2: how many people do you know had a heart attack 184 00:12:46,160 --> 00:12:49,320 Speaker 2: like above age fifty, and how many people you know 185 00:12:49,400 --> 00:12:51,600 Speaker 2: that had a heart attack like below age fifty 186 00:12:51,679 --> 00:12:55,840 Speaker 2: or below age thirty, right, and probably your anecdotal evidence 187 00:12:55,840 --> 00:12:59,480 Speaker 2: is sort of aligned with just the stats we have, 188 00:12:59,600 --> 00:13:01,400 Speaker 2: which is, you know, like the rate goes up one 189 00:13:01,440 --> 00:13:06,000 Speaker 2: thousandfold or ten thousandfold or something like that, right, And so, yeah, 190 00:13:06,040 --> 00:13:08,720 Speaker 2: when you are young, there's just a few things that 191 00:13:09,200 --> 00:13:11,240 Speaker 2: affect you, and when you're old, there's a lot of 192 00:13:11,240 --> 00:13:14,319 Speaker 2: things that affect you. And US healthcare spend is four 193 00:13:14,559 --> 00:13:19,120 Speaker 2: trillion-ish, and like the majority of that is 194 00:13:19,160 --> 00:13:22,960 Speaker 2: for, you know, like people who have age related diseases, 195 00:13:23,640 --> 00:13:25,760 Speaker 2: often multiple age related diseases. 196 00:13:26,000 --> 00:13:30,080 Speaker 1: Now, when people go about trying to research this, you know, 197 00:13:30,160 --> 00:13:33,000 Speaker 1: it's limited in some way by the technologies that we have. 198 00:13:33,120 --> 00:13:35,440 Speaker 1: And so what people have done over, let's call it, 199 00:13:35,480 --> 00:13:39,120 Speaker 1: the last twenty years, is they look for particular biomarkers.
200 00:13:39,160 --> 00:13:41,800 Speaker 1: So tell us about that and how it's evolved in 201 00:13:41,840 --> 00:13:45,080 Speaker 1: the last couple of decades and what you're doing leading 202 00:13:45,120 --> 00:13:46,560 Speaker 1: to what you're doing at Gordian. 203 00:13:47,080 --> 00:13:51,240 Speaker 2: Yeah. Absolutely. So the field of like, hey, we can 204 00:13:51,320 --> 00:13:54,880 Speaker 2: molecularly sort of study and manipulate aging is really only 205 00:13:54,920 --> 00:13:58,920 Speaker 2: about three decades old. In the early nineties, there were 206 00:13:58,960 --> 00:14:02,199 Speaker 2: some studies where researchers changed a single gene in 207 00:14:02,280 --> 00:14:04,760 Speaker 2: some worms and it made the worms live twice as long. 208 00:14:05,080 --> 00:14:07,320 Speaker 2: And so before that it's like, well, what is aging 209 00:14:07,400 --> 00:14:09,120 Speaker 2: and can we really change it? And so forth. But 210 00:14:09,160 --> 00:14:11,920 Speaker 2: this shows that there is a way to regulate the 211 00:14:11,920 --> 00:14:15,040 Speaker 2: aging process that's already innate in sort of the way 212 00:14:15,040 --> 00:14:19,680 Speaker 2: our biology functions, and so that's cool. And so in 213 00:14:19,680 --> 00:14:23,160 Speaker 2: a worm, it lives three weeks after you extended the lifespan, right, 214 00:14:23,200 --> 00:14:25,400 Speaker 2: so you can do studies where you're just like, 215 00:14:25,480 --> 00:14:28,040 Speaker 2: does it live longer? Is it healthier? As you get 216 00:14:28,040 --> 00:14:31,760 Speaker 2: closer and closer to humans, that's sort of impractical, right. Like, 217 00:14:31,840 --> 00:14:34,920 Speaker 2: aging by definition takes a lifetime, and so you don't 218 00:14:34,960 --> 00:14:37,320 Speaker 2: want to do forty year studies all the time.
And 219 00:14:37,400 --> 00:14:39,840 Speaker 2: so people have done a couple of things in response 220 00:14:39,880 --> 00:14:43,280 Speaker 2: to that. One is like, find some area of biology 221 00:14:43,280 --> 00:14:47,080 Speaker 2: that like, this seems important. This accumulation of these senescent 222 00:14:47,160 --> 00:14:50,320 Speaker 2: cells or these particular changes in your body seems important. 223 00:14:50,480 --> 00:14:52,560 Speaker 2: We're just going to study them and then sort of 224 00:14:52,640 --> 00:14:55,400 Speaker 2: like trust that they are important for the aging process. 225 00:14:55,680 --> 00:14:58,000 Speaker 2: The other thing people have done is try to find 226 00:14:59,240 --> 00:15:02,240 Speaker 2: what are usually called aging clocks, and so it's something 227 00:15:02,280 --> 00:15:05,280 Speaker 2: that you could measure that, when you look at it 228 00:15:05,360 --> 00:15:09,200 Speaker 2: over time, it changes with age, and so you can 229 00:15:09,280 --> 00:15:12,720 Speaker 2: use whatever the measurement is to tell you sort of 230 00:15:12,760 --> 00:15:15,960 Speaker 2: like how far along is this person? Right, If we 231 00:15:16,000 --> 00:15:18,520 Speaker 2: do an analogy to a company, maybe it's like you 232 00:15:18,560 --> 00:15:21,120 Speaker 2: try to schedule a meeting with someone and whether they 233 00:15:21,120 --> 00:15:23,160 Speaker 2: can take the meeting, and if it's like a young startup, 234 00:15:23,160 --> 00:15:25,600 Speaker 2: they can take the meeting in the next forty five minutes, 235 00:15:25,680 --> 00:15:28,800 Speaker 2: and if it's like a giant one hundred thousand person company, 236 00:15:28,800 --> 00:15:30,040 Speaker 2: it's going to take you a month and a half 237 00:15:30,120 --> 00:15:33,360 Speaker 2: to. So that's like a marker of the aging of, 238 00:15:34,360 --> 00:15:36,360 Speaker 2: you know, in this case, the company.
So we have those, 239 00:15:36,480 --> 00:15:40,960 Speaker 2: we have multiple of those for sort of human aging. 240 00:15:41,680 --> 00:15:45,440 Speaker 2: We don't yet have one where, when it responds, 241 00:15:45,640 --> 00:15:49,680 Speaker 2: that's actually our aging changing, versus measuring something that 242 00:15:49,720 --> 00:15:54,080 Speaker 2: happens alongside aging. But it doesn't necessarily mean 243 00:15:54,120 --> 00:15:56,360 Speaker 2: you're better off if the number goes down. So that's 244 00:15:56,360 --> 00:15:58,120 Speaker 2: sort of the main limitation here. 245 00:15:58,000 --> 00:16:01,760 Speaker 1: As in, I do something so that at this large company, 246 00:16:02,080 --> 00:16:04,800 Speaker 1: I can get a meeting quickly, but it doesn't change 247 00:16:04,840 --> 00:16:06,760 Speaker 1: the age of the company. It's still an old company, 248 00:16:06,800 --> 00:16:09,000 Speaker 1: even though I did some tweak to make the meeting 249 00:16:09,040 --> 00:16:10,560 Speaker 1: happen, right? 250 00:16:10,520 --> 00:16:13,240 Speaker 2: You could set up so that like every calendar in that 251 00:16:13,360 --> 00:16:17,720 Speaker 2: company must auto accept a meeting immediately. However, the company's 252 00:16:18,120 --> 00:16:21,480 Speaker 2: ability to actually act on what happens in the meeting 253 00:16:21,680 --> 00:16:24,160 Speaker 2: is not changed. The amount of bureaucracy is the same. 254 00:16:24,640 --> 00:16:27,760 Speaker 2: And so you could optimize that one all you want, 255 00:16:27,800 --> 00:16:30,240 Speaker 2: but it doesn't really do what you wanted to do.
256 00:16:30,400 --> 00:16:33,080 Speaker 1: So the point here is that even if you do 257 00:16:33,240 --> 00:16:37,480 Speaker 1: something that changes these biomarkers, your cells might be just 258 00:16:37,520 --> 00:16:41,600 Speaker 1: as old and, you know, have all the ills of senescence, 259 00:16:42,040 --> 00:16:43,680 Speaker 1: even though you think, hey, I fixed it. 260 00:16:44,240 --> 00:16:46,160 Speaker 2: Yeah. Like, let's say, for example, a lot of these 261 00:16:46,200 --> 00:16:48,480 Speaker 2: are blood tests, right, so they would measure stuff in 262 00:16:48,520 --> 00:16:51,160 Speaker 2: immune cells, primarily because those are the ones circulating around 263 00:16:51,200 --> 00:16:54,400 Speaker 2: your blood. And so, you know, your immune 264 00:16:54,400 --> 00:16:56,840 Speaker 2: cells will go around, and then when you have an infection, 265 00:16:57,000 --> 00:17:00,600 Speaker 2: they'll divide and proliferate and create more that target this infection 266 00:17:00,640 --> 00:17:03,920 Speaker 2: and so forth. Imagine that what a given aging clock 267 00:17:04,000 --> 00:17:07,600 Speaker 2: measures is actually how many times has this cell divided. 268 00:17:08,000 --> 00:17:12,360 Speaker 2: That will tend to go up as you age, but whatever 269 00:17:12,600 --> 00:17:17,320 Speaker 2: that clock tracks, it's not the whole aging process. 270 00:17:17,359 --> 00:17:19,160 Speaker 2: There's all this other stuff going on in all these 271 00:17:19,160 --> 00:17:21,200 Speaker 2: other organs. And so I think that's the key thing 272 00:17:21,280 --> 00:17:24,200 Speaker 2: that sort of currently we need to address with these 273 00:17:24,240 --> 00:17:26,840 Speaker 2: kinds of aging clocks before we can really reliably use 274 00:17:26,840 --> 00:17:31,920 Speaker 2: them to accelerate the research, is how specific are they 275 00:17:32,520 --> 00:17:36,640 Speaker 2: for predicting the changes that we want to use them for.
276 00:17:37,400 --> 00:17:40,600 Speaker 1: And I know that, like everything in biology, you know, 277 00:17:40,640 --> 00:17:44,200 Speaker 1: what we're looking at are these vast networks that bankrupt 278 00:17:44,200 --> 00:17:48,560 Speaker 1: our language. They're so complex, and so what we've been 279 00:17:48,640 --> 00:17:53,040 Speaker 1: doing over the past decades is looking for little places 280 00:17:53,080 --> 00:17:55,160 Speaker 1: in the network where we say, aha, this is maybe 281 00:17:55,240 --> 00:17:59,240 Speaker 1: the sign that we need to read, right? But what's 282 00:17:59,320 --> 00:18:01,439 Speaker 1: your approach to this, which I think is very clever? 283 00:18:01,440 --> 00:18:04,840 Speaker 2: Yeah, absolutely. So, 284 00:18:04,880 --> 00:18:08,240 Speaker 2: you know, after this sort of existentialist phase, I 285 00:18:08,280 --> 00:18:10,919 Speaker 2: went into academic research. I did a PhD at an 286 00:18:10,920 --> 00:18:14,359 Speaker 2: aging institute, and I did postdoctoral research and was sort of 287 00:18:14,359 --> 00:18:18,119 Speaker 2: heading on the academic track to be a professor, all 288 00:18:18,160 --> 00:18:21,040 Speaker 2: like trying to figure out how does this work? You know, 289 00:18:21,040 --> 00:18:23,640 Speaker 2: because like one approach to fixing a system, let's say 290 00:18:23,640 --> 00:18:26,120 Speaker 2: a car or whatever, is like, okay, well I understand 291 00:18:26,160 --> 00:18:28,639 Speaker 2: what each piece does, and I can see where the 292 00:18:28,720 --> 00:18:30,639 Speaker 2: broken thing is, and then I go in and I 293 00:18:30,760 --> 00:18:34,240 Speaker 2: like fix that broken thing.
So that's one approach, which 294 00:18:34,280 --> 00:18:37,639 Speaker 2: makes sense if you can like get to an understanding 295 00:18:37,640 --> 00:18:40,040 Speaker 2: in your lifetime, and so if you sort of capitulate 296 00:18:40,119 --> 00:18:42,199 Speaker 2: on that, which is honestly what I did. It's like 297 00:18:42,240 --> 00:18:44,280 Speaker 2: there's so much biology, like we don't even know what 298 00:18:44,359 --> 00:18:46,959 Speaker 2: we don't know yet, right, how do you find a 299 00:18:47,000 --> 00:18:51,200 Speaker 2: solution if you don't have an understanding of the whole? 300 00:18:51,320 --> 00:18:54,520 Speaker 2: And that's why the company I started with my co 301 00:18:54,560 --> 00:18:57,280 Speaker 2: founder Francisco is called Gordian, after the sort of 302 00:18:57,359 --> 00:18:59,879 Speaker 2: legend of the Gordian knot, where you try to untangle 303 00:18:59,880 --> 00:19:02,639 Speaker 2: it but it's so complicated you can't untangle it, and 304 00:19:02,680 --> 00:19:06,560 Speaker 2: then Alexander, the then not yet Great, decides to cut 305 00:19:06,600 --> 00:19:09,960 Speaker 2: through it instead and find an elegant solution that questions 306 00:19:10,000 --> 00:19:13,840 Speaker 2: the assumptions. And so the biology version of that for 307 00:19:13,960 --> 00:19:17,240 Speaker 2: Gordian is, you know, I had a knowledge of how 308 00:19:17,280 --> 00:19:19,719 Speaker 2: complex everything was, and I knew that, like, yeah, we 309 00:19:19,800 --> 00:19:22,440 Speaker 2: don't actually know what aging is. I can't give you, 310 00:19:22,720 --> 00:19:25,679 Speaker 2: "Here's an accelerated version of aging that captures all the 311 00:19:25,720 --> 00:19:28,840 Speaker 2: important parts," because we don't know all the important parts. 312 00:19:29,240 --> 00:19:33,080 Speaker 2: And so we want to find treatments that work 313 00:19:33,119 --> 00:19:36,119 Speaker 2: for different age related diseases.
And this is really important because, 314 00:19:36,320 --> 00:19:38,479 Speaker 2: like we were talking about before, you know, if 315 00:19:38,480 --> 00:19:41,240 Speaker 2: you just look up like top ten causes of death 316 00:19:41,280 --> 00:19:44,160 Speaker 2: in the US, for seven of them aging is the number 317 00:19:44,160 --> 00:19:47,240 Speaker 2: one risk factor, and the eighth one, like diabetes, it's 318 00:19:47,320 --> 00:19:50,320 Speaker 2: number two after diet, right. And if you just look 319 00:19:50,359 --> 00:19:52,159 Speaker 2: at like our spend, what are the things that we 320 00:19:52,240 --> 00:19:54,480 Speaker 2: can't cure, what are the things we're worried about, it's 321 00:19:54,480 --> 00:19:58,200 Speaker 2: like not tuberculosis anymore, we did good, but it's all 322 00:19:58,200 --> 00:20:01,199 Speaker 2: these age related diseases. And so we've put a 323 00:20:01,200 --> 00:20:03,720 Speaker 2: lot of money into trying to treat these diseases and 324 00:20:03,760 --> 00:20:08,280 Speaker 2: find effective medicines, but we haven't succeeded at scale yet. 325 00:20:08,560 --> 00:20:12,080 Speaker 2: Maybe that's because these diseases, as we talked about before, 326 00:20:12,520 --> 00:20:15,600 Speaker 2: they manifest in old people, and so some of these 327 00:20:15,600 --> 00:20:19,200 Speaker 2: physiological changes that are happening with aging are critical for 328 00:20:19,280 --> 00:20:23,840 Speaker 2: the disease being able to manifest and not being 329 00:20:23,960 --> 00:20:26,359 Speaker 2: just, like, repaired. So if we do all of our 330 00:20:26,440 --> 00:20:31,359 Speaker 2: drug discovery in organisms that are young, let's say, you know, 331 00:20:31,440 --> 00:20:34,359 Speaker 2: like some model organism in the lab that is young 332 00:20:34,400 --> 00:20:36,359 Speaker 2: and we've sort of engineered it to get the disease. 333 00:20:36,440 --> 00:20:39,520 Speaker 2: Let's take Alzheimer's.
Speaker 2: People often have these animals that express the mutant proteins involved in the disease.
Speaker 1: These are mice you're talking about, for example.
Speaker 2: Yeah, exactly. And so we take these mice and we say, oh, let's find every mutation that happens in Alzheimer's patients and throw all of them at the mice, and then it's going to get Alzheimer's really quickly. But it's not really Alzheimer's. It's one specific dysfunction, and we can fix that; we have good drugs to treat that in mice. We don't have good drugs to treat Alzheimer's in people, because it's probably more complicated.
Speaker 1: Because the rest of the adult human has five million other biological issues going on at the same time. So with those mutations in the giant network that's happening, you get different results than you do in a mouse that is otherwise young and perfect but has these mutations.
Speaker 2: Yeah, that's right.
Speaker 2: You know, you get a flat tire driving to work and it's like, it's fine, I can deal with this, right? But you get a flat tire driving to the hospital with your kid after you lost your job, and you don't have the capacity to deal with this thing anymore. That's probably a lot of what happens in these age-related diseases: things aren't working as well, so you don't have the capacity to compensate for whatever is going wrong. So here's where we are. We've studied these diseases. We've forced ourselves to use these simplified models of the disease, because we just didn't have a practical way of trying a lot of things otherwise. And then we failed. And so Gordian says, well, what if we had a way to go into the most realistic environment for the patient? What if, instead of making this disease in an animal model in this accelerated way, we could find an animal that has the same disease as the human and developed it over a long period of time, the same way that humans do?
Speaker 2: And then we could test a lot of things there. So for example, for osteoarthritis, which is one of the diseases that we're working on: a lot of people use these young mice, and you do a surgical injury and whatever. We can instead find a horse that got osteoarthritis from running around and living life, and then we can go in and see, can we treat the osteoarthritis in this context? And it's much more like a human; it's like an avatar for the human patient. Now, anyone could do that. Anyone can go find a horse somewhere on a farm and then do this study. But the way that most studies are done, you have these large groups of animals with treatment A and a large group of animals with treatment B, so that you can compare across the biological variability of the individual animals and so on. That's just impractical. It's plausible, but it's prohibitively expensive and logistically challenging.
Speaker 2: And so Gordian started with the idea that we can actually do this if we invent a way to test hundreds of treatments in a single animal, and so that's the core of the company. We can go into the most predictive system for "is this going to cure disease X in a patient," and then we can test a lot of things there, so we don't have to be very smart. I try to design it so that I can be not very smart and still succeed. And the way we do that, there's a lot of cool biotechnology: we have these re-engineered viruses that can deliver therapies to individual cells of the organ, and then we can pull those cells out, measure the activity of every gene in each cell, and then predict, okay, what do these gene changes mean for physiology?
Speaker 2: So there's a lot of hard stuff in actually making that work, but the core of it is: find the thing that will actually predict the outcome you want, not something that you sort of hope would predict the outcome you want but is actually very indirect, and then do a lot of research there. And so we're using that to find new treatments across four different diseases of aging, including heart failure, fibrosis, and osteoarthritis.
Speaker 1: So the key is you're looking at lots of things at once, instead of saying, hey, here's our single measure that we're taking from these horses.
Speaker 2: Yeah, there's two things there. We're looking at lots of different potential treatments, right? So it's not "three decades of research suggests that this is the right target to treat this disease," which is often how it happens, and what we've had to do, and what I was just too impatient to stay in academia and do. Instead, we say, well, let's test a hundred different hypotheses all in the same animal.
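The "hundreds of treatments in one animal" bookkeeping can be sketched in miniature. This is an illustrative toy, not Gordian's actual pipeline: the gene names, treatment labels, cell counts, and the premise that a treatment called "rx_7" works are all invented. The point is only the core logic of a pooled screen: each therapy carries a barcode, each cell reports which barcode it received plus an expression profile, and you demultiplex and average per treatment.

```python
import random

random.seed(0)

GENES = ["col2a1", "mmp13", "il6"]            # hypothetical marker genes
TREATMENTS = [f"rx_{i}" for i in range(100)]  # 100 barcoded candidate therapies

def simulate_cell(treatment):
    """One cell: its barcode plus a noisy gene-expression profile."""
    base = {"col2a1": 1.0, "mmp13": 3.0, "il6": 2.0}  # 'diseased' baseline
    if treatment == "rx_7":                            # pretend rx_7 works
        base = {"col2a1": 3.0, "mmp13": 1.0, "il6": 1.0}
    return {g: base[g] + random.gauss(0, 0.1) for g in GENES}

# Pooled experiment: many cells in one animal, each tagged with one barcode.
cells = [(rx, simulate_cell(rx)) for rx in TREATMENTS for _ in range(20)]

def mean_profile(rx):
    """Demultiplex by barcode and average expression per treatment."""
    profiles = [p for b, p in cells if b == rx]
    return {g: sum(p[g] for p in profiles) / len(profiles) for g in GENES}

# Rank treatments by a toy score: cells making cartilage matrix (col2a1 up).
ranked = sorted(TREATMENTS, key=lambda rx: -mean_profile(rx)["col2a1"])
print(ranked[0])  # → rx_7
```

Under these assumptions, one animal yields a hundred internally controlled comparisons instead of one.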
Speaker 2: And then the test is, yeah, measuring lots of things at once to see what the overall state of this cell is. It was in a tissue that had, let's say, osteoarthritis. It was a chondrocyte, which are the cells in the cartilage of your joints, and it's in this tissue that has all the bad stuff going on: it's got maybe some metabolic shifts that happen with age, there's the immune system, and all this stuff. And if we take this one genetic target and turn it up or down, does the cell now resemble a chondrocyte in a healthy horse? Has it changed its behavior to produce more cartilage, which is the problem in the disease? So we can look at the full state of the cell and ask, is this the state that we want to go to, in a way that doesn't require us to make a single hypothesis about how the disease works. So neither on the "how do we try to poke it" side nor on the "is this better" side do we rely on just a single thing, a single story of the disease, because that's risky. We've done that in, you know, neuroscience very well.
Speaker 1: Yes, yes. You know, that's the way biology is having to move, because we've spent so long looking at individual pieces of very complicated networks, and we've seen the ways in which that doesn't get us the answer we want. Let me ask you a more general question, because you're an expert on aging research more generally. What do you see in human longevity? In terms of, you know, people are doing intermittent fasting, caloric restriction, and there are drugs like resveratrol and others that everyone's very interested in. What's your view of the field as it stands right now?
Speaker 2: The key thing that we're missing is that measurement, right? If we had something that we had really validated, and by "validated" I mean something along the lines of: okay, I'm postulating that this thing will predict your risk of getting any disease, let's say, or your risk of dying before a certain age.
Speaker 2: So I'm postulating that this measurement, your DNA methylation clock or whatever, predicts that. Then we should test it, right? You should put in a bunch of things, at least in mice, that we know will extend mouse lifespan, and then a bunch of other things that we know won't, and then see just how accurate your predictions are. If we have something like that, where we're like, oh yeah, this has great accuracy, and when this thing moves, per our previous conversation, that means you're healthier, then we would have a much better sense of, okay, is this resveratrol thing working? Is this rapamycin thing working? Is the fasting doing something? There are a lot of attempts by people who have this vision that, look, we should really get away from what is currently called healthcare, but is more like sick care, right, where you wait until you get a disease and then get drugs for that disease, toward something more like: can we just measure your overall health, and your, you know, homeostatic capacity or biological age, whatever we want to call it, and try to prevent you from having these diseases in the first place? Certainly, if we can do that well, that's a much better approach. It's cheaper, like an ounce of prevention, right? And the tricky part of doing that is knowing the future, right? So we need to build these tools that allow us to know the future, because currently, you know, the other way that we do that is controlled clinical trials, where you just run the experiment: you have a bunch of people, you randomize them to different groups, treat one and not the other, and then you see, do we get the outcome that we want? But doing that for aging, if you're doing that for "do you live or die," it takes a long time, it's fairly expensive, and in the US it's not obvious whether that's something insurance covers. It kind of should be, but we haven't really figured it out, so there's some financial risk involved there.
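The validation scheme described here, scoring a candidate aging clock against interventions whose lifespan effects in mice are already known, is essentially an accuracy check on a predictor. A minimal sketch follows; the intervention names, true/false labels, and clock readings are all invented for illustration, not literature claims.

```python
# Hypothetical benchmark: interventions with (illustrative) known effects
# on mouse lifespan. True = extends lifespan, False = does not.
known_effects = {
    "calorie_restriction": True,
    "rapamycin": True,
    "placebo": False,
    "high_sugar_diet": False,
}

# A candidate clock's output: predicted change in 'biological age' under
# each intervention. A good clock should go down (negative) exactly for
# the interventions that really extend lifespan.
clock_delta = {
    "calorie_restriction": -1.8,
    "rapamycin": -1.2,
    "placebo": +0.1,
    "high_sugar_diet": +0.9,
}

def clock_accuracy(deltas, truth):
    """Fraction of interventions where 'clock says younger' matches
    'actually extends lifespan'."""
    hits = sum((deltas[k] < 0) == truth[k] for k in truth)
    return hits / len(truth)

print(clock_accuracy(clock_delta, known_effects))  # → 1.0 for this toy clock
```

A clock that passes a panel like this on known positives and negatives is the kind of validated measurement the guest says the field is missing.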
Speaker 1: What's your intuition, even looking retrospectively? I don't know if there are groups of people who have had resveratrol their whole lives and others that haven't, or whatever. But when you look at the data and look at the whole picture, what do you feel is the thing that maybe you would do that might be useful?
Speaker 2: Yeah, I think if you look at that, and then the other thing you look at is all the animal studies, right, so what actually makes a mouse live longer? The answer is calorie restriction of some sort, so limiting the number of calories, or rapamycin, which is a drug that's used as an immunosuppressant and is FDA approved at a high dose. At a high dose it suppresses your immune system, it reduces its activity, and so we use it for things like organ transplants. Suppressing your immune system is not going to make you live longer, right? But at a much lower dose in mice, it does.
Speaker 2: There's some recent data in monkeys, and there's a trial ongoing in dogs, called TRIAD, where people are giving rapamycin to dogs to see if they live longer. Rapamycin has the strongest data other than this eating-regimen thing. Especially for calorie restriction, there's a caveat of, well, how much restriction? And how much restriction once you multiply that by how much you exercise, right? Because lab mice generally are in cages with ad libitum food and so forth, so you'll get a certain response there, but people are very different; they don't have the same genetics. What is the right amount of restriction? For everyone there's definitely some right amount. For a lot of people it's probably lower than where we're currently at, but there's a U curve: if you just eat less and less, it's definitely going to be bad for you. So that's where we're back to needing a measurement.
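The U-curve point becomes an optimization problem once you trust the outcome measure. A toy illustration: the quadratic risk curve and the 2,100-calorie optimum below are entirely invented numbers, standing in for whatever a validated biomarker would actually report.

```python
# Toy model (invented numbers): relative mortality risk as a function of
# daily calorie intake, with a minimum somewhere below typical intake.
def risk(calories):
    optimum = 2100.0  # hypothetical per-person optimum, not a recommendation
    return 1.0 + ((calories - optimum) / 1000.0) ** 2

# Without a trustworthy measurement you can't locate the minimum;
# with one, finding it is just a search over intake levels.
candidates = range(1200, 3001, 50)
best = min(candidates, key=risk)
print(best, round(risk(best), 3))  # → 2100 1.0
```

Both tails of the curve are worse than the middle, which is the guest's point: "eat less" helps only up to an individual optimum that currently nobody can measure.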
Speaker 2: Rapamycin is the other one, but similarly, too high a dose is bad, and so finding out what the right dosing is here, where there's actually a benefit, feels tricky. The nonprofit I started has funded some clinical trials of rapamycin, early-stage Phase 1 trials, for different things: this one for reproductive health, there's one for oral health.
Speaker 1: And this is in humans.
Speaker 2: That's in humans, yeah. So there's four different trials that we funded, and I think there's three others.
Speaker 1: So you won't know the longevity piece for several decades.
Speaker 2: None of those will give us the longevity piece, right? I think at that point you're looking at, this dose seems safe and potentially has a beneficial effect on this thing. And so if we then extrapolate a broader health benefit from the animal studies, your expected value of this might go positive, but it's still unclear. So those are the two with the strongest data. Then there's some other things, you know, metformin, which is a diabetes drug.
Speaker 2: There's a lot of push for metformin, like we should do a trial here in humans. There's a not-yet-started trial, they're trying to fundraise for it, called TAME. And the metformin data is more sort of observational, like what you said: okay, well, we've given this to diabetics, so we have tens of millions of person-years of data. What that certainly tells us is that this is not super dangerous. What it may tell us is, oh, there might be lower overall cancer risk. But it's hard with these retrospective studies, because you always have to be sure that there wasn't some unintended way that we selected for people who already had a lower risk of a cancer diagnosis and compared them to another group. So there's big caveats; that's why we do these trials. But there's some human evidence there, even though the animal studies are less clear on metformin doing a whole lot. It might still be beneficial, and it's probably safe.
Speaker 1: What do you do personally? Do you take metformin or rapamycin?
Speaker 2: I don't take rapamycin, yeah, because I'd rather let somebody else find the safe dose, right? I mean, the basic stuff you already know: you should exercise, you should sleep, you should eat less bacon and more vegetables, definitely eat less sugar. So for most people it's boring. I think optimizing for the longevity hack right now will get you a bit, but the delta between "I'm a hardcore longevity biohacker" and "I'm just generally healthy" is maybe three years or something. I think what's much more important for the field is: let's put our effort into these technological advancements that allow us to measure things, that allow us to test more things at once, and really try to solve this, because we know there is potential.
Speaker 2: We know that you or I could have a kid that's zero years old, so one cell in your body has the capacity to create a functioning, completely non-aged system, right? And we also know that there are animals that live much longer than humans, and there's great variability in how long different animals live, and we know that we can genetically engineer a worm or a mouse to live longer. So we know it's malleable, we know it's doable, and it's possible to have a big effect. I don't know how hard it is. I don't know if in fifteen years we'll have some tremendous results with partial reprogramming or something, or if it's actually much harder and will take longer. It's sort of like AI: it seems just around the corner, and then suddenly it takes off.
Speaker 1: So if we were to fast forward to twenty five years from now, where do you think the longevity field is going to be?
Speaker 2: Where I hope it would be: rigorous testing of aging clocks happening in the next couple of years, which will probably reveal many things to be fixed, and then deliberate efforts to create these sort of biomarkers. I think that's very doable on a five-year horizon with concerted effort. No guarantee that it would happen, but there are different actors that could make it happen, such as startups like Altos Labs, or ARPA-H in the US, or Hevolution in Saudi Arabia, or even, frankly, private philanthropists; this is sort of a seed-stage-startup level of effort, or maybe a bit more. So that's one thing; we definitely need that. Then we need to figure out, okay, what is the clinical trial we're going to run? Are we going to try the current round of things, like rapamycin or senolytics, which people are excited about, or metformin? You've got to pick the right thing, or do a portfolio, and then just start to run those trials. And then I expect the same thing that happened in drug discovery in general.
Speaker 2: You try to run the trial, and then you find out some of your endpoints weren't sensitive enough or whatever, so you run another round of trials. So I think those things should happen; that's sort of the ecosystem. And at the same time, people are trying to do drug discovery targeting different mechanisms of aging. So if we did things in the right sequence, and this is sort of the hope-and-a-prayer version of the world, you could project-manage it all to work out; and then there's reality, which relies on all these incentives and whatever. But all the companies now, and there are more and more of them, that are trying to find aging-specific therapies would then have an outlet beyond just the one disease they're going after. Because what happens now is, say I find a drug for aging. Let's say I'm at Stanford, I'm doing a bunch of research, I have this cool thing, I think at least, and then I start a company, and then we raise fifty million dollars to go into clinical trials. BioAge just IPO'd yesterday, and that actually was from Stanford, so that sort of fits the story, right? And they have clinical trials going. But even though the company started with "we're looking at aging," they had the incentive constraint of, well, we don't know how you run an aging trial yet. So everyone says, if we want to succeed here, if we want a good readout for the investors, the patients, and so forth, we're going to narrow down to a specific disease of aging. And so you run that trial, which is good; I hope that treats patients. One could also imagine, if we're being just blue-skies ambitious, that the US decides: look, we're doing this sick-care thing, we have fourteen different institutes and medical specialties and all this kind of stuff, but there's clearly biology that spans those, right? When your immune system dysfunctions, there's a whole bunch of diseases. There's inflammation stuff that affects a whole bunch of diseases.
We should stop doing 664 00:38:02,120 --> 00:38:05,799 Speaker 2: these trials where we test one drug for one disease, right, 665 00:38:05,840 --> 00:38:08,040 Speaker 2: match one of like thirty thousand things to one of 666 00:38:08,080 --> 00:38:10,719 Speaker 2: eight thousand diseases, and then try to do it all-by-all; 667 00:38:10,760 --> 00:38:14,359 Speaker 2: that will take infinite time, right. Instead, we 668 00:38:14,400 --> 00:38:17,840 Speaker 2: should develop ways, and again this is sort of like 669 00:38:17,880 --> 00:38:20,400 Speaker 2: an ARPA-H kind of thing, we should develop ways 670 00:38:20,680 --> 00:38:22,960 Speaker 2: where when we run one trial, we can take some 671 00:38:23,000 --> 00:38:26,200 Speaker 2: blood samples and then we can look for these markers 672 00:38:26,520 --> 00:38:29,399 Speaker 2: that imply that the drug might work on a bunch 673 00:38:29,400 --> 00:38:32,200 Speaker 2: of other diseases. And so we sort of turn each 674 00:38:32,280 --> 00:38:37,120 Speaker 2: trial we run in this country into something that tells us whether it has 675 00:38:37,160 --> 00:38:40,799 Speaker 2: an integrated overall beneficial effect on health or doesn't, right. 676 00:38:41,239 --> 00:38:43,319 Speaker 2: And so if you took that approach, you could have 677 00:38:43,400 --> 00:38:46,239 Speaker 2: fewer trials that lead us to a portfolio of medicines 678 00:38:46,280 --> 00:38:50,000 Speaker 2: that have a bigger overall impact. There are various reasons why 679 00:38:50,040 --> 00:38:52,040 Speaker 2: we're not doing that yet. One, we don't have those 680 00:38:52,040 --> 00:38:55,439 Speaker 2: measurements, right, so there's technological development needed to do it right. 681 00:38:55,880 --> 00:38:59,240 Speaker 2: Another one is we probably don't have the perspective right.
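[Editor's aside: the "all-by-all will take infinite time" point above is just arithmetic, and a minimal sketch makes it concrete. The 30,000 and 8,000 figures are the rough numbers quoted in the conversation, and the trials-per-year rate below is a purely hypothetical assumption for illustration.]

```python
# Back-of-envelope for the one-drug-one-disease combinatorics discussed above.
# 30,000 candidate compounds and 8,000 diseases are the speaker's rough figures.
compounds = 30_000
diseases = 8_000

# Testing every (drug, disease) pair as its own trial:
pairwise_trials = compounds * diseases
print(f"{pairwise_trials:,} one-drug-one-disease trials")  # 240,000,000

# Even at an optimistic, hypothetical 5,000 new trials per year worldwide,
# exhausting the grid would take tens of millennia:
years_needed = pairwise_trials // 5_000
print(f"~{years_needed:,} years at 5,000 trials per year")  # ~48,000 years
```

This is the motivation for the multiplexed-biomarker idea that follows: if a single trial's blood samples can read out markers relevant to many diseases at once, each trial covers many cells of the grid instead of one.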
Like, again, 682 00:38:59,760 --> 00:39:02,719 Speaker 2: we're used to, you have one disease, and like that's 683 00:39:02,800 --> 00:39:05,480 Speaker 2: the thing that is the unit of, sort of, 684 00:39:05,560 --> 00:39:08,600 Speaker 2: medical care. And then another one is the way that 685 00:39:08,640 --> 00:39:11,200 Speaker 2: the FDA is set up. It's sort of like nothing 686 00:39:11,239 --> 00:39:15,160 Speaker 2: bad can happen, but slowing things down has a hidden 687 00:39:15,200 --> 00:39:17,000 Speaker 2: cost of, like, lots of people dying or whatever, but, 688 00:39:17,040 --> 00:39:19,120 Speaker 2: like, nothing bad can happen. And so imagine if 689 00:39:19,120 --> 00:39:22,759 Speaker 2: you measure, like, is this good for all these different diseases. Well, 690 00:39:22,800 --> 00:39:24,640 Speaker 2: what if one of them is bad? What if this 691 00:39:24,719 --> 00:39:28,400 Speaker 2: drug, like, will prevent Alzheimer's and heart failure, but it 692 00:39:28,600 --> 00:39:31,600 Speaker 2: increases your risk of cancer by ten percent? That might 693 00:39:31,640 --> 00:39:34,960 Speaker 2: still be, like, a good deal for a given human patient. 694 00:39:35,200 --> 00:39:37,480 Speaker 2: And you could also further segment down to people who 695 00:39:37,520 --> 00:39:39,680 Speaker 2: are not at elevated cancer risk; like, this might be 696 00:39:39,719 --> 00:39:43,000 Speaker 2: extra good for them. But it's a tough sort of 697 00:39:43,040 --> 00:39:47,600 Speaker 2: political decision to say, you know, we're utilitarian 698 00:39:47,680 --> 00:39:50,080 Speaker 2: in our healthcare or whatever, and that has not been made. 699 00:39:50,080 --> 00:39:54,160 Speaker 2: And consequently, for a pharma company or whoever's running the 700 00:39:54,160 --> 00:39:58,439 Speaker 2: clinical trial, there's a disincentive to have anything look bad 701 00:39:58,920 --> 00:40:01,319 Speaker 2: because that might tank the whole thing.
Even if, 702 00:40:01,400 --> 00:40:03,600 Speaker 2: like, you get, you know, twice as much 703 00:40:03,600 --> 00:40:06,440 Speaker 2: good stuff but some bad stuff, that's probably bad news 704 00:40:06,440 --> 00:40:09,000 Speaker 2: for approval right now. 705 00:40:23,840 --> 00:40:25,560 Speaker 1: So let me ask you something about this issue that 706 00:40:25,600 --> 00:40:27,560 Speaker 1: you mentioned about looking at one 707 00:40:27,440 --> 00:40:28,160 Speaker 2: disease at a time. 708 00:40:28,520 --> 00:40:31,120 Speaker 1: If you were extrapolating, you know, fifty years from now, 709 00:40:31,160 --> 00:40:34,120 Speaker 1: do you think that the names of the medical professions 710 00:40:34,160 --> 00:40:37,239 Speaker 1: will change, so we don't have a neurologist and a 711 00:40:37,280 --> 00:40:39,960 Speaker 1: cardiologist and a liver specialist and so on, but what 712 00:40:40,000 --> 00:40:42,720 Speaker 1: we have are things that are more general or broad? 713 00:40:43,320 --> 00:40:47,600 Speaker 2: Optimistically, yes; pessimistically, I feel like we will add something 714 00:40:47,680 --> 00:40:50,080 Speaker 2: new and just sort of, like, staple it on top 715 00:40:50,160 --> 00:40:53,480 Speaker 2: of the old, so you'll still have a neurologist and 716 00:40:53,520 --> 00:40:54,000 Speaker 2: so forth. 717 00:40:54,080 --> 00:40:57,120 Speaker 1: Right. If you were in charge, if you were czar 718 00:40:57,480 --> 00:41:00,279 Speaker 1: of the hospital system and could say what it should be, 719 00:41:00,360 --> 00:41:01,839 Speaker 1: what kind of things would you set up?
720 00:41:02,200 --> 00:41:04,920 Speaker 2: I mean, I think I probably would, you know. Like, yes, 721 00:41:05,320 --> 00:41:07,319 Speaker 2: it's not like throw away all the old, right? Like, 722 00:41:07,360 --> 00:41:09,359 Speaker 2: you still have a heart, and, like, if we want 723 00:41:09,400 --> 00:41:11,920 Speaker 2: to know what's going on, someone who knows a lot 724 00:41:11,960 --> 00:41:14,319 Speaker 2: about the heart and can talk in detail about the 725 00:41:14,360 --> 00:41:16,480 Speaker 2: exact rhythm of your heart and what that means for, 726 00:41:16,640 --> 00:41:18,960 Speaker 2: like, your chambers and stuff like that is a good 727 00:41:19,040 --> 00:41:22,200 Speaker 2: thing that we want. Right. Then the question is, like, 728 00:41:22,320 --> 00:41:24,920 Speaker 2: what do we want to lump and 729 00:41:25,000 --> 00:41:26,000 Speaker 2: what do we want to split? 730 00:41:26,280 --> 00:41:26,440 Speaker 1: Right? 731 00:41:26,520 --> 00:41:28,840 Speaker 2: Like, what are the things that we've called different things 732 00:41:28,840 --> 00:41:32,080 Speaker 2: but actually they're a similar thing, right? And then what 733 00:41:32,120 --> 00:41:33,719 Speaker 2: do we want to split up? I think one thing 734 00:41:33,760 --> 00:41:36,959 Speaker 2: for sure we want to split up is, like, Alzheimer's 735 00:41:36,960 --> 00:41:38,960 Speaker 2: and aging.
So if you look at the National Institute 736 00:41:39,000 --> 00:41:42,239 Speaker 2: on Aging right now, something like half the budget is 737 00:41:42,280 --> 00:41:46,000 Speaker 2: earmarked for Alzheimer's research, which is, like, brain-specific and 738 00:41:46,360 --> 00:41:49,960 Speaker 2: just one specific disease, right? And so it's something that 739 00:41:49,960 --> 00:41:51,520 Speaker 2: a lot of people are very afraid of, and so 740 00:41:51,560 --> 00:41:54,279 Speaker 2: that's why it happened, right? But, like, it 741 00:41:54,280 --> 00:41:57,640 Speaker 2: doesn't make sense that that's just lumped in 742 00:41:57,960 --> 00:42:00,319 Speaker 2: under aging. So the things that 743 00:42:00,400 --> 00:42:04,520 Speaker 2: strike me are, like, there is a multi-organ thing 744 00:42:04,600 --> 00:42:09,520 Speaker 2: going on here for sure, like inflammation, immune function. And 745 00:42:09,560 --> 00:42:13,640 Speaker 2: so we obviously have immunologists, and we have people that 746 00:42:13,680 --> 00:42:15,600 Speaker 2: look at, like, the function of the immune system. But 747 00:42:15,640 --> 00:42:22,120 Speaker 2: the overlap between the state of, like, infections and then 748 00:42:22,160 --> 00:42:24,640 Speaker 2: the amount of inflammation you have, and, like, aging and 749 00:42:24,680 --> 00:42:28,480 Speaker 2: the amount of inflammation you have in different organs, like solid tissues, and, 750 00:42:28,480 --> 00:42:31,280 Speaker 2: like, the interplay between 751 00:42:31,320 --> 00:42:35,120 Speaker 2: those, that feels like, you know, something we're 752 00:42:35,200 --> 00:42:39,879 Speaker 2: probably overlooking.
I mean, COVID is sort of a big 753 00:42:39,920 --> 00:42:43,480 Speaker 2: boost here, where people are looking at, like, okay, exposure 754 00:42:43,520 --> 00:42:45,440 Speaker 2: to this virus, what does it do in the long term? 755 00:42:45,520 --> 00:42:47,319 Speaker 2: And a lot of us, especially when we're young, right, 756 00:42:47,320 --> 00:42:50,160 Speaker 2: we're used to, like, okay, got a cold, you know, immune 757 00:42:50,160 --> 00:42:53,279 Speaker 2: system killed the cold, we're fine. It's just, like, a 758 00:42:53,320 --> 00:42:56,520 Speaker 2: separate class of things. But there's more and more evidence that 759 00:42:56,560 --> 00:42:59,879 Speaker 2: it does affect your chance of having Alzheimer's. Like, your 760 00:42:59,880 --> 00:43:04,120 Speaker 2: amount of, like, mouth bacteria correlates well with risk 761 00:43:04,160 --> 00:43:07,320 Speaker 2: of Alzheimer's. It seems like, you know, it needs more research, so 762 00:43:07,719 --> 00:43:11,080 Speaker 2: I think that's one. And then if we just 763 00:43:11,080 --> 00:43:13,480 Speaker 2: think about, like, what goes wrong in your tissues, there are 764 00:43:13,520 --> 00:43:16,960 Speaker 2: sort of some common things that happen. One is, like, 765 00:43:17,040 --> 00:43:20,600 Speaker 2: an out-of-control inflammation loop that leads to fibrosis. 766 00:43:20,600 --> 00:43:23,080 Speaker 2: So this is sort of scarring of your tissue. And 767 00:43:23,120 --> 00:43:26,080 Speaker 2: this will happen in your kidney, in your heart, your liver, 768 00:43:26,280 --> 00:43:29,440 Speaker 2: and your lungs. And sometimes we call it, like, pulmonary 769 00:43:29,440 --> 00:43:34,520 Speaker 2: fibrosis or COPD, chronic obstructive pulmonary disease, so, like, smoker's 770 00:43:34,600 --> 00:43:38,319 Speaker 2: lung, that's when it's real bad. But the process is 771 00:43:38,320 --> 00:43:40,359 Speaker 2: happening in, like, most of your tissues at all times.
772 00:43:40,360 --> 00:43:43,280 Speaker 2: So that's, like, a thing. There's this particular loop 773 00:43:43,560 --> 00:43:48,680 Speaker 2: where something is, like, malfunctioning in the cells of that 774 00:43:48,760 --> 00:43:52,080 Speaker 2: tissue, and then it triggers immune infiltration, and then that 775 00:43:52,120 --> 00:43:55,719 Speaker 2: triggers fibrosis. That could be something that people are 776 00:43:55,719 --> 00:43:57,799 Speaker 2: specialized in, which is true in academia; there are people 777 00:43:57,840 --> 00:44:00,520 Speaker 2: who look at that specifically, right. But medicine 778 00:44:00,600 --> 00:44:02,920 Speaker 2: is, like, different, because then you start thinking, okay, we 779 00:44:03,000 --> 00:44:08,799 Speaker 2: have this drug. Might this drug actually improve multiple organs, right, 780 00:44:08,840 --> 00:44:11,320 Speaker 2: because the same thing is sort of happening? Or maybe, 781 00:44:11,360 --> 00:44:14,959 Speaker 2: like, this drug was approved just for this organ, 782 00:44:15,000 --> 00:44:16,840 Speaker 2: but it won't work in this other organ because 783 00:44:16,880 --> 00:44:19,520 Speaker 2: the biology is a bit different. So those are 784 00:44:19,560 --> 00:44:22,719 Speaker 2: some. And then there's the whole, like, why do we 785 00:44:22,840 --> 00:44:25,360 Speaker 2: lose cells? There's a whole bunch of organs in 786 00:44:25,400 --> 00:44:29,279 Speaker 2: your body that lose cells. The most shocking one might 787 00:44:29,320 --> 00:44:32,560 Speaker 2: be, like, the thymus, and so the thymus here is 788 00:44:32,640 --> 00:44:35,640 Speaker 2: like where your immune cells get told, you know, like, 789 00:44:35,640 --> 00:44:37,560 Speaker 2: what should you attack and what should you not attack?
790 00:44:38,280 --> 00:44:41,080 Speaker 2: And that whole organ is, like, basically replaced with fat 791 00:44:41,080 --> 00:44:44,480 Speaker 2: by age forty. So it's just like, you have an 792 00:44:44,600 --> 00:44:47,280 Speaker 2: organ that you start out with and it's gone. Obviously 793 00:44:47,600 --> 00:44:51,480 Speaker 2: for women, like, your reproductive organs, you get this, 794 00:44:51,920 --> 00:44:53,359 Speaker 2: and for men, I don't know if you get it, 795 00:44:53,400 --> 00:44:55,960 Speaker 2: but that's one. Then there are other organs, right, like 796 00:44:56,000 --> 00:44:58,720 Speaker 2: your brain. You lose cells in your brain, you lose 797 00:44:58,800 --> 00:45:01,000 Speaker 2: cells in your muscle, which also gets replaced with fat. 798 00:45:01,040 --> 00:45:03,040 Speaker 2: So there are some organs where, like, the way they 799 00:45:03,080 --> 00:45:06,480 Speaker 2: fail is that they lose cells over time and 800 00:45:06,520 --> 00:45:09,000 Speaker 2: they don't get restored, because in that organ the cells don't 801 00:45:09,040 --> 00:45:11,400 Speaker 2: really divide, like the cells in your brain very little, 802 00:45:11,600 --> 00:45:14,799 Speaker 2: muscle very little. And so that's another area where there's, 803 00:45:14,840 --> 00:45:19,680 Speaker 2: like, something that happens consistently across tissues that isn't covered 804 00:45:19,760 --> 00:45:22,400 Speaker 2: by a single medical specialty. 805 00:45:22,680 --> 00:45:25,240 Speaker 1: So let's say we extrapolate a thousand years from now, 806 00:45:25,320 --> 00:45:27,759 Speaker 1: and all of the mysteries of biology, where we're now 807 00:45:27,800 --> 00:45:30,280 Speaker 1: sort of at the foot of the mountain, are solved. We've summited 808 00:45:30,440 --> 00:45:33,560 Speaker 1: the peak and we're there. We understand the whole network computationally; 809 00:45:33,600 --> 00:45:35,920 Speaker 1: it's all worked out.
My question is, do you think 810 00:45:35,960 --> 00:45:39,520 Speaker 1: there are natural limits to how long the human body 811 00:45:39,760 --> 00:45:45,120 Speaker 1: can live? Or is longevity something that has no limit 812 00:45:45,160 --> 00:45:45,440 Speaker 1: to it? 813 00:45:46,040 --> 00:45:48,880 Speaker 2: If you make no changes whatsoever to the human body, 814 00:45:49,000 --> 00:45:51,600 Speaker 2: then yes, it does seem like there are natural limits, right? 815 00:45:51,680 --> 00:45:54,480 Speaker 2: Like, few people live over one hundred, and fewer still 816 00:45:54,520 --> 00:45:58,400 Speaker 2: live more than one hundred and twenty, right? But that 817 00:45:58,520 --> 00:46:01,200 Speaker 2: assumes that we don't change anything, that we don't 818 00:46:01,200 --> 00:46:06,360 Speaker 2: have any technology, right? And so if we have endless 819 00:46:06,400 --> 00:46:10,719 Speaker 2: technology and we are willing to change, to some degree, 820 00:46:10,760 --> 00:46:13,520 Speaker 2: like, what a human body is, right, which we 821 00:46:13,560 --> 00:46:16,640 Speaker 2: do already, right? So, like, if you 822 00:46:16,680 --> 00:46:19,799 Speaker 2: lose a leg, you get a robot leg, right? 823 00:46:19,880 --> 00:46:22,719 Speaker 2: You have a cyborg now, right? And similarly if you have 824 00:46:22,760 --> 00:46:25,360 Speaker 2: a pacemaker, it's just less visible. And so I think so 825 00:46:25,480 --> 00:46:30,719 Speaker 2: far we're okay with some cautious pace of, like, we 826 00:46:30,760 --> 00:46:34,320 Speaker 2: are actually augmenting the human body in order to avoid disease.
827 00:46:34,640 --> 00:46:37,680 Speaker 2: Even a vaccine would fall into this category, right? 828 00:46:38,080 --> 00:46:42,120 Speaker 2: And so if you put that in, no, 829 00:46:42,200 --> 00:46:45,520 Speaker 2: I don't think that there's any limit, for some definition 830 00:46:45,600 --> 00:46:48,240 Speaker 2: of the human body. And we can see that because, 831 00:46:48,280 --> 00:46:51,400 Speaker 2: like, the human race still exists, and so many 832 00:46:51,480 --> 00:46:55,960 Speaker 2: generations of humans have produced a new body out of 833 00:46:56,000 --> 00:46:58,839 Speaker 2: a single cell, out of the same DNA code, right? 834 00:46:58,920 --> 00:47:02,720 Speaker 2: Like, life happened once and it's still going after many, 835 00:47:02,800 --> 00:47:06,520 Speaker 2: many millions of years, right? And 836 00:47:06,560 --> 00:47:07,960 Speaker 2: we see that, you know, you have a tree that 837 00:47:08,000 --> 00:47:10,480 Speaker 2: lived five thousand years. That's a very different kind of body, 838 00:47:10,680 --> 00:47:13,400 Speaker 2: but it's possible to set things up to last 839 00:47:13,400 --> 00:47:13,799 Speaker 2: that long. 840 00:47:14,440 --> 00:47:20,520 Speaker 1: And do you see any ethical or philosophical issues about 841 00:47:20,560 --> 00:47:23,279 Speaker 1: what that would be, to have a lifespan of two hundred, 842 00:47:23,320 --> 00:47:24,760 Speaker 1: three hundred, five hundred years? 843 00:47:25,160 --> 00:47:26,920 Speaker 2: Yeah, for sure. I mean, I think there's a lot 844 00:47:27,000 --> 00:47:30,760 Speaker 2: of stuff, and I think that the not very serious 845 00:47:30,880 --> 00:47:34,400 Speaker 2: version of engaging with that is, like, it's weird and 846 00:47:34,560 --> 00:47:37,319 Speaker 2: unnatural and we need to just not do it.
847 00:47:37,880 --> 00:47:40,239 Speaker 2: But by that argument, if you look at, I can 848 00:47:40,280 --> 00:47:43,320 Speaker 2: send you, like, this graph of lifespan over time. 849 00:47:43,880 --> 00:47:45,560 Speaker 2: And so there have been these statements that, like, well, 850 00:47:45,560 --> 00:47:47,360 Speaker 2: it can't go above seventy, it can't go above whatever, so you 851 00:47:47,400 --> 00:47:49,759 Speaker 2: have these, like, horizontal lines, and then you just have, 852 00:47:49,840 --> 00:47:54,120 Speaker 2: like, a straight diagonal line just going up across, you know, 853 00:47:54,239 --> 00:47:57,319 Speaker 2: hundreds of years of, you know, human existence or 854 00:47:57,400 --> 00:48:01,120 Speaker 2: human civilization, right. So anyway, if you want to 855 00:48:01,160 --> 00:48:04,279 Speaker 2: take the stance that, like, living longer is bad because X, 856 00:48:04,320 --> 00:48:06,359 Speaker 2: and there's lots of different X, there's, like, what if 857 00:48:06,440 --> 00:48:09,560 Speaker 2: bad people live longer? What if a tyrant lives longer? Right? 858 00:48:09,920 --> 00:48:12,000 Speaker 2: Or, like, what if we have wealth inequality and then 859 00:48:12,000 --> 00:48:15,920 Speaker 2: it gets perpetuated for longer? Some of those are, like, well, 860 00:48:16,200 --> 00:48:18,359 Speaker 2: you know, like, we have inheritance, so that happens 861 00:48:18,400 --> 00:48:21,239 Speaker 2: anyway or whatever.
But even if, like, that's 862 00:48:21,239 --> 00:48:25,759 Speaker 2: a legitimate issue, I find it fundamentally not serious that 863 00:48:25,840 --> 00:48:29,120 Speaker 2: your proposal is to kill everyone, right? Like, 864 00:48:29,200 --> 00:48:33,680 Speaker 2: let's pretend that we had three hundred year 865 00:48:33,680 --> 00:48:36,600 Speaker 2: lifespans and then we found that, like, oh, we have 866 00:48:36,640 --> 00:48:39,279 Speaker 2: a lot of wealth inequality because the ones who get 867 00:48:39,280 --> 00:48:42,160 Speaker 2: ahead early accumulate more and more over time. Saying, 868 00:48:42,560 --> 00:48:45,200 Speaker 2: let's kill everyone at a young age, is not a 869 00:48:45,239 --> 00:48:50,600 Speaker 2: serious solution to this. Like, let's find a societal solution. 870 00:48:49,680 --> 00:48:51,960 Speaker 1: Philosophically, what does it mean if we're all living three hundred 871 00:48:52,000 --> 00:48:53,960 Speaker 1: and five hundred years? What does that mean 872 00:48:54,000 --> 00:48:54,520 Speaker 1: for society? 873 00:48:54,880 --> 00:48:56,880 Speaker 2: You know, there are concepts that we are used to 874 00:48:57,120 --> 00:49:00,600 Speaker 2: working a certain way that might not work that way. 875 00:49:00,640 --> 00:49:03,640 Speaker 2: The most obvious one is retirement, right? So, like, retirement 876 00:49:03,680 --> 00:49:07,239 Speaker 2: is fundamentally, like, we recognize that humans are not 877 00:49:07,400 --> 00:49:10,880 Speaker 2: very capable, then become capable, and then are less capable 878 00:49:10,880 --> 00:49:13,600 Speaker 2: again in old age, right? And so then we've structured 879 00:49:13,640 --> 00:49:15,680 Speaker 2: society so that the ones who are most capable are 880 00:49:15,719 --> 00:49:19,360 Speaker 2: supporting the ones who are not capable, including children, including 881 00:49:19,400 --> 00:49:22,680 Speaker 2: infirm aged people.
Right. And so that's just the 882 00:49:22,680 --> 00:49:25,719 Speaker 2: way things are. And so obviously, 883 00:49:25,719 --> 00:49:29,000 Speaker 2: if you just stretch that period of, like, competency, the good 884 00:49:29,080 --> 00:49:31,239 Speaker 2: news is that you don't bankrupt Medicare, which is what's 885 00:49:31,280 --> 00:49:33,600 Speaker 2: going to happen by default now, right? But of course it 886 00:49:33,680 --> 00:49:36,480 Speaker 2: means, like, you're not going to retire at sixty or 887 00:49:36,520 --> 00:49:39,360 Speaker 2: sixty-five or whatever it is, right? Let's say we 888 00:49:39,520 --> 00:49:42,360 Speaker 2: really extended lifespan and you live for, like, three hundred years. 889 00:49:42,520 --> 00:49:45,600 Speaker 2: You might have multiple careers in different areas. You might, 890 00:49:45,640 --> 00:49:48,640 Speaker 2: like, work for this long and then go 891 00:49:48,719 --> 00:49:52,080 Speaker 2: on, like, a retirement sabbatical thing where you just go 892 00:49:52,200 --> 00:49:54,200 Speaker 2: back to learning a new thing for ten years, and 893 00:49:54,239 --> 00:49:57,080 Speaker 2: you're not very useful again, because, like, you used to 894 00:49:57,160 --> 00:49:59,120 Speaker 2: be a scholar and now you're going to be an 895 00:49:59,200 --> 00:50:01,839 Speaker 2: artist or whatever it is, right? And then you come 896 00:50:01,880 --> 00:50:04,000 Speaker 2: back and you do a new thing. It'll be interesting 897 00:50:04,000 --> 00:50:05,319 Speaker 2: to see what happens, right? 898 00:50:10,560 --> 00:50:14,040 Speaker 1: That was my interview with Martin Borch Jensen, who's a longevity 899 00:50:14,040 --> 00:50:19,160 Speaker 1: researcher and co-founder and chief scientific officer of Gordian Biotechnology. 900 00:50:19,600 --> 00:50:23,279 Speaker 1: And one of the things this conversation surfaces is why 901 00:50:23,360 --> 00:50:27,799 Speaker 1: the science is so difficult.
Biology is full of feedback 902 00:50:27,880 --> 00:50:32,160 Speaker 1: loops that cause nonlinear responses, and that makes it really 903 00:50:32,200 --> 00:50:36,799 Speaker 1: difficult for us to move from simple experimental observations, like 904 00:50:36,880 --> 00:50:40,040 Speaker 1: the blood level of some molecule that we measure 905 00:50:40,080 --> 00:50:46,560 Speaker 1: in worms or mice, to reliably producing meaningful physiologic changes 906 00:50:46,640 --> 00:50:50,239 Speaker 1: in humans, or even knowing what the right changes to 907 00:50:50,320 --> 00:50:53,280 Speaker 1: aim for are. So how far are we from a world 908 00:50:53,760 --> 00:50:59,520 Speaker 1: where anti-aging therapies are as routine as vaccinations? It's 909 00:50:59,560 --> 00:51:03,799 Speaker 1: not happening anytime soon. But on the other hand, we 910 00:51:03,920 --> 00:51:07,440 Speaker 1: now have billions of young brains on this planet getting 911 00:51:07,560 --> 00:51:12,000 Speaker 1: great educations, and with the exponential pace of technology and 912 00:51:12,040 --> 00:51:16,319 Speaker 1: the increasing pace of medical innovation, maybe we can get there. 913 00:51:16,360 --> 00:51:19,800 Speaker 1: Maybe we can not only prevent people from dying young, 914 00:51:19,920 --> 00:51:22,719 Speaker 1: as we have done over the past couple centuries, but 915 00:51:22,840 --> 00:51:28,400 Speaker 1: also extend a healthy lifespan, such that we actually will 916 00:51:28,480 --> 00:51:34,839 Speaker 1: someday give heartbroken funereal speeches lamenting a person dying at 917 00:51:34,880 --> 00:51:36,719 Speaker 1: the tender young age of one 918 00:51:36,680 --> 00:51:38,440 Speaker 2: hundred and twenty-two years old.
919 00:51:38,800 --> 00:51:44,280 Speaker 1: Perhaps we will come to understand the giant biochemical puzzle 920 00:51:44,480 --> 00:51:49,080 Speaker 1: of ourselves and actually be able to shift around some 921 00:51:49,200 --> 00:51:52,799 Speaker 1: of the pieces. Perhaps a few of us listening to 922 00:51:52,880 --> 00:51:57,839 Speaker 1: this podcast may just be lucky enough to live to 923 00:51:58,000 --> 00:52:02,440 Speaker 1: see that day, and then to see lots of days 924 00:52:02,960 --> 00:52:09,640 Speaker 1: past that as well. Go to eagleman dot com slash 925 00:52:09,719 --> 00:52:13,319 Speaker 1: podcast for more information and to find further reading. Send 926 00:52:13,320 --> 00:52:16,480 Speaker 1: me an email at podcasts at eagleman dot com with 927 00:52:16,600 --> 00:52:20,120 Speaker 1: questions or discussion, and check out and subscribe to Inner 928 00:52:20,200 --> 00:52:23,960 Speaker 1: Cosmos on YouTube for videos of each episode and to 929 00:52:24,080 --> 00:52:28,520 Speaker 1: leave comments. Until next time, I'm David Eagleman, and this 930 00:52:28,760 --> 00:52:30,080 Speaker 1: is Inner Cosmos.