1 00:00:15,356 --> 00:00:15,796 Speaker 1: Pushkin. 2 00:00:20,396 --> 00:00:25,476 Speaker 2: There are these amazing cells in tiny human embryos. The 3 00:00:25,556 --> 00:00:29,876 Speaker 2: cells are called pluripotent stem cells, and they're amazing because 4 00:00:29,996 --> 00:00:33,996 Speaker 2: they can become any kind of human cell, a red 5 00:00:34,036 --> 00:00:37,596 Speaker 2: blood cell, a skin cell, anything, any cell in your body. 6 00:00:38,396 --> 00:00:43,076 Speaker 2: But pluripotent stem cells only exist for the first fourteen 7 00:00:43,196 --> 00:00:48,276 Speaker 2: days of embryonic development. After that, they're gone forever. At 8 00:00:48,316 --> 00:00:51,396 Speaker 2: least we used to think they were gone forever. And 9 00:00:51,396 --> 00:00:55,116 Speaker 2: then about twenty years ago, researchers figured out how to 10 00:00:55,156 --> 00:00:58,876 Speaker 2: take regular cells from adults, blood cells or skin cells 11 00:00:58,956 --> 00:01:01,956 Speaker 2: or whatever. Take those cells, bring them into the lab, 12 00:01:02,036 --> 00:01:07,116 Speaker 2: and then turn them back into pluripotent stem cells. These 13 00:01:07,156 --> 00:01:12,036 Speaker 2: cells are called induced pluripotent stem cells, or iPSCs, and 14 00:01:12,236 --> 00:01:16,116 Speaker 2: the possibilities they present for human health are both kind 15 00:01:16,116 --> 00:01:20,116 Speaker 2: of obvious and awesome. If a person has a disease, 16 00:01:20,556 --> 00:01:24,556 Speaker 2: you could use that person's own cells to grow any 17 00:01:24,676 --> 00:01:27,196 Speaker 2: kind of new cells that they might need. You could 18 00:01:27,196 --> 00:01:30,516 Speaker 2: grow new brain cells for Parkinson's disease, or new heart 19 00:01:30,596 --> 00:01:33,996 Speaker 2: muscle cells for heart failure, or new bone marrow cells 20 00:01:33,996 --> 00:01:37,876 Speaker 2: for leukemia patients.
It has taken a long time to 21 00:01:38,076 --> 00:01:40,516 Speaker 2: put that dream into practice, and in fact, it's not 22 00:01:40,596 --> 00:01:45,116 Speaker 2: really solved yet, but researchers are getting close. A bunch 23 00:01:45,196 --> 00:01:49,036 Speaker 2: of clinical trials are now underway using iPSCs to treat 24 00:01:49,076 --> 00:01:54,036 Speaker 2: everything from Parkinson's disease to cancer to macular degeneration. But 25 00:01:54,276 --> 00:01:57,956 Speaker 2: even if these clinical trials are successful, there will be 26 00:01:58,156 --> 00:02:03,396 Speaker 2: another problem to solve. Turning a patient's cells into iPSCs 27 00:02:03,516 --> 00:02:06,636 Speaker 2: and then into whatever kind of cells they need takes 28 00:02:06,916 --> 00:02:10,876 Speaker 2: months of work by highly trained scientists. It costs hundreds 29 00:02:10,916 --> 00:02:14,556 Speaker 2: of thousands of dollars for each patient, and so even 30 00:02:14,556 --> 00:02:18,036 Speaker 2: if those clinical trials are successful, the process of making 31 00:02:18,036 --> 00:02:20,756 Speaker 2: the cells will still be too expensive and too labor 32 00:02:20,756 --> 00:02:24,676 Speaker 2: intensive to ever benefit, you know, millions of patients. So 33 00:02:25,036 --> 00:02:27,436 Speaker 2: if the dream of iPSCs is going to come true, 34 00:02:27,836 --> 00:02:30,956 Speaker 2: somebody needs to figure out a faster, cheaper way to 35 00:02:31,076 --> 00:02:40,276 Speaker 2: make them. I'm Jacob Goldstein and this is What's Your Problem, 36 00:02:40,436 --> 00:02:42,156 Speaker 2: the show where I talk to people who are trying 37 00:02:42,236 --> 00:02:46,236 Speaker 2: to make technological progress. My guest today is Nabiha Saklayen. 38 00:02:46,436 --> 00:02:49,596 Speaker 2: She's the co-founder and CEO of a company called Cellino.
39 00:02:50,396 --> 00:02:54,596 Speaker 2: Her problem is this, how can you make iPSC therapies 40 00:02:54,676 --> 00:02:59,756 Speaker 2: quickly and cheaply? Before she got into the induced pluripotent 41 00:02:59,836 --> 00:03:03,396 Speaker 2: stem cell business, Nabiha was studying to be a physicist, 42 00:03:03,636 --> 00:03:05,556 Speaker 2: and she loved physics, was going off to get a 43 00:03:05,556 --> 00:03:09,636 Speaker 2: PhD in physics at Harvard, but just before she started school, 44 00:03:09,796 --> 00:03:12,436 Speaker 2: her grandmother died, and she told me that it had a 45 00:03:12,476 --> 00:03:15,476 Speaker 2: really profound effect on how she thought about her research 46 00:03:15,636 --> 00:03:16,476 Speaker 2: and her career. 47 00:03:17,516 --> 00:03:20,916 Speaker 3: My grandma died due to severe diabetes that was not 48 00:03:21,996 --> 00:03:26,836 Speaker 3: possible to control with insulin and other medications, and that 49 00:03:26,996 --> 00:03:30,636 Speaker 3: just, I felt really helpless. I felt helpless, and I 50 00:03:30,756 --> 00:03:35,116 Speaker 3: felt this urge to: Okay, I don't feel comfortable going 51 00:03:35,156 --> 00:03:39,316 Speaker 3: down this intellectual curiosity path of becoming a physicist, and like, 52 00:03:39,396 --> 00:03:42,356 Speaker 3: what can I do to build better tools? There must 53 00:03:42,356 --> 00:03:46,476 Speaker 3: be better tools that are necessary if my grandma died, 54 00:03:47,156 --> 00:03:49,836 Speaker 3: pretty odd, you know, due to diabetes, and there was 55 00:03:49,876 --> 00:03:52,396 Speaker 3: nothing anyone could really do about it. 56 00:03:52,916 --> 00:03:55,316 Speaker 2: So she started thinking about how to use physics to 57 00:03:55,436 --> 00:03:58,916 Speaker 2: improve human health.
For her graduate work, she figured out 58 00:03:58,916 --> 00:04:02,716 Speaker 2: how to use lasers to make tiny holes in cell walls, 59 00:04:03,396 --> 00:04:06,236 Speaker 2: and she started talking to all the researchers that she 60 00:04:06,276 --> 00:04:11,356 Speaker 2: could to try and find useful applications for her research. Eventually, 61 00:04:11,476 --> 00:04:15,356 Speaker 2: someone told her about induced pluripotent stem cells, about iPSCs, 62 00:04:15,916 --> 00:04:17,676 Speaker 2: and she realized that she might be able to use 63 00:04:17,756 --> 00:04:21,716 Speaker 2: lasers to help automate the process of cultivating iPSCs. 64 00:04:22,076 --> 00:04:25,716 Speaker 3: iPSCs are extra complicated and special, I thought. I like 65 00:04:25,756 --> 00:04:27,956 Speaker 3: to call them special because you actually have to go 66 00:04:28,076 --> 00:04:30,996 Speaker 3: and like scrape bad cells away with a pipette. 67 00:04:31,396 --> 00:04:32,596 Speaker 4: It's super artisanal. 68 00:04:32,716 --> 00:04:36,076 Speaker 3: So you're basically having these brilliant scientists looking under a microscope, 69 00:04:36,636 --> 00:04:39,636 Speaker 3: holding cells in the dish, and then scraping with a 70 00:04:39,676 --> 00:04:44,316 Speaker 3: pipette. And they're literally working ten hours a 71 00:04:44,396 --> 00:04:47,756 Speaker 3: day scraping cells, not taking vacations, trying to get to 72 00:04:47,796 --> 00:04:49,956 Speaker 3: work during the craziest snowstorms. 73 00:04:49,396 --> 00:04:52,116 Speaker 4: Because if they don't show up, that run dies. 74 00:04:52,756 --> 00:04:55,516 Speaker 2: Uh huh.
And presumably, I mean if you think of 75 00:04:55,636 --> 00:04:58,036 Speaker 2: actually getting it to the point where you can treat patients, 76 00:04:58,876 --> 00:05:03,436 Speaker 2: it would be sort of impossibly expensive slash small scale, right, 77 00:05:03,516 --> 00:05:06,996 Speaker 2: Like you could never do that for one hundred thousand 78 00:05:07,036 --> 00:05:08,476 Speaker 2: people or something, right. 79 00:05:09,236 --> 00:05:11,356 Speaker 4: Right, we don't have enough scientists in the world. 80 00:05:11,756 --> 00:05:15,076 Speaker 2: And of course that like highly skilled labor means like 81 00:05:15,276 --> 00:05:17,756 Speaker 2: even more expensive than expensive drugs usually are. 82 00:05:17,836 --> 00:05:19,316 Speaker 4: Right. Presumably that's right. 83 00:05:19,396 --> 00:05:22,836 Speaker 3: And I think the cost estimates have gotten down over 84 00:05:22,876 --> 00:05:24,956 Speaker 3: the past couple years because people have figured out how 85 00:05:24,996 --> 00:05:27,996 Speaker 3: to do better biology, but we're still in the hundreds 86 00:05:27,996 --> 00:05:29,196 Speaker 3: of thousands of dollars. 87 00:05:29,876 --> 00:05:33,356 Speaker 2: So you're setting out to solve this problem of how 88 00:05:33,396 --> 00:05:37,236 Speaker 2: to grow iPSCs at scale, and in particular right this 89 00:05:37,396 --> 00:05:40,356 Speaker 2: problem of getting rid of the bad cell colonies without 90 00:05:40,356 --> 00:05:43,396 Speaker 2: disturbing the good ones. How do you figure out how 91 00:05:43,436 --> 00:05:43,836 Speaker 2: to do that? 92 00:05:44,596 --> 00:05:46,196 Speaker 3: So one of the things we did at the time 93 00:05:46,316 --> 00:05:49,356 Speaker 3: was we spent quite a bit of time with biologists, 94 00:05:49,756 --> 00:05:51,916 Speaker 3: so we were trying to understand their workflow.
What do 95 00:05:51,996 --> 00:05:53,396 Speaker 3: they do, what do they do in the lab, what 96 00:05:53,436 --> 00:05:54,436 Speaker 3: are they looking for? 97 00:05:54,556 --> 00:05:59,036 Speaker 4: How are they doing their hand gestures? And there's a 98 00:05:59,036 --> 00:06:01,196 Speaker 4: lot of tilting, there's like tapping. 99 00:06:01,636 --> 00:06:05,516 Speaker 2: This is the artisanal sort of separating the cells process exactly. 100 00:06:05,556 --> 00:06:08,476 Speaker 3: The entire iPSC manufacturing process can be on the order 101 00:06:08,516 --> 00:06:10,916 Speaker 3: of three to four months, and there are many different parts 102 00:06:10,956 --> 00:06:14,196 Speaker 3: to it. And certain scientists have certain features that they 103 00:06:14,236 --> 00:06:17,316 Speaker 3: see by eye. So some will describe smiley faces, some 104 00:06:17,396 --> 00:06:19,756 Speaker 3: will describe other features that they're seeing. 105 00:06:19,516 --> 00:06:22,836 Speaker 2: Meaning like they look through the microscope to distinguish the 106 00:06:22,876 --> 00:06:24,956 Speaker 2: good from the bad. That's right, and they sort of 107 00:06:24,996 --> 00:06:25,836 Speaker 2: know it when they see it. 108 00:06:25,996 --> 00:06:26,396 Speaker 4: That's right. 109 00:06:26,476 --> 00:06:28,916 Speaker 3: And you have a whole range of how good our 110 00:06:28,996 --> 00:06:33,076 Speaker 3: scientists are globally. So the best scientists are sitting 111 00:06:33,876 --> 00:06:36,316 Speaker 3: inside cleanrooms looking at these cells and trying to decipher 112 00:06:36,356 --> 00:06:39,116 Speaker 3: the best cells. And the stakes are high because of 113 00:06:39,116 --> 00:06:41,596 Speaker 3: the way they're manufacturing their cells. 114 00:06:42,196 --> 00:06:45,076 Speaker 3: They don't get to necessarily test the cells until you've 115 00:06:45,236 --> 00:06:46,396 Speaker 3: done the entire process.
116 00:06:46,476 --> 00:06:48,556 Speaker 2: Yeah. So, just to be clear, like this is a 117 00:06:48,636 --> 00:06:52,396 Speaker 2: disaster for drug manufacturing, right, like not even for safety, 118 00:06:52,436 --> 00:06:55,716 Speaker 2: but just like it's never going to work, right, It's 119 00:06:55,756 --> 00:06:57,996 Speaker 2: never gonna work at scale. Sure it could work for 120 00:06:58,036 --> 00:07:00,676 Speaker 2: research for a while, but like you or somebody like 121 00:07:00,716 --> 00:07:02,796 Speaker 2: you needs to come along to automate this, right. 122 00:07:03,396 --> 00:07:06,556 Speaker 3: I'm so glad I never thought about how hard it 123 00:07:06,596 --> 00:07:09,316 Speaker 3: was going to be and just went for it. If 124 00:07:09,356 --> 00:07:12,076 Speaker 3: I knew how hard it was going to be, like 125 00:07:12,116 --> 00:07:15,476 Speaker 3: startups are hard, but I think we had this curiosity. 126 00:07:15,756 --> 00:07:18,636 Speaker 3: We're a bunch of physicists, we had energy, we were passionate, 127 00:07:19,316 --> 00:07:21,196 Speaker 3: and my co-founder Marinna and I, we are very 128 00:07:21,236 --> 00:07:24,876 Speaker 3: passionate about being in medicine and using our physics knowledge 129 00:07:25,316 --> 00:07:27,956 Speaker 3: for medicine. So we were like, Okay, why don't we 130 00:07:27,996 --> 00:07:29,916 Speaker 3: try, why don't we figure out how to do it? 131 00:07:29,956 --> 00:07:35,396 Speaker 3: And between twenty twenty and now, so many things have 132 00:07:35,516 --> 00:07:39,156 Speaker 3: happened on the science and technology front at Cellino that 133 00:07:39,276 --> 00:07:41,436 Speaker 3: I would label as impossible. 134 00:07:42,236 --> 00:07:44,956 Speaker 4: I think, for at least the first few years
135 00:07:44,756 --> 00:07:46,196 Speaker 3: of that phase, I was like, I don't know if 136 00:07:46,236 --> 00:07:49,316 Speaker 3: any of this will work, and it was hard for 137 00:07:49,356 --> 00:07:52,356 Speaker 3: me to come to terms with given that we had 138 00:07:52,556 --> 00:07:53,516 Speaker 3: raised a big round. 139 00:07:53,596 --> 00:07:55,356 Speaker 4: You know, a lot of people were talking about 140 00:07:55,196 --> 00:07:58,316 Speaker 3: this, but I also felt, I kept telling my team, 141 00:07:58,356 --> 00:08:00,516 Speaker 3: it's like, we have to give it our best shot, 142 00:08:00,796 --> 00:08:05,796 Speaker 3: because if a team like ours isn't brave enough to 143 00:08:05,916 --> 00:08:08,316 Speaker 3: try to go after this, this might not be resolved 144 00:08:08,676 --> 00:08:11,916 Speaker 3: for a few decades. And what that means is we 145 00:08:12,476 --> 00:08:16,516 Speaker 3: are at risk of falling into this pattern of pushing 146 00:08:16,596 --> 00:08:21,236 Speaker 3: through complex manufacturing, of cell and gene therapy products, even getting 147 00:08:21,276 --> 00:08:24,716 Speaker 3: them through a Phase three approval, but once they hit commercial, 148 00:08:25,276 --> 00:08:27,996 Speaker 3: they're not reaching the patients. 149 00:08:28,516 --> 00:08:31,916 Speaker 2: Meaning even if it works with the kind of technology 150 00:08:31,916 --> 00:08:34,836 Speaker 2: that existed before you came along, even if it works therapeutically, 151 00:08:35,356 --> 00:08:38,836 Speaker 2: it'll be so sort of bespoke and expensive that most 152 00:08:38,876 --> 00:08:40,476 Speaker 2: people in the world aren't going to get it even 153 00:08:40,516 --> 00:08:41,956 Speaker 2: if they need it. Is that what that means? 154 00:08:43,116 --> 00:08:45,996 Speaker 3: That is what that means. And you know iPSCs, it's 155 00:08:46,036 --> 00:08:49,396 Speaker 3: still to be determined.
We don't have any first approval, 156 00:08:49,476 --> 00:08:51,476 Speaker 3: so let's see how that goes. But it is the 157 00:08:51,516 --> 00:08:56,036 Speaker 3: most complex manufacturing process that I've seen so far. Other areas, 158 00:08:56,076 --> 00:08:59,356 Speaker 3: other examples I can point to are CAR-T therapies. CAR-T 159 00:08:59,396 --> 00:09:01,236 Speaker 3: therapy is curative, incredible. 160 00:09:01,676 --> 00:09:04,036 Speaker 2: This is another cell therapy, in the domain of, like, 161 00:09:04,116 --> 00:09:06,556 Speaker 2: let's take cells and develop them and give them to 162 00:09:06,596 --> 00:09:10,996 Speaker 2: the patient. CAR-T therapy is sort of the signal 163 00:09:11,036 --> 00:09:12,596 Speaker 2: achievement of cell therapy so far. 164 00:09:12,756 --> 00:09:13,516 Speaker 4: Right, it was. 165 00:09:13,556 --> 00:09:17,436 Speaker 3: Huge, and that was happening, that approval was coming up, 166 00:09:18,516 --> 00:09:21,756 Speaker 3: when I was starting Cellino, so it actually was very 167 00:09:21,796 --> 00:09:24,716 Speaker 3: motivating to see, Wow, now we can actually reach the 168 00:09:24,796 --> 00:09:27,756 Speaker 3: point of curing previously incurable disease. But it's been 169 00:09:27,796 --> 00:09:32,156 Speaker 3: interesting to see, in the CAR-T space, every year, the 170 00:09:32,196 --> 00:09:34,716 Speaker 3: maximum number of patients being dosed with a CAR-T in 171 00:09:34,716 --> 00:09:38,436 Speaker 3: a year is on average still ten thousand patients annually. 172 00:09:38,556 --> 00:09:40,836 Speaker 2: Which is a small number. Which is a small number, 173 00:09:40,996 --> 00:09:43,476 Speaker 2: versus the hundreds of thousands or millions who might benefit from 174 00:09:43,476 --> 00:09:44,636 Speaker 2: the treatment. Correct.
175 00:09:44,316 --> 00:09:47,916 Speaker 3: And now the scope is expanding to solid tumors and autoimmunity, 176 00:09:47,996 --> 00:09:51,116 Speaker 3: So like, why are we stuck? We're stuck at this number. 177 00:09:51,116 --> 00:09:52,556 Speaker 3: There's an infrastructure problem. 178 00:09:52,636 --> 00:09:56,156 Speaker 2: It's a scale problem. It's a lack of manufacturing at scale. 179 00:09:56,476 --> 00:10:00,996 Speaker 3: That's correct, and so I do feel very strongly that 180 00:10:01,036 --> 00:10:06,356 Speaker 3: it's important to push forward trials and build for scale 181 00:10:06,516 --> 00:10:09,916 Speaker 3: from the get go. But it's hard because it's tempting 182 00:10:09,956 --> 00:10:12,036 Speaker 3: to take shortcuts and like, oh, we could do this 183 00:10:12,156 --> 00:10:14,636 Speaker 3: and this will be much faster. And the question I'm 184 00:10:14,636 --> 00:10:17,756 Speaker 3: always asking my team is Okay, great, so let's take 185 00:10:17,756 --> 00:10:20,916 Speaker 3: a step back, but how will this solution scale? Could this 186 00:10:21,036 --> 00:10:23,476 Speaker 3: address a million patients annually? 187 00:10:23,636 --> 00:10:25,876 Speaker 2: Yeah, because there are a bunch of sort of kludges 188 00:10:25,876 --> 00:10:27,836 Speaker 2: where you could make a thing that works and it's 189 00:10:27,876 --> 00:10:30,996 Speaker 2: more efficient than what people do today, but it's still 190 00:10:31,116 --> 00:10:33,636 Speaker 2: kind of kludgy and artisanal and not great to scale. Like, 191 00:10:33,676 --> 00:10:34,596 Speaker 2: that's the trade off. 192 00:10:35,116 --> 00:10:37,676 Speaker 3: That is the trade off, and that's an everyday trade off.
193 00:10:37,676 --> 00:10:40,316 Speaker 3: And I feel very blessed that we've been at it 194 00:10:40,396 --> 00:10:43,436 Speaker 3: for quite some time, building the different layers, and we're 195 00:10:43,476 --> 00:10:46,076 Speaker 3: at this point now where we're building our closed system. 196 00:10:46,156 --> 00:10:51,636 Speaker 3: It's called Nebula, Okay, and it's the next version. 197 00:10:51,676 --> 00:10:54,356 Speaker 3: It's the version I never imagined even when we were 198 00:10:54,356 --> 00:10:58,516 Speaker 3: building the laser based technology, which we call the optical bioprocess. 199 00:10:59,396 --> 00:11:03,276 Speaker 3: But the idea is, in order to ultimately scale down cost, 200 00:11:04,596 --> 00:11:08,836 Speaker 3: you know, it's always great to run high precision manufacturing, 201 00:11:09,636 --> 00:11:13,476 Speaker 3: so we have that down, but the cleanroom costs are still. 202 00:11:13,276 --> 00:11:17,516 Speaker 2: High, because again you're dealing with living cells and they 203 00:11:17,556 --> 00:11:20,396 Speaker 2: can't get contaminated because that would be disastrous. So you 204 00:11:20,476 --> 00:11:23,036 Speaker 2: have to have a crazy super sterile environment. 205 00:11:23,796 --> 00:11:27,756 Speaker 3: And we're doing many patients, and you want to parallelize 206 00:11:27,756 --> 00:11:29,796 Speaker 3: as much as possible, and you want to protect all 207 00:11:29,836 --> 00:11:34,276 Speaker 3: the patient samples from each other. So what we're building now, 208 00:11:34,276 --> 00:11:38,876 Speaker 3: which is even harder but exciting, and we're making some 209 00:11:38,916 --> 00:11:43,796 Speaker 3: great progress, is the closed version of this, so building 210 00:11:43,916 --> 00:11:49,396 Speaker 3: cassettes that could ultimately be the clean space where all 211 00:11:49,436 --> 00:11:52,316 Speaker 3: the manufacturing happens.
It's the size of your iPhone, which 212 00:11:52,356 --> 00:11:53,756 Speaker 3: is very exciting, huh. 213 00:11:53,796 --> 00:11:55,996 Speaker 2: So basically, the size of your iPhone is a 214 00:11:56,076 --> 00:12:00,876 Speaker 2: tiny clean room where you are cultivating induced pluripotent 215 00:12:00,916 --> 00:12:04,116 Speaker 2: stem cells. It all happens inside that little box. 216 00:12:04,716 --> 00:12:05,876 Speaker 4: That is what's being designed. 217 00:12:05,916 --> 00:12:08,876 Speaker 3: I just want to put lots of little stars next 218 00:12:08,916 --> 00:12:11,756 Speaker 3: to it, because this is all in collaboration with the FDA. 219 00:12:12,156 --> 00:12:15,276 Speaker 3: You know, the type of vision we're portraying is not 220 00:12:15,476 --> 00:12:18,916 Speaker 3: how cell therapy manufacturing is happening today. So we're gonna 221 00:12:18,916 --> 00:12:21,076 Speaker 3: work with the FDA and we're going to present data 222 00:12:21,116 --> 00:12:23,596 Speaker 3: and we're going to work with our clinical collaborators. But yes, 223 00:12:23,636 --> 00:12:26,876 Speaker 3: wouldn't that be awesome, because then we have 224 00:12:27,076 --> 00:12:31,276 Speaker 3: these autonomous systems with closed cassettes. We don't need to 225 00:12:31,396 --> 00:12:33,636 Speaker 3: establish high grade cleanrooms. So you get a ton of 226 00:12:33,636 --> 00:12:38,156 Speaker 3: flexibility on setting up manufacturing in places where you might 227 00:12:38,156 --> 00:12:40,876 Speaker 3: not have a clean room, you might not have academic centers 228 00:12:40,876 --> 00:12:42,636 Speaker 3: of excellence, you might not have the workforce.
229 00:12:42,916 --> 00:12:46,076 Speaker 2: Yeah, like the dream is it happens inside a machine essentially, 230 00:12:46,156 --> 00:12:49,076 Speaker 2: like you buy the machine, and all the artisanal knowledge 231 00:12:49,076 --> 00:12:51,156 Speaker 2: and the clean room and everything in the machine 232 00:12:51,196 --> 00:12:52,076 Speaker 2: is embedded. 233 00:12:51,676 --> 00:12:55,596 Speaker 4: In the machine, intelligent machines that make your best stem cells. 234 00:12:55,716 --> 00:12:59,676 Speaker 2: Yep. So we've talked about the past, and 235 00:12:59,676 --> 00:13:01,436 Speaker 2: now we've talked about the dream for the future. Let's 236 00:13:01,436 --> 00:13:03,436 Speaker 2: talk about the present. I read, is it right that 237 00:13:03,476 --> 00:13:08,516 Speaker 2: there's a Phase one trial in Parkinson's that you're involved in? 238 00:13:08,556 --> 00:13:09,636 Speaker 2: What's with that? 239 00:13:10,036 --> 00:13:14,556 Speaker 3: Yes, we had a very exciting year. We found our 240 00:13:14,636 --> 00:13:21,356 Speaker 3: first clinical collaborator. It's the Parkinson's Cell Therapy team at 241 00:13:21,436 --> 00:13:26,236 Speaker 3: Mass General Brigham, and it's been great to work with them. 242 00:13:26,276 --> 00:13:29,956 Speaker 3: They have a phase one running where they're making the 243 00:13:30,036 --> 00:13:34,956 Speaker 3: patient's own iPSCs and their own neurons, and these neurons 244 00:13:34,956 --> 00:13:38,836 Speaker 3: are being transplanted into the brain, okay, and we're working 245 00:13:38,836 --> 00:13:42,196 Speaker 3: together to do the transfer of the manufacturing into our 246 00:13:42,276 --> 00:13:44,636 Speaker 3: automated platform. And there's a lot of back and forth 247 00:13:44,716 --> 00:13:47,956 Speaker 3: right now on getting that collaboration up and running.
248 00:13:48,156 --> 00:13:51,636 Speaker 2: And so just to be clear, that trial exists now 249 00:13:51,676 --> 00:13:55,076 Speaker 2: and they're currently doing it in the artisanal, old fashioned, 250 00:13:55,596 --> 00:13:58,396 Speaker 2: hard and expensive way, and the hope is that you 251 00:13:58,436 --> 00:14:03,276 Speaker 2: will come in and do it in your faster, cheaper way. 252 00:14:04,036 --> 00:14:05,036 Speaker 4: That's exactly right. 253 00:14:05,316 --> 00:14:10,956 Speaker 3: They're incredibly brave scientists and clinicians, and they've put in 254 00:14:10,956 --> 00:14:14,356 Speaker 3: the hard work over the past few decades to get 255 00:14:14,396 --> 00:14:19,636 Speaker 3: this bioprocess up and running, invent new surgical techniques, design 256 00:14:19,676 --> 00:14:23,396 Speaker 3: the clinical trial, and yeah, we want to support them 257 00:14:23,436 --> 00:14:26,396 Speaker 3: to scale through the trial, and then how does it 258 00:14:26,396 --> 00:14:30,036 Speaker 3: get into commercial. So it's been really exciting. And one 259 00:14:30,076 --> 00:14:32,076 Speaker 3: of the things I love about this collaboration is we're 260 00:14:32,116 --> 00:14:36,756 Speaker 3: geographically very close, because we're in Cambridge and they're across 261 00:14:36,836 --> 00:14:39,276 Speaker 3: the river. I think it's like a mile or 262 00:14:39,316 --> 00:14:44,556 Speaker 3: two from us. So the ability to connect dots has 263 00:14:44,556 --> 00:14:49,156 Speaker 3: been incredible, because the manufacturing is happening inside the hospital. 264 00:14:50,196 --> 00:14:52,956 Speaker 2: Interesting, like you're building a machine in the hospital. 265 00:14:53,116 --> 00:14:55,476 Speaker 3: We will be deploying the machine in the hospital. Yes, 266 00:14:55,796 --> 00:14:57,956 Speaker 3: the machines are being built at our headquarters. 267 00:14:57,956 --> 00:15:00,516 Speaker 2: Now I shouldn't say built.
Yeah, so you're going to 268 00:15:01,036 --> 00:15:03,356 Speaker 2: drive, put the machine in a truck. How big is 269 00:15:03,396 --> 00:15:05,356 Speaker 2: the machine, by the way, you know. 270 00:15:06,636 --> 00:15:10,956 Speaker 3: Maybe like one or two refrigerators. Yeah, so that's how 271 00:15:10,956 --> 00:15:12,596 Speaker 3: big they are. And the cassettes are about the size 272 00:15:12,596 --> 00:15:17,316 Speaker 3: of your iPhone. And initially we'll be deploying the machines to 273 00:15:17,356 --> 00:15:19,716 Speaker 3: be making cells for one patient at a time, 274 00:15:19,836 --> 00:15:22,676 Speaker 3: and then as we build more evidence, we'd love to 275 00:15:22,716 --> 00:15:24,916 Speaker 3: make cells for more and more patients on each machine. 276 00:15:25,116 --> 00:15:30,196 Speaker 2: And will that be the first time your cells are 277 00:15:30,236 --> 00:15:31,956 Speaker 2: going into patients? 278 00:15:31,676 --> 00:15:32,396 Speaker 4: Yes, I believe. 279 00:15:32,476 --> 00:15:32,556 Speaker 1: So. 280 00:15:33,156 --> 00:15:37,236 Speaker 3: We do have two other amazing collaborators, one in South Korea. 281 00:15:37,276 --> 00:15:39,636 Speaker 3: They're working on peripheral artery disease. They also have 282 00:15:39,716 --> 00:15:43,076 Speaker 3: a phase one trial. They're going to have a US 283 00:15:43,876 --> 00:15:48,116 Speaker 3: expansion, and then a spinal cord injury company as well. 284 00:15:48,116 --> 00:15:52,036 Speaker 3: But yeah, the Mass General team is doing great. We're 285 00:15:52,076 --> 00:15:58,476 Speaker 3: also leaning into collaborating with the FDA. We've had a 286 00:15:58,596 --> 00:16:02,156 Speaker 3: very positive experience working with them.
Earlier this year we 287 00:16:02,156 --> 00:16:07,436 Speaker 3: were granted our Advanced Manufacturing Technology designation for iPSC generation, 288 00:16:08,476 --> 00:16:11,436 Speaker 3: and I think that maybe that's the biggest surprise 289 00:16:11,476 --> 00:16:15,836 Speaker 3: of the year, how technology forward the FDA is, 290 00:16:15,876 --> 00:16:18,716 Speaker 3: and they've been paying attention. They've been asking great questions 291 00:16:18,716 --> 00:16:21,156 Speaker 3: in all of our meetings. They have a full stack 292 00:16:21,276 --> 00:16:26,156 Speaker 3: AI team that is fully tuned into how our models 293 00:16:26,156 --> 00:16:29,756 Speaker 3: are being trained, like what's behind the hood. So 294 00:16:29,796 --> 00:16:32,756 Speaker 3: it's really really compelling. Yeah, because I think the last 295 00:16:32,796 --> 00:16:36,036 Speaker 3: ten years of cell and gene therapies have been transformative in 296 00:16:36,116 --> 00:16:40,796 Speaker 3: terms of curative medicines, but everybody is missing the impact of 297 00:16:40,756 --> 00:16:42,516 Speaker 4: scale, like including the regulator. 298 00:16:42,556 --> 00:16:46,916 Speaker 3: So they've now taken it upon themselves to sort of help 299 00:16:47,476 --> 00:16:50,956 Speaker 3: the field think about scale and work with technology companies 300 00:16:51,036 --> 00:16:56,916 Speaker 3: like ourselves. So it's incredible, it's special. 301 00:16:56,996 --> 00:17:01,396 Speaker 2: And just briefly, like, from say the patient's point of view, 302 00:17:01,876 --> 00:17:03,516 Speaker 2: how will it work? Simply? 303 00:17:04,276 --> 00:17:06,516 Speaker 4: Yeah, I mean you know, we can take the Mass 304 00:17:06,516 --> 00:17:07,716 Speaker 4: General Brigham. 305 00:17:07,396 --> 00:17:09,076 Speaker 2: Yeah, in that case, what will happen?
306 00:17:09,756 --> 00:17:13,076 Speaker 3: In that case, they'll see a physician and 307 00:17:13,116 --> 00:17:16,236 Speaker 3: they'll get a diagnosis for their disease. And in this 308 00:17:16,316 --> 00:17:20,476 Speaker 3: case it's Parkinson's. Then they'll get a prescription that says 309 00:17:21,516 --> 00:17:25,956 Speaker 3: you're going to get an autologous cell therapy. 310 00:17:25,516 --> 00:17:27,676 Speaker 2: And meaning your own cells. 311 00:17:27,796 --> 00:17:32,076 Speaker 3: Yes, yes, exactly, personalized, your own, matched with your DNA. 312 00:17:32,396 --> 00:17:35,516 Speaker 3: And then they'll show up to, or maybe it's the 313 00:17:35,516 --> 00:17:37,556 Speaker 3: same day they get the prescription and maybe they're at 314 00:17:37,556 --> 00:17:42,596 Speaker 3: their doctor's office, but they will have to either provide 315 00:17:43,116 --> 00:17:47,516 Speaker 3: a blood draw or a skin biopsy. I can imagine really 316 00:17:47,556 --> 00:17:49,556 Speaker 3: far into the future. It could be hair cells, it 317 00:17:49,556 --> 00:17:50,476 Speaker 3: could be saliva. 318 00:17:50,676 --> 00:17:53,356 Speaker 2: But for this one, yeah, they take a little bit 319 00:17:53,356 --> 00:17:54,996 Speaker 2: of your blood or a little bit of your skin 320 00:17:55,116 --> 00:17:55,716 Speaker 2: and then. 321 00:17:56,636 --> 00:18:00,076 Speaker 3: And then they say, okay, we'll schedule your surgery and 322 00:18:01,196 --> 00:18:03,036 Speaker 3: come back in a couple months and 323 00:18:02,996 --> 00:18:03,756 Speaker 4: we'll be ready to go. 324 00:18:04,236 --> 00:18:07,276 Speaker 2: Essentially, you're starting with the patient's cells, their skin cells 325 00:18:07,316 --> 00:18:10,916 Speaker 2: or their blood cells. You're ending up with brain cells 326 00:18:10,956 --> 00:18:13,716 Speaker 2: that match their own brain cells.
What part of that 327 00:18:13,756 --> 00:18:16,396 Speaker 2: transition happens inside your machine automatically? 328 00:18:16,716 --> 00:18:23,076 Speaker 3: Yes. So getting to really high quality iPSCs is 329 00:18:23,116 --> 00:18:25,516 Speaker 3: the first product we've established. 330 00:18:25,516 --> 00:18:27,876 Speaker 4: We're establishing end to end, Like. 331 00:18:27,796 --> 00:18:30,116 Speaker 2: You put the, whatever, the blood cell or the skin 332 00:18:30,196 --> 00:18:32,236 Speaker 2: cell in the machine and out the other end comes 333 00:18:32,236 --> 00:18:35,516 Speaker 2: a high quality iPSC, is it really like that? 334 00:18:37,036 --> 00:18:39,996 Speaker 3: It's lots of AI and lots of biology and lots 335 00:18:40,036 --> 00:18:41,476 Speaker 3: of fluidics, but it is kind of like that. 336 00:18:41,676 --> 00:18:44,916 Speaker 2: Yeah, okay, is it true that part is automated? That's really 337 00:18:44,956 --> 00:18:46,116 Speaker 2: what I'm asking, like. 338 00:18:46,836 --> 00:18:49,156 Speaker 3: We're working on it. Yeah, and it's going to 339 00:18:49,236 --> 00:18:53,396 Speaker 3: be automated. There will be human experts in the loop 340 00:18:53,636 --> 00:18:56,716 Speaker 3: as always, and there will be end QC. But the 341 00:18:56,796 --> 00:18:58,956 Speaker 3: day to day operations, I mean, you know right now, 342 00:18:58,996 --> 00:19:03,636 Speaker 3: like at Cellino, things here are running automated and they run 343 00:19:03,796 --> 00:19:06,516 Speaker 3: pretty much twenty four hours a day, so they're like, 344 00:19:07,436 --> 00:19:13,956 Speaker 3: you know, performing imaging, fluidics, laser processing, because the cells 345 00:19:14,076 --> 00:19:17,156 Speaker 3: need different actions at different time points, so a lot 346 00:19:17,196 --> 00:19:20,236 Speaker 3: of those things we've established to be algorithmic. 347 00:19:20,316 --> 00:19:21,436 Speaker 4: Yeah, it is automated.
348 00:19:21,916 --> 00:19:24,836 Speaker 3: You know, we have a very small 349 00:19:24,876 --> 00:19:27,396 Speaker 3: and mighty team of bioengineers, but we definitely do 350 00:19:27,476 --> 00:19:28,116 Speaker 3: the work 351 00:19:27,916 --> 00:19:31,596 Speaker 4: of ten, fifty x more. 352 00:19:31,836 --> 00:19:34,476 Speaker 2: I would say, yeah, through automation, through... 353 00:19:34,316 --> 00:19:35,436 Speaker 4: Automation, that's right. 354 00:19:35,596 --> 00:19:46,196 Speaker 1: Yeah, we'll be back in just a minute. 355 00:19:51,316 --> 00:19:53,756 Speaker 2: One quick note before we get back to the interview. 356 00:19:54,356 --> 00:19:57,636 Speaker 2: Near the end of the conversation, you will hear Nabiha 357 00:19:57,836 --> 00:20:04,276 Speaker 2: mention something called allogeneic therapies. Those are therapies where iPSCs, 358 00:20:04,876 --> 00:20:08,556 Speaker 2: induced pluripotent stem cells, are developed based on generic 359 00:20:08,636 --> 00:20:12,996 Speaker 2: cells rather than based on a patient's own cells. Those 360 00:20:13,076 --> 00:20:15,716 Speaker 2: are easier to cultivate, but in many cases they have 361 00:20:15,836 --> 00:20:20,236 Speaker 2: drawbacks similar to organ transplants, because patients' own immune systems 362 00:20:20,276 --> 00:20:22,716 Speaker 2: tend to reject those cells. So I just wanted to 363 00:20:22,796 --> 00:20:26,396 Speaker 2: clarify that in advance. Okay, back to the interview. How 364 00:20:26,396 --> 00:20:30,516 Speaker 2: are you using machine learning or AI in your automated process? 365 00:20:31,996 --> 00:20:33,996 Speaker 3: The way we use machine learning and AI is we 366 00:20:34,036 --> 00:20:37,516 Speaker 3: take a lot of photos of all the cells every day. 367 00:20:37,996 --> 00:20:38,836 Speaker 4: How are they doing?
368 00:20:38,916 --> 00:20:41,756 Speaker 3: We've trained a bunch of algorithms in Google Cloud that 369 00:20:41,916 --> 00:20:44,236 Speaker 3: tell you, oh, this is a good stem cell, this 370 00:20:44,316 --> 00:20:46,196 Speaker 3: is not a good stem cell, this is good density, 371 00:20:46,236 --> 00:20:50,916 Speaker 3: this is not good density. And then those algorithms feed 372 00:20:50,956 --> 00:20:53,556 Speaker 3: into other algorithms that make decisions on what to do 373 00:20:53,596 --> 00:20:56,836 Speaker 3: with the cells, so that your expert scientists don't have 374 00:20:56,916 --> 00:20:59,276 Speaker 3: to sit and make all these decisions, which are hard 375 00:20:59,316 --> 00:21:01,996 Speaker 3: to make at this massive a scale. 376 00:21:02,196 --> 00:21:05,076 Speaker 3: So they can review what the algorithms are doing, they 377 00:21:05,116 --> 00:21:08,196 Speaker 3: can intervene whenever they want to. I think, like, now 378 00:21:08,236 --> 00:21:11,516 Speaker 3: a very timely, similar field would be self-driving cars. 379 00:21:11,556 --> 00:21:14,756 Speaker 3: You know, there's just a lot of imaging that's being used, 380 00:21:14,796 --> 00:21:17,396 Speaker 3: and then the car can make its own decisions. And 381 00:21:17,716 --> 00:21:19,956 Speaker 3: at least in a Tesla you have a 382 00:21:20,036 --> 00:21:23,476 Speaker 3: driver and they can override any time. Waymos are running autonomous. 383 00:21:24,196 --> 00:21:25,116 Speaker 4: So that's what we do. 384 00:21:25,156 --> 00:21:28,076 Speaker 3: We use imaging to help with all the decision making 385 00:21:28,156 --> 00:21:29,036 Speaker 3: in our manufacturing. 386 00:21:29,156 --> 00:21:32,436 Speaker 2: So it's largely pattern matching, right, which is essentially what 387 00:21:32,876 --> 00:21:35,036 Speaker 2: the expert scientists are doing.
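The loop described here, image classifiers feeding a decision layer that a human expert can override, can be sketched roughly as below. This is a minimal illustration, not Cellino's actual system: the class `WellReading`, the action names, and the thresholds are all invented assumptions for the sketch.

```python
# Hypothetical sketch of a classifier-scores-in, lab-action-out decision layer
# with a human-override hook. All names and thresholds are illustrative.
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class WellReading:
    quality: float   # hypothetical image model's P(good stem cell morphology)
    density: float   # hypothetical estimated confluence, 0..1

def decide(reading: WellReading) -> str:
    """Rule layer sitting on top of the image classifiers (made-up thresholds)."""
    if reading.quality < 0.5:
        return "discard"    # the classifier thinks the colony looks bad
    if reading.density > 0.8:
        return "passage"    # confluent enough to split into new wells
    if reading.density < 0.2:
        return "feed"       # sparse: refresh media and keep waiting
    return "monitor"        # nothing to do yet; image again later

def decide_with_override(
    reading: WellReading,
    expert: Optional[Callable[[WellReading, str], Optional[str]]] = None,
) -> str:
    """Propose an action, but let a human expert replace it (the driver-override idea)."""
    action = decide(reading)
    if expert is not None:
        replacement = expert(reading, action)
        if replacement is not None:
            return replacement
    return action
```

The point of the two-layer shape is the one made in the interview: the per-image classifications and the action rules run unattended around the clock, while the expert only reviews and occasionally overrides.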
As you said, some of 388 00:21:35,076 --> 00:21:37,396 Speaker 2: them talk about, whatever, a smiley face or something; it's just 389 00:21:37,436 --> 00:21:41,116 Speaker 2: because the individuals have seen, whatever, thousands, have done it 390 00:21:41,116 --> 00:21:44,476 Speaker 2: thousands of times. And it seems like a 391 00:21:44,516 --> 00:21:47,076 Speaker 2: good thing to use AI for, right? Like a classic: 392 00:21:47,156 --> 00:21:49,436 Speaker 2: here's lots and lots of good ones and lots 393 00:21:49,476 --> 00:21:51,756 Speaker 2: and lots of bad ones. Now here's a new one. Pick which 394 00:21:51,796 --> 00:21:52,996 Speaker 2: are the good ones and which are the bad ones? 395 00:21:53,036 --> 00:21:55,956 Speaker 4: You're absolutely right. So what's great about it: 396 00:21:56,116 --> 00:21:59,036 Speaker 3: usually if humans can see something by eye, we're able 397 00:21:59,116 --> 00:22:00,636 Speaker 3: to train an algorithm to do that. 398 00:22:00,716 --> 00:22:01,756 Speaker 4: So that's number one. 399 00:22:01,836 --> 00:22:03,756 Speaker 3: And then the second thing that's very exciting about what 400 00:22:03,796 --> 00:22:09,316 Speaker 3: we do is this time series data. This is really 401 00:22:09,356 --> 00:22:12,836 Speaker 3: important because you can really start to draw patterns through 402 00:22:12,876 --> 00:22:16,556 Speaker 3: time and even figure out predictions. You can go back 403 00:22:16,596 --> 00:22:20,556 Speaker 3: in time and predict the future. And our algorithms have 404 00:22:20,676 --> 00:22:22,836 Speaker 3: been able to, because we've fed in a lot of 405 00:22:22,876 --> 00:22:26,916 Speaker 3: time series data, and we've also fed in genetic data 406 00:22:27,116 --> 00:22:28,836 Speaker 3: of how these cells look at the end. 407 00:22:28,876 --> 00:22:31,756 Speaker 4: These algorithms can go
408 00:22:31,756 --> 00:22:34,876 Speaker 3: back into the early stages of the process and say, actually, 409 00:22:34,916 --> 00:22:37,996 Speaker 3: this cell or this colony is going 410 00:22:38,036 --> 00:22:41,356 Speaker 3: to be bad, so you might want to eliminate that. 411 00:22:41,796 --> 00:22:43,836 Speaker 2: So, just to be clear, time series data is, like, 412 00:22:44,036 --> 00:22:46,516 Speaker 2: we can think of it like a time-lapse video 413 00:22:46,676 --> 00:22:50,236 Speaker 2: of the life of a cell colony, and so then 414 00:22:50,276 --> 00:22:53,276 Speaker 2: you learn patterns of, like, if it is evolving in 415 00:22:53,276 --> 00:22:55,116 Speaker 2: a certain way, or developing, I should say, in a 416 00:22:55,116 --> 00:22:56,836 Speaker 2: certain way, it's going to be good, or if it's 417 00:22:56,836 --> 00:22:58,476 Speaker 2: developing in this other way, it's going to be bad, 418 00:22:58,516 --> 00:23:00,436 Speaker 2: based on the past. Is that what you mean? 419 00:23:00,796 --> 00:23:03,356 Speaker 4: That's exactly right. You just made me think of, like, Netflix. 420 00:23:03,436 --> 00:23:06,076 Speaker 3: I don't know why, but you know, like, different movies, 421 00:23:06,116 --> 00:23:07,556 Speaker 3: like, I know what I'm going to get when I'm 422 00:23:07,596 --> 00:23:09,556 Speaker 3: watching a rom com or I want to watch a 423 00:23:09,636 --> 00:23:11,876 Speaker 3: murder mystery. So you have the... but you know, not 424 00:23:12,036 --> 00:23:14,516 Speaker 3: every rom com is the same, but I know how 425 00:23:14,556 --> 00:23:15,956 Speaker 3: the story should flow. 426 00:23:15,756 --> 00:23:17,196 Speaker 4: And I'm making my choices.
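The "go back in time and predict the future" idea, learn from full time-lapse trajectories, then flag a colony as good or bad from only its earliest frames, can be sketched like this. Everything in the sketch (the health score, the slopes, the from-scratch logistic regression) is invented for illustration under stated assumptions; it is not Cellino's model.

```python
# Illustrative sketch: early prediction of colony outcome from time-series data.
# Synthetic trajectories and a minimal logistic regression, no ML library.
import numpy as np

rng = np.random.default_rng(0)

def simulate_colony(good: bool, days: int = 8) -> np.ndarray:
    """Hypothetical daily 'health score' for one colony: good colonies trend
    upward, bad ones stall. Numbers are made up for the sketch."""
    slope = 0.12 if good else 0.01
    return 0.3 + slope * np.arange(days) + rng.normal(0.0, 0.03, days)

# A labelled set of full time-lapse trajectories (the historical data).
labels = rng.integers(0, 2, 200).astype(bool)
trajectories = np.stack([simulate_colony(g) for g in labels])

# "Go back in time": predict the final outcome from only the first 3 days.
X = trajectories[:, :3]
y = labels.astype(float)

# Logistic regression trained by plain gradient descent.
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted P(good colony)
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * np.mean(p - y)

predictions = (X @ w + b) > 0.0              # classify from early frames only
accuracy = float(np.mean(predictions == labels))
```

On this well-separated synthetic data the early-frame classifier recovers the final labels almost perfectly, which is the efficiency win described in the interview: a colony predicted to turn out bad can be eliminated long before the end of the process.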
427 00:23:17,276 --> 00:23:19,756 Speaker 3: Yes, so it can start to make those kinds of 428 00:23:20,596 --> 00:23:23,756 Speaker 3: looks into the crystal ball, and it just increases the 429 00:23:23,796 --> 00:23:27,756 Speaker 3: efficiency. We still do the end processing the same, 430 00:23:27,796 --> 00:23:29,596 Speaker 3: and the more data we feed in, the better 431 00:23:29,636 --> 00:23:30,036 Speaker 3: they get. 432 00:23:30,196 --> 00:23:32,436 Speaker 4: So it's exciting. Data is important. 433 00:23:32,516 --> 00:23:34,796 Speaker 3: Data is important for everything in AI right now, and 434 00:23:34,836 --> 00:23:37,076 Speaker 3: it's no different for us. And I should add, the 435 00:23:37,196 --> 00:23:39,476 Speaker 3: data that we generate, and a lot of our friends are 436 00:23:39,476 --> 00:23:42,716 Speaker 3: generating in biotechnology companies, 437 00:23:42,756 --> 00:23:45,156 Speaker 4: it's very expensive data. 438 00:23:44,556 --> 00:23:46,876 Speaker 3: So we do a lot of hacking to figure out 439 00:23:47,036 --> 00:23:50,316 Speaker 3: what is, like, the optimum data set that we can 440 00:23:50,356 --> 00:23:54,156 Speaker 3: take, that's cost effective and, like, time-gated, and, like, 441 00:23:54,196 --> 00:23:57,076 Speaker 3: we're not losing any resources. We don't have access to 442 00:23:57,116 --> 00:23:58,236 Speaker 3: infinite amounts of data. 443 00:23:59,036 --> 00:24:03,476 Speaker 2: So okay. So that's where you are as a company. 444 00:24:03,636 --> 00:24:09,476 Speaker 2: You talked a little bit before about iPSC therapies more generally, right, 445 00:24:09,836 --> 00:24:11,636 Speaker 2: but let's return to that now for a second. So, like, 446 00:24:11,916 --> 00:24:15,476 Speaker 2: have any iPSC therapies been approved by the FDA? 447 00:24:16,676 --> 00:24:17,196 Speaker 4: Not yet.
448 00:24:17,516 --> 00:24:21,676 Speaker 3: Okay. The first approvals, which are going to be conditional approvals, 449 00:24:22,476 --> 00:24:24,796 Speaker 3: are going to be in Japan. And you know, 450 00:24:24,876 --> 00:24:29,436 Speaker 3: I think Japan is very passionate about iPSCs, given 451 00:24:29,556 --> 00:24:32,236 Speaker 3: that they won the Nobel Prize, so they've been 452 00:24:32,276 --> 00:24:36,316 Speaker 3: working hard at it while other nations and countries 453 00:24:35,876 --> 00:24:38,796 Speaker 4: maybe slowed down, hesitated a little bit, you know, 454 00:24:38,996 --> 00:24:40,516 Speaker 4: just being, like, you know, not sure. 455 00:24:40,436 --> 00:24:42,516 Speaker 2: What are those therapies going to be? 456 00:24:43,396 --> 00:24:46,676 Speaker 3: Yeah, Parkinson's, there's heart disease. Those are the two that 457 00:24:46,716 --> 00:24:50,116 Speaker 3: I think will be up first. And then there are just 458 00:24:50,276 --> 00:24:55,796 Speaker 3: incredible trials running all over the world around vision loss. 459 00:24:56,676 --> 00:24:59,836 Speaker 3: I mentioned spinal cord injury, diabetes. Yeah, there are some really interesting 460 00:24:59,876 --> 00:25:05,196 Speaker 3: programs out of China, where it was an autologous, patient- 461 00:25:05,316 --> 00:25:11,236 Speaker 3: derived pancreatic cell transplant that they carried through, which was 462 00:25:11,356 --> 00:25:12,196 Speaker 3: quite incredible. 463 00:25:12,276 --> 00:25:12,436 Speaker 4: Yeah. 464 00:25:12,516 --> 00:25:15,996 Speaker 3: So I do think the volume picks up and sort 465 00:25:16,036 --> 00:25:19,556 Speaker 3: of creates even greater urgency to start putting all the 466 00:25:19,596 --> 00:25:22,036 Speaker 3: pieces together and getting to scale.
467 00:25:22,116 --> 00:25:27,836 Speaker 2: Urgency, because once people start figuring out therapies that work, 468 00:25:27,916 --> 00:25:30,756 Speaker 2: there will be a need to actually make the cells, 469 00:25:31,036 --> 00:25:32,116 Speaker 2: which is where you come in. 470 00:25:32,476 --> 00:25:35,596 Speaker 3: Yes, make the cells, scale them, and give therapy developers 471 00:25:35,636 --> 00:25:39,196 Speaker 3: options, because right now a lot of them are budget limited, 472 00:25:39,436 --> 00:25:43,236 Speaker 3: resource limited, and can only dose one patient every two years. 473 00:25:43,676 --> 00:25:47,916 Speaker 2: Wow, just because it's so expensive to essentially make the cells. 474 00:25:47,756 --> 00:25:51,876 Speaker 3: And a high failure rate, it's not a high yield rate, 475 00:25:51,956 --> 00:25:55,476 Speaker 3: and they're understaffed, there have been budget cuts, there are 476 00:25:55,516 --> 00:25:59,676 Speaker 3: just a lot of problems. So it would be great 477 00:25:59,716 --> 00:26:02,836 Speaker 3: to have even more trials running. You know, I think 478 00:26:02,916 --> 00:26:05,636 Speaker 3: until Phase three trials happen, it's really hard to know 479 00:26:06,796 --> 00:26:08,676 Speaker 3: how the trial's going to go. We just don't have 480 00:26:08,796 --> 00:26:11,556 Speaker 3: enough volume right now to have enough shots on goal. 481 00:26:11,476 --> 00:26:14,356 Speaker 2: Well, and to run a Phase three trial, you kind 482 00:26:14,356 --> 00:26:16,356 Speaker 2: of need the automation, right? I mean, Phase three trials 483 00:26:16,356 --> 00:26:18,316 Speaker 2: tend to be quite large, a lot of patients, and 484 00:26:18,356 --> 00:26:20,556 Speaker 2: if you have to have scientists making cells by hand, 485 00:26:20,596 --> 00:26:22,596 Speaker 2: it's going to be hard to run a Phase three trial, right?
486 00:26:23,116 --> 00:26:25,236 Speaker 3: That's exactly right, which is why we haven't seen any 487 00:26:25,236 --> 00:26:28,436 Speaker 3: autologous programs get that far yet. But the allo 488 00:26:28,516 --> 00:26:31,476 Speaker 3: ones are getting there, which is exciting because it builds 489 00:26:31,476 --> 00:26:33,596 Speaker 3: evidence for the mechanism of action. 490 00:26:33,916 --> 00:26:37,236 Speaker 2: So it's easier to make sort of generic cells at 491 00:26:37,276 --> 00:26:39,996 Speaker 2: scale as opposed to sort of bespoke cells for each patient. 492 00:26:40,116 --> 00:26:43,756 Speaker 2: That's right, that's why the allo... So, okay, if things 493 00:26:43,836 --> 00:26:48,956 Speaker 2: go well, what, for you and the field, I guess 494 00:26:48,996 --> 00:26:51,316 Speaker 2: you need both, right? People need to find 495 00:26:51,356 --> 00:26:53,316 Speaker 2: therapies that work, and you need to be able to 496 00:26:53,636 --> 00:26:56,796 Speaker 2: sort of implement those therapies at scale. Like, what will 497 00:26:56,796 --> 00:26:59,636 Speaker 2: the world look like in, what is the right amount 498 00:26:59,676 --> 00:27:02,556 Speaker 2: of time to say, ten years? Is five enough? Will 499 00:27:02,556 --> 00:27:04,196 Speaker 2: the world look different in five years? 500 00:27:04,636 --> 00:27:07,236 Speaker 3: In five years, I do think the world will 501 00:27:07,276 --> 00:27:11,196 Speaker 3: look quite different for Parkinson's patients. And that's incredible, because 502 00:27:11,196 --> 00:27:15,076 Speaker 3: it is a pretty horrible disease that leads to lack 503 00:27:15,116 --> 00:27:20,596 Speaker 3: of independence. It's just sad to see what patients 504 00:27:20,596 --> 00:27:22,556 Speaker 3: have to go through, and we have lots of family members 505 00:27:22,596 --> 00:27:23,316 Speaker 3: on our team who 506 00:27:24,756 --> 00:27:25,876 Speaker 4: have Parkinson's.
507 00:27:26,636 --> 00:27:28,276 Speaker 3: So yeah, you're going to go to your doctor and 508 00:27:28,316 --> 00:27:31,716 Speaker 3: say, I want this therapy, and that will be 509 00:27:31,796 --> 00:27:32,196 Speaker 3: an option. 510 00:27:32,316 --> 00:27:34,756 Speaker 4: And I think we will see more of those. In 511 00:27:34,876 --> 00:27:35,476 Speaker 4: ten years, 512 00:27:36,916 --> 00:27:39,756 Speaker 3: I think we'll see at least five more diseases where 513 00:27:39,796 --> 00:27:44,476 Speaker 3: there is one allogeneic therapy that's available, and a lot 514 00:27:44,556 --> 00:27:48,836 Speaker 3: of the autologous trials are getting into Phase three in 515 00:27:48,876 --> 00:27:52,436 Speaker 3: a scalable way. And what I'm hopeful for in 516 00:27:52,436 --> 00:27:56,596 Speaker 3: the next five years: my mom just turned sixty, and 517 00:27:56,636 --> 00:27:59,316 Speaker 3: in the next five or ten years, I don't want 518 00:27:59,356 --> 00:28:01,716 Speaker 3: to have to worry about her diabetes all the time, 519 00:28:02,996 --> 00:28:05,436 Speaker 3: and I would love to have the option of having 520 00:28:05,796 --> 00:28:09,716 Speaker 3: her own cell replacement to manage her diabetes. 521 00:28:09,956 --> 00:28:12,156 Speaker 4: That would be incredible, and 522 00:28:12,476 --> 00:28:15,356 Speaker 3: I think that that is happening, and I think the 523 00:28:15,476 --> 00:28:19,236 Speaker 3: questions are around how scalable will it be, and where will 524 00:28:19,276 --> 00:28:20,516 Speaker 3: we get it, and who's 525 00:28:20,276 --> 00:28:20,676 Speaker 4: going to make it? 526 00:28:20,716 --> 00:28:23,396 Speaker 3: I mean, I think people are going to figure this out. 527 00:28:24,196 --> 00:28:27,516 Speaker 3: Even this year... this year was not the happiest for biotech 528 00:28:27,556 --> 00:28:32,396 Speaker 3: and biopharma. The markets were down.
There's a lot of 529 00:28:34,156 --> 00:28:39,916 Speaker 3: concern around scale and investor returns. But I still find 530 00:28:39,916 --> 00:28:45,756 Speaker 3: it incredible that gene therapy, cell therapy, regenerative medicine companies 531 00:28:45,756 --> 00:28:49,556 Speaker 3: and scientists are still chugging away. We're still getting into 532 00:28:49,596 --> 00:28:52,916 Speaker 3: trials and brute-forcing it, because everybody is so passionate. 533 00:28:53,556 --> 00:28:56,476 Speaker 3: I think the passion really comes from creating a paradigm 534 00:28:56,476 --> 00:29:00,676 Speaker 3: shift from treating symptoms, or accepting the status quo of 535 00:29:00,716 --> 00:29:05,516 Speaker 3: a disease trajectory, to, wow, can we reverse disease? Can 536 00:29:05,556 --> 00:29:09,076 Speaker 3: we get to curative medicines? And that collective willpower 537 00:29:09,796 --> 00:29:12,236 Speaker 3: is incredible. And I think a lot of us have 538 00:29:12,396 --> 00:29:17,116 Speaker 3: experiences around aging and loss in our families, so it's 539 00:29:17,196 --> 00:29:21,076 Speaker 3: driving this movement. And I think in the next couple 540 00:29:21,116 --> 00:29:23,996 Speaker 3: of decades, we'll have lots of humans on Earth who are 541 00:29:23,996 --> 00:29:26,276 Speaker 3: going to be above sixty-five and getting into eighty. 542 00:29:27,196 --> 00:29:31,796 Speaker 3: So this does become an economic concern as well. So 543 00:29:31,916 --> 00:29:35,436 Speaker 3: how do we keep everybody healthier for longer, using 544 00:29:35,876 --> 00:29:41,516 Speaker 3: everybody's own regeneration, their own cell, tissue, and even organ replacements? 545 00:29:41,796 --> 00:29:43,956 Speaker 4: I'm very up. I just tend to always be optimistic. 546 00:29:43,996 --> 00:29:45,996 Speaker 3: That's what gets me up every day to work on 547 00:29:46,036 --> 00:29:50,116 Speaker 3: these hard problems.
So I'm quite excited, and I think 548 00:29:50,156 --> 00:29:53,476 Speaker 3: what I'm encouraging myself, my team, and my friends: like, 549 00:29:53,596 --> 00:29:56,956 Speaker 3: let's keep the optimism high. Let's problem-solve together. 550 00:29:56,996 --> 00:30:00,916 Speaker 4: We don't have to do this alone. 551 00:30:02,236 --> 00:30:04,836 Speaker 2: We'll be back in a minute with the lightning round. 552 00:30:15,396 --> 00:30:18,716 Speaker 2: I want to finish now with a lightning round. Okay, 553 00:30:20,076 --> 00:30:22,196 Speaker 2: were you aware when you chose the name of your 554 00:30:22,236 --> 00:30:25,996 Speaker 2: company that there is a personal injury lawyer who advertises 555 00:30:26,036 --> 00:30:29,636 Speaker 2: a lot whose name is Cellino? 556 00:30:29,996 --> 00:30:32,676 Speaker 4: Yes-ish, not really. So, yeah. 557 00:30:32,716 --> 00:30:36,636 Speaker 3: I think when I named the company, it wasn't clear 558 00:30:36,756 --> 00:30:39,116 Speaker 3: that anybody was going to care about us, honestly. Like, 559 00:30:39,156 --> 00:30:42,996 Speaker 3: we didn't have a website, we had just incorporated. We had 560 00:30:42,996 --> 00:30:47,116 Speaker 3: a terrible logo that I made in PowerPoint. So it 561 00:30:47,196 --> 00:30:50,316 Speaker 3: became more obvious a lot later. But I knew it 562 00:30:50,356 --> 00:30:52,596 Speaker 3: was an Italian last name, let's put it that way. 563 00:30:52,796 --> 00:30:54,556 Speaker 3: So I did know then, and I think Cellino & 564 00:30:54,636 --> 00:30:59,516 Speaker 3: Barnes, that person has an Italian last name too, Cellino. 565 00:31:00,836 --> 00:31:02,316 Speaker 2: Did anybody ever call your company? 566 00:31:03,036 --> 00:31:05,116 Speaker 3: It did, like, happen, like, maybe two percent of the time.
567 00:31:05,236 --> 00:31:07,076 Speaker 3: But the way I got to the name was cell 568 00:31:07,276 --> 00:31:10,836 Speaker 3: and innovation, Cellino, and it is also a star in 569 00:31:10,876 --> 00:31:13,476 Speaker 3: the Pleiades star cluster, Celaeno, so it's 570 00:31:13,356 --> 00:31:15,836 Speaker 4: a nod to my astronomy love. 571 00:31:16,956 --> 00:31:20,316 Speaker 2: You called your mother's approval of your curry a proud moment, 572 00:31:20,996 --> 00:31:24,316 Speaker 2: and so I'm curious, like, what is the secret to 573 00:31:24,436 --> 00:31:27,276 Speaker 2: making a curry your mother approves of? 574 00:31:26,876 --> 00:31:32,356 Speaker 3: Being very focused on the taste and the smell. Uh, 575 00:31:32,956 --> 00:31:33,876 Speaker 3: it's always experience. 576 00:31:34,156 --> 00:31:37,316 Speaker 2: That's not a secret. Of course it should taste good 577 00:31:37,316 --> 00:31:38,036 Speaker 2: and smell good. 578 00:31:38,316 --> 00:31:41,476 Speaker 3: It's not a secret, you know. It's interesting. Yeah, my 579 00:31:41,516 --> 00:31:44,596 Speaker 3: mom is an excellent chef. She's excellent at many things, 580 00:31:45,356 --> 00:31:47,836 Speaker 3: and her curry just tastes a certain way. So I'm 581 00:31:47,836 --> 00:31:50,116 Speaker 3: always trying to get as close as I can, and 582 00:31:51,196 --> 00:31:54,156 Speaker 3: when it matches that... she makes the best Bengali food 583 00:31:54,396 --> 00:31:56,956 Speaker 3: in the world, so, like, that's my measure. 584 00:31:57,156 --> 00:31:59,476 Speaker 2: I know in my wife's family, there's a tradition where 585 00:31:59,476 --> 00:32:02,156 Speaker 2: if someone asks you for the secret to your recipe, 586 00:32:02,516 --> 00:32:05,196 Speaker 2: you mislead them, you don't actually tell them the secret. 587 00:32:05,276 --> 00:32:07,276 Speaker 2: And I feel like that's what's happening here.
I feel 588 00:32:07,276 --> 00:32:09,396 Speaker 2: like I haven't gotten any information out of you on 589 00:32:09,636 --> 00:32:10,916 Speaker 2: making a great curry. 590 00:32:11,036 --> 00:32:14,836 Speaker 3: Okay, so let me maybe give just general notes about Bangladeshi curries. 591 00:32:14,836 --> 00:32:17,756 Speaker 3: Like, it's a lot of turmeric, garlic, and ginger paste. 592 00:32:18,516 --> 00:32:23,196 Speaker 3: We like whole spices, like cardamom and bay leaves and things, 593 00:32:23,276 --> 00:32:23,756 Speaker 3: but it's 594 00:32:23,636 --> 00:32:26,436 Speaker 4: turmeric, cumin, chili. 595 00:32:26,676 --> 00:32:30,716 Speaker 3: It's not very complicated, but the ratio is different than in 596 00:32:30,796 --> 00:32:34,996 Speaker 3: other regions of South Asia. So turn it up on 597 00:32:35,076 --> 00:32:37,196 Speaker 3: the turmeric and then you'll get to Bangladesh. 598 00:32:37,556 --> 00:32:44,556 Speaker 2: Good. Are lasers overrated or underrated? 599 00:32:44,916 --> 00:32:45,476 Speaker 4: Underrated. 600 00:32:46,836 --> 00:32:52,596 Speaker 2: What is one underrated thing about lasers? 601 00:32:53,796 --> 00:32:58,756 Speaker 3: They are so incredibly precise, and thanks to optics, light- 602 00:32:58,796 --> 00:33:02,836 Speaker 3: based manufacturing, we have, like, all semiconductors. That's why 603 00:33:02,876 --> 00:33:05,396 Speaker 3: you and I get to have this conversation online, 604 00:33:06,316 --> 00:33:08,076 Speaker 3: through a laptop and the internet. 605 00:33:08,796 --> 00:33:08,996 Speaker 4: Yeah. 606 00:33:09,116 --> 00:33:13,036 Speaker 3: So the precision and the scale that they've delivered, it's incredible. 607 00:33:13,316 --> 00:33:15,476 Speaker 3: Light is incredible. I mean, I just find 608 00:33:15,516 --> 00:33:18,116 Speaker 3: it remarkable that it can be a wave and a particle 609 00:33:18,116 --> 00:33:20,396 Speaker 3: at the same time. Actually, my dog is named Photon.
610 00:33:21,036 --> 00:33:26,676 Speaker 2: Oh great, I like that. Is your dog full of energy? 611 00:33:26,996 --> 00:33:29,596 Speaker 4: Incredibly. And she is a particle and a wave at once. 612 00:33:30,316 --> 00:33:34,116 Speaker 2: We all are, I guess. If I understand correctly, she's 613 00:33:33,956 --> 00:33:34,676 Speaker 4: very high energy. 614 00:33:34,836 --> 00:33:36,916 Speaker 3: Yeah. My husband and I are both physicists, 615 00:33:36,956 --> 00:33:39,876 Speaker 3: so we went with Photon, because we both used a 616 00:33:39,876 --> 00:33:42,356 Speaker 3: lot of lasers in grad school. And then our daughter 617 00:33:42,956 --> 00:33:46,116 Speaker 3: is Chiara, and that also means light in Italian. 618 00:33:46,156 --> 00:33:56,796 Speaker 2: Oh, I love that. Nabiha Saklayen is the co-founder 619 00:33:56,836 --> 00:34:00,916 Speaker 2: and CEO of Cellino. Please email us at problem at 620 00:34:00,956 --> 00:34:04,196 Speaker 2: pushkin dot fm. We are always looking for new guests 621 00:34:04,316 --> 00:34:08,396 Speaker 2: for the show. Today's show was produced by Trinamnino and 622 00:34:08,436 --> 00:34:12,356 Speaker 2: Gabriel Hunter Chang. It was edited by Alexander Garriton and 623 00:34:12,516 --> 00:34:15,676 Speaker 2: engineered by Sarah Bruguer. I'm Jacob Goldstein, and we'll be 624 00:34:15,716 --> 00:34:31,316 Speaker 2: back next week with another episode of What's Your Problem.