Jacob Goldstein: Pushkin. Over the past few decades, it's become more and more expensive to develop new drugs. It now costs over a billion dollars on average to bring a new drug to market in the United States, and of course drug companies pass those high development costs on to us in the form of higher drug prices. This has been going on for so long that we have sort of gotten used to it. But when you zoom out, it's strange, because, as I've said before on this show, and as I will say again on this show, one of the main things technology does is make things more efficient and therefore cheaper. Over the past few centuries, we've seen technologies make all kinds of things cheaper, everything from clothes to food to TVs. So why hasn't new technology made drugs cheaper?

I'm Jacob Goldstein, and this is What's Your Problem, the show where I talk to people who are trying to make technological progress. My guest today is Alice Zhang, co-founder and CEO of Verge Genomics. Alice's problem is this: how do you use artificial intelligence to drive down the price of discovering and developing new drugs?

Jacob Goldstein: Why is it getting more expensive to develop drugs, despite the fact that we have better technology to do it?

Alice Zhang: Yeah, absolutely. One of the reasons is, you know, even though a lot of the new technologies we've developed have made us better at testing more drugs faster, the fundamental problem is that even if we can get a drug all the way to clinical trials, which is the last step of drug development, ninety percent of those drugs still fail. So if you think about it, we're spending millions on each drug, and ninety percent of those drugs are failing at the last and most expensive stage of drug development. And so really, most of that billion-plus-dollar figure you hear is due to the cost of failure.

Jacob Goldstein: Just to be clear, that figure, more than a billion dollars: you've got to include the cost of all the drugs that don't work.

Alice Zhang: Exactly.

Jacob Goldstein: In the cost of the ones that do, right?

Alice Zhang: Exactly.
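To make that arithmetic concrete: if every clinical-stage candidate costs roughly the same amount to test and only a fraction succeed, each approved drug has to carry its own cost plus a share of all the failures. A minimal sketch in Python, assuming a hypothetical $100 million per candidate (an illustrative round number, not a figure from the episode):

```python
# Back-of-the-envelope: approved drugs carry the cost of the failures.
# COST_PER_CANDIDATE is hypothetical, chosen only for illustration.
COST_PER_CANDIDATE = 100e6  # $100M to take one candidate through trials

def cost_per_approved_drug(failure_rate: float) -> float:
    """Expected development cost carried by each approved drug."""
    return COST_PER_CANDIDATE / (1.0 - failure_rate)

for rate in (0.90, 0.70, 0.50):
    millions = cost_per_approved_drug(rate) / 1e6
    print(f"failure rate {rate:.0%}: ~${millions:,.0f}M per approved drug")
```

At a ninety percent failure rate, each approval carries ten candidates' worth of cost; at seventy percent, a bit over three. That is the logic of the exchange that follows.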
Jacob Goldstein: So the ones that do work have to pay for all the ones that fail. That's the fundamental problem.

Alice Zhang: Exactly.

Jacob Goldstein: And you're setting out to fix that, if you can.

Alice Zhang: Absolutely. We think there's an opportunity for AI to fundamentally shift the failure rate, and the most impactful place to do that is the failure in clinical trials. So can we predict, before we go into these expensive clinical trials, the genes or targets or drugs that are more likely to work in humans? Because even a ten percent decrease in that failure rate could have massive... I saw a number of up to fifteen billion dollars annually in industry cost savings.

Jacob Goldstein: You could still be in a universe where most of the drugs that go into clinical trials fail, but instead of ninety percent of them failing, seventy percent of them fail, and that would be a huge win. That would be a huge efficiency gain. It would save a ton of money.

Alice Zhang: Absolutely. And I think that's something that's underappreciated about AI, and really any technology: oftentimes people have this expectation that the technology is going to absolutely transform a field overnight. What people don't appreciate is that most of the time, that doesn't happen. It's always step by step, incremental. But even a ten percent change would have billions of dollars of cost savings and would be a huge win for patients and the industry worldwide.

Jacob Goldstein: I like that frame, actually. I like that frame of: maybe AI can have drugs fail most of the time, but not as much of the time as they fail now. It seems very credible, very plausible. Would you put it that way?

Alice Zhang: Yeah. All life is nothing but a learning process.

Jacob Goldstein: Yes, getting less bad at everything. So I know you were studying to be a doctor and a researcher not that long ago, a few years ago, before you started your company. Tell me how you went from an MD-PhD program to starting the company.
Alice Zhang: Well, my PhD research was actually in using genomic analysis and computational biology to analyze large-scale data sets and find new drugs that could improve drug development. And we found that the very first drug predicted by our algorithms, when we put it in mice after they'd been injured, helped them walk and recover from that nerve injury about four times faster than the leading standard. And that was just the first drug that was predicted. And I looked at this technology, this approach, and I thought, wow, there's so much promise here. Am I really going to just publish this and let it sit on a bookshelf somewhere? If I'm not going to be the one to really develop this for patients, who will? And when I looked out at the field, I did not see a ton of biotech or pharma companies that were truly computationally driven. Usually within pharma companies they might bring in computational biologists to support their scientists or their biologists, but there wasn't really a genomics-driven, computationally driven company at that time. Now there are many, but at the time there were very few. And so, you know, it wasn't a binary decision. People always ask me, how did you make the courageous decision to leap? It wasn't really like that. I think what we did first is that we just took a three-month leave of absence. We joined a program, an incubator called Y Combinator.

Jacob Goldstein: "We" as in you and...?

Alice Zhang: Me and my co-founder, Jason. And the first question really was, you know, can we even generate some data that validates that computational biology can predict targets that work? And then when we saw some data, the next question was, can we even hire people that want to come on? And the next question was, can we even raise money from people that will care?
And I think that is such an important lesson, because people oftentimes get caught up in just the destination. You know, is this where I want to be? Is this the career I want to have? And so they don't take the first step. And really, it's the first step that's needed to actually get the data to even decide if it's the appropriate track for you.

Jacob Goldstein: And did you really just keep thinking, well, this might not work, but let's do the next thing? Were you in a place where you could have gone back to the MD-PhD program for a while?

Alice Zhang: Yeah. I took a continuous leave of absence for probably over five years, probably more than I should have, until the point where a lot of my friends were like, are you really going to go back? And finally the medical school was like, you're not really going to come back, let's just terminate your leave of absence. But in the first few years it was a really important safety net for me. It gave me the psychological safety to really take a risk and pursue a new idea in a way that I don't know if I would have otherwise. And I think that's so important for universities to provide: to recognize there can be more than one track for people to do really excellent science and make an impact, more than just becoming a professor. And sometimes that psychological safety is what's needed to help people find their ultimate calling, too.

Jacob Goldstein: By the way, what's a very brief definition of computational biology?

Alice Zhang: It's really, at the end of the day, in my view, just the use of computers and data sets to understand biology better.

Jacob Goldstein: By the way, what happened to that molecule that you were testing in mice in grad school? That seemed useful.

Alice Zhang: I don't know. It's a good question, actually. I think the project was taken on by someone else, but I'm not actually completely sure.
Jacob Goldstein: Okay. So you leave grad school, you start a company. You now have a bunch of molecules that you're working on that seem promising, but there's one that is in clinical trials now, right? To treat ALS, Lou Gehrig's disease, a terrible disease that is very poorly treated. And I thought that we could talk about the story of that molecule, of that drug, as a way to understand the way your company works. Can you just sort of take me through the life of that drug so far?

Alice Zhang: Yeah, absolutely. I'll start off just by talking about ALS and why it's been so hard to discover the right therapy, and then how we did that differently. So, as you might know, ALS, Lou Gehrig's disease, is a really horrible disease. What happens is that these neurons called motor neurons start dying, and most patients experience paralysis and then death, usually within three to five years of diagnosis. It's a very fast-progressing disease, and there really aren't any meaningfully effective treatments that slow or stop the disease today. So: a horrible disease with a horrible prognosis and no available treatments. And why it's been so hard to discover really effective treatments, I think, is really just the complexity of the disease, and really any disease of the brain; the brain is the most complex organ in the body. So you end up having a lot of drugs brought into clinical trials that worked in mice. I always like to say we've cured ALS, and many other diseases, in mice a thousand times, but none of those cures have really worked in humans. So what we did differently was, we started from day one by collecting data from over a thousand ALS patients as well as controls. And specifically, we collected samples of brain tissue as well as spinal cord from patients who had passed away from ALS.

Jacob Goldstein: So you got samples from a thousand patients who had died of ALS. How did you do that?
Alice Zhang: So what we've done over the last seven years is we've signed partnerships with over twenty-one different brain banks, hospitals, labs, and academic centers worldwide that collect these brain tissues. They're usually donated by patients who have passed away from the disease and whose families want to contribute to research.

Jacob Goldstein: So step one, basically, is get tissue samples from real patients. And you said controls as well, right? So tissue samples from healthy people too, so that you can use them as a basis of comparison. You have the samples. Now, what's step two?

Alice Zhang: So step two is that we put an enormous amount of effort into quality-controlling these. That's a big, underappreciated step; they can be very noisy samples. And then step three is that we sequence them. So we profile the expression of all twenty thousand genes in the genome, and we also sometimes do DNA sequencing: we look at genetic mutations. We also have clinical information about that patient. How long did they have the disease? When did they die? And that makes for a very rich, multidimensional data set, and that gives us essentially a global snapshot of what happened in that patient.

Jacob Goldstein: Okay. And presumably the sequencing that you're doing on the patients' tissue samples, you're doing the same sequencing on the controls, the samples from healthy people. So now you have this very large data set. What's the next step?

Alice Zhang: So then you have this snapshot of what happened, and the tricky part is to figure out what caused it. I often liken it to a plane crash, right? You're looking through the rubble and you want to figure out how the plane crashed, and how that information can be used to prevent further planes from crashing.
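In computational terms, the snapshot Alice describes is an expression matrix: one row per tissue sample, one column per gene, with patient-or-control labels and clinical covariates attached. Here is a toy sketch of the simplest question you can ask such a matrix, namely which genes look different in patients. The data is synthetic and the per-gene t-test is a deliberately naive stand-in; this is not Verge's actual pipeline.

```python
import numpy as np
from scipy import stats

# Synthetic stand-in for a real expression matrix:
# rows are tissue samples, columns are the ~20,000 genes.
rng = np.random.default_rng(0)
n_patients, n_controls, n_genes = 60, 40, 20_000
patients = rng.normal(size=(n_patients, n_genes))
controls = rng.normal(size=(n_controls, n_genes))

# The simplest "what changed?" question: a per-gene t-test,
# patients versus controls, across all genes at once.
t_stat, p_val = stats.ttest_ind(patients, controls, axis=0)

# Rank genes by strength of evidence for differential expression.
ranked_genes = np.argsort(p_val)
print("strongest candidate genes (column indices):", ranked_genes[:10])
```

A real analysis would correct for multiple testing and confounders. And, as Alice says next, a list of genes that changed still doesn't tell you which changes caused the crash.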
Alice Zhang: So that's when our software engineers and data scientists, as well as machine learning scientists, come in. We have algorithms, essentially, to integrate multiple data types, all the way from the RNA, so how the genes were expressed, to genetic mutations, to essentially create a map of disease biology. And within the map are networks of genes, all interconnected, that we believe cause disease. And so I like to think about it like this: when you're looking through the rubble of a plane crash, you want to find the black box, which will help you figure out the cause of the crash. And by having all the information, we essentially locate the black boxes of disease, the targets that are really at the center of those networks, and then we design drugs against those targets that we believe can reverse disease.

Jacob Goldstein: It seems like differentiating between correlation and causality in this particular setting would be really hard, right? Like, to use the plane metaphor, if you had a bunch of planes that crashed and a bunch that hadn't crashed, you might say, oh, the wings were off all the ones that crashed, and that's why they crashed. But actually the wings came off because they crashed, and it was something else that caused the crash. I feel like that would be an obvious problem. That might be hard to solve.

Alice Zhang: Absolutely, you hit the nail on the head, and actually the plane metaphor is a really great one here. One of the biggest challenges with looking at tissue from a patient who has already died is that you're getting the crash, right? You're not seeing video of before the crash. You're really getting the crash. And the challenge is, how do you figure out what caused the crash versus what was just the effect of the crash, like a burned wing? And one of the ways we do that is we combine different data types.
So we found that looking at one type of data, for example just RNA data, isn't particularly helpful. It's looking at where you get convergent signal that pulls through multiple types of data that starts revealing more compelling signal. So, as an example, we look at genetic data as well. Genetic data is useful for separating cause from effect because it contains information about genetic mutations that you were born with that then lead to increased risk for a disease later in life. That's kind of nature's human experiment for cause and effect. And when we layer that information on with the RNA data, it gives us information about how the genetic drivers are acting in these functional pathways, which is a big issue with just looking at genetic data on its own. I wish I had a way to actually string that through to the plane metaphor.

Jacob Goldstein: There's a time for leaving metaphors behind. Your company uses AI in drug discovery. I appreciate, in a certain way, that you haven't said AI yet, but also I don't want to not talk about it. In this sort of figuring-out-what's-going-on step, is that the first instance in this process where you're using AI? Is that what we're talking about here?

Alice Zhang: Yeah. I mean, I think AI is a really broad term for any kind of process where the computer is learning from something. So there are all sorts of applications of AI in this entire process: for example, how we're integrating the data sets together, and how we're inferring what the central nodes or the key targets are. I would say the most classical use of AI, the way most people think of it, is then, once we have this network of, say, one hundred genes, how do we actually find what the cause is? How do we find the hub, the right target to hit to turn off or on all hundred of those genes?
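One toy way to make "finding the hub" concrete: treat the disease network as a graph of interacting genes and score each gene by how central it is, on the theory that a node many paths run through is a good candidate regulator. A minimal sketch with networkx; the gene names and wiring are invented, and this illustrates the general idea rather than Verge's actual algorithm.

```python
import networkx as nx

# Toy disease network: nodes are genes, edges are inferred interactions.
# Names and edges are invented for illustration.
network = nx.Graph()
network.add_edges_from([
    ("GENE_A", "GENE_B"), ("GENE_A", "GENE_C"), ("GENE_A", "GENE_D"),
    ("GENE_B", "GENE_E"), ("GENE_C", "GENE_E"), ("GENE_D", "GENE_F"),
    ("GENE_A", "GENE_E"), ("GENE_E", "GENE_F"),
])

# Betweenness centrality: how many shortest paths between other genes
# pass through each gene. High scorers are hub candidates.
centrality = nx.betweenness_centrality(network)
top_hubs = sorted(centrality.items(), key=lambda kv: kv[1], reverse=True)
for gene, score in top_hubs[:3]:
    print(f"{gene}: centrality {score:.2f}")
```

A real pipeline would weight the edges, fold in the genetic and clinical evidence, and learn the scoring rather than hard-code it.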
Alice Zhang: And that's where machine learning and AI come in handy.

Jacob Goldstein: In a minute, Alice explains how this actually works in the case of the ALS drug Verge is working on. Now, back to the show.

So, okay: Alice and her colleagues at Verge have collected all these tissue samples from ALS patients. They've used the samples to generate this huge data set that shows genetic variation and changes in how genes are expressed, along with lots of clinical data about the patients. And then they build these, basically these AI models, to try to figure out where in the complicated biological process that's happening in this disease they should try to intervene with a drug. Basically, where they should try and target a drug.

Alice Zhang: I think of this oftentimes like a map of all the airports in the US. You want to figure out how to go after the hubs, like Chicago or New York. You don't want to go after an airport in Kansas or Iowa; that wouldn't be very effective at stopping airplane travel in the country. So there's a lot of different pieces of information that we collect to then infer which are the best genes: not only central within this network, but also with independent evidence of a disease-causal effect or a relationship to disease.

Jacob Goldstein: And so you do all that in this instance, and what do you figure out?

Alice Zhang: So what the algorithms spit out is essentially a ranked list of targets. These are targets that are predicted, if we could drug them, to restore that network back to the levels of healthy people and potentially slow or stop the disease. And then what we do is take those targets and start testing them in the lab. What is kind of cool about the platform is that we get all these targets from human brain tissue, and we also can test them in human brain cells in the lab.
Jacob Goldstein: So you get a list. It's basically genes to target. It either says upregulate, make this gene express more, or make this gene express less. Is that basically what the AI is outputting?

Alice Zhang: Exactly.

Jacob Goldstein: So how long, in the instance of this ALS drug, how long was the list, more or less?

Alice Zhang: Our initial set was twenty-two high-confidence targets, and then we generated another cut, using updated data, of about thirty more targets as well. And what was really striking is that when we tested these targets in the lab, we found that on average over sixty percent of them, though more recently actually around eighty percent of them, validated in the lab. So they actually protected ALS patient cells from dying, which is very high. So we're really excited that we're seeing very robust validation of the computational predictions, at least in the lab.

Jacob Goldstein: Okay, so you have this list, you're testing it, something like half of them seem promising. You said sixty percent seem promising. What happens next?

Alice Zhang: Okay. So what happens next is that we test them in these human brain cells and we understand the mechanism. One of the really interesting findings from this ALS program in particular is that when we looked at the network that we found in these patients' spinal cords, we found a new cause of disease that was previously unknown. Most of the hypotheses in ALS to date have really been focused around these protein aggregates, these clumps of proteins that you can easily observe by eye in ALS patients. A lot of them are observational hypotheses. But what we found by looking at a deeper cut of the data is that, at baseline, most of these patients actually had a dysfunction in their lysosomal pathway, which I like to call the garbage disposal pathway. It's what's critical for clearing junk out of the cell.
And because patients were at baseline vulnerable to these toxic insults, it wasn't so much the protein clumps that were directly causing it. It was because the cells were already vulnerable to these clumps of proteins that they started dying.

Jacob Goldstein: And is the idea that the gene you're targeting is causing the cell's garbage disposal to not work? Like, you're trying to fix the garbage disposal by targeting this particular gene?

Alice Zhang: Yeah, it's a central regulator of that pathway. And it was also a target that was ranked, I think, number one or number two on the list.

Jacob Goldstein: So, just to be clear, how do you get from, you know, fifty or so things to test, fifty or so targets, something like thirty of them seeming promising. How do you decide which of those thirty to proceed with?

Alice Zhang: Yeah, that's a great question. We get asked that a lot. I think at that point it's a strategic decision, right? We're a startup. We have to be able to develop things quickly and capital-efficiently. So we were lucky in that sense that one of the top targets was also a target where the path to developing a drug was relatively smooth. A lot was known about that target. We could start doing chemistry and designing molecules relatively easily, and the target itself had actually been tested in the clinic for other diseases, not ALS, but things like Crohn's disease, so we did know there was some safety data around hitting that target. We can't develop all of the targets ourselves; we can only take focused bets. For targets where there's a bit more technical risk, right, where it might be a bit more exotic, people don't really understand how it works, and there aren't a lot of tools out there to develop drugs against it, that's where we might partner with a pharma company to develop those targets.
And we have such a collaboration with Eli Lilly, where we developed our ALS target, but Lilly has the opportunity to essentially take targets number three through twenty-two-plus and choose four of them to develop themselves.

Jacob Goldstein: Oh, interesting. So in that way, you're essentially laying off the risk to this giant pharma company that can afford to make more bets.

Alice Zhang: I'd say we're distributing the risk, and it allows us to really capitalize on the entire opportunity, all of the targets, because it's impossible for any small startup to do, you know, thirty different programs. And it's actually in line with what a lot of pharma companies are looking for. A lot of pharma companies are asking: what is that novel target that no one else is working on, that's kind of unexpected, where if we could really get a competitive edge, it would be really meaningful for a position within drug development in the next ten years?

Jacob Goldstein: Well, and I mean, it also seems compelling because, even though this seems like a more promising way to do drug development, drug development is hard enough that any one candidate drug is probably not going to work, right?

Alice Zhang: Yeah. Any biotech needs to be able to have a pipeline and the ability to withstand some failures, because I think it's unrealistic to expect that one hundred percent of what you try will work. But that doesn't reflect on the technology itself, and that can be something unfortunate in biotech, where, you know, if the first thing fails, everyone can be tempted to say, oh, the technology didn't work. But in reality, think about how many different drugs pharma companies test all the time, right? So I think really promising technologies need to be afforded that runway, that ability to take multiple shots on goal, before you can get to the end and really see if it's working.
Jacob Goldstein: Well, I mean, if ninety percent of traditionally developed drugs fail once they get to clinical trials, you could be way better and still be likely to fail on any one drug.

Alice Zhang: Yeah. Even fifty percent would be huge, right? But still, that means one out of two drugs will fail.

Jacob Goldstein: Relative to the world we live in now, a world where one out of two drugs fail could be a world where we get more new drugs for less money. In a minute, the Lightning Round, including the worst thing about being named to the Forbes Thirty Under Thirty, and the best thing about accepting that your company might fail.

That's the end of the ads. Now we're going back to the show. Let's close with the Lightning Round. You personally interviewed over a thousand people when you were starting your company, as I understand it, which seems very intense. And I'm curious if there's anything in your life outside of work where you've been that intense.

Alice Zhang: Oh, everything. That is core to my being. If you asked my spouse, he would say any new game that we start playing. I'm very competitive, and it's just part of my being. I iterate, I get a lot of reps in. He always likes to make fun of me, that I have an AI in my head: I'm constantly learning and improving the model until eventually I become a lean, mean... We've been playing a lot of Catan recently, and I think I've beaten him fifteen times in a row. So yeah, I am very intense and thorough in my life.

Jacob Goldstein: Is ChatGPT overrated or underrated?

Alice Zhang: Both, actually. I think it's both over- and underrated. It's overrated for some applications and underrated for others. I think it's overrated for things where there isn't a lot of information already available on that thing. I think it's underrated for applications like coding, where there's already a large body of work out there. So it's really good at replicating things that exist, less good at discovering new things that don't exist.
Jacob Goldstein: I read an interview where you said one of the things you've learned in running your company is to be okay with your company dying, with your company not making it, which I found very surprising and interesting. Can you just tell me a little bit about that?

Alice Zhang: Yeah. I mean, I think it gets to really the core of how we drive our culture, which is: for so long, companies have been driven through fear and bravado, you know, "we're crushing it," pounding our chests talking about how we're crushing it, and less through emotional vulnerability and introspection and self-awareness. Ultimately, I found the thing that really transformed my leadership style was learning where I had a grip on things, where I was really attached to outcomes. And I think for all CEOs, a lot of that is tying meaning to what happens with the company: if the company fails, it means something about me as a person. I think that stifles a ton of innovation and curiosity and tends to drive those cultures of fear. So ultimately, the thing, for example, that got me to stop micromanaging was really being okay with the company dying, because ultimately, what is micromanaging if not just fear, or control? And once you let go of that fear and you recognize you're just open to learning, you can still really want the company to succeed, and you can be passionate about it, but you're no longer thinking, oh, I'm screwed, or I'm a failure if this fails. And that just opens a whole new level of levity and lightness.

Jacob Goldstein: Nice. What's the worst thing about being named to the Forbes Thirty Under Thirty list?

Alice Zhang: I think they did a photo shoot where there was a very revealing slit on the dress, and I still get constantly made fun of by my close friends for that.

Jacob Goldstein: What's one example of a thing that went wrong as you were building the company? Something bad that happened?

Alice Zhang: Oh, so many things. We had a whole period where there was a ton of attrition, people leaving, and, you know, the first time that happens to a founder... I took it personally. It's like someone leaving your baby, and you wonder why. That was actually a huge growth moment for me, because I was for so long trying to put forth a strong face: it's okay, it's okay. And finally, at the end of about a month of this, I just sat in front of the company at an all-hands and, honestly, I broke down in tears. I said, I feel like I failed you guys. You know, I'm still grieving this. I really don't know what to do. And paradoxically, it was in that moment that most of the team really rose to the occasion, and I found support from the team in ways I didn't even know were possible.

Jacob Goldstein: Alice Zhang is the CEO and co-founder of Verge Genomics. Today's show was produced by Edith Russolo. It was edited by Sarah Nix and Lydia Geancott and engineered by Amanda Ka Wong. We're always looking for more guests for the show. If there's someone out there working on an interesting technical problem with big stakes, tell us about that person. You can email us at problem at pushkin dot fm, or you can find me on Twitter at Jacob Goldstein. I'm Jacob Goldstein, and we'll be back next week with another episode of What's Your Problem.