1 00:00:15,356 --> 00:00:22,596 Speaker 1: Pushkin. You want to start listing off the companies of 2 00:00:22,636 --> 00:00:24,436 Speaker 1: which you're a founder and a co-founder and stop 3 00:00:24,476 --> 00:00:25,436 Speaker 1: wherever you get tired. 4 00:00:26,396 --> 00:00:28,076 Speaker 2: I mean I can try to do that, but that 5 00:00:28,196 --> 00:00:29,476 Speaker 2: might take some time. 6 00:00:30,116 --> 00:00:33,436 Speaker 1: Give me a handful. Count off five on your fingers, 7 00:00:33,476 --> 00:00:34,036 Speaker 1: just for sure. 8 00:00:34,516 --> 00:00:42,116 Speaker 2: Sure, well, Moderna, Momenta, PureTech, Seer, Living Proof. 9 00:00:42,676 --> 00:00:44,076 Speaker 1: For a second I thought you were just going to do 10 00:00:44,116 --> 00:00:45,916 Speaker 1: the Ms, which might have been. 11 00:00:45,836 --> 00:00:48,156 Speaker 2: A while I could, I could do with the aged. 12 00:00:50,036 --> 00:00:54,316 Speaker 1: Robert Langer has founded or co-founded something like forty companies. 13 00:00:54,756 --> 00:00:58,196 Speaker 1: He is an Institute Professor at MIT. He holds over 14 00:00:58,276 --> 00:01:01,916 Speaker 1: a thousand patents, and his research has been cited more 15 00:01:01,956 --> 00:01:06,556 Speaker 1: than four hundred thousand times. But when he started his 16 00:01:06,636 --> 00:01:09,836 Speaker 1: career in the nineteen seventies, he didn't seem bound for 17 00:01:09,916 --> 00:01:13,196 Speaker 1: professional glory. He had a hard time finding a job, 18 00:01:13,356 --> 00:01:16,156 Speaker 1: he couldn't get funding for his research, and his patent 19 00:01:16,156 --> 00:01:20,516 Speaker 1: applications kept getting rejected. And I think these two things, 20 00:01:21,076 --> 00:01:25,356 Speaker 1: his early struggles and his later massive success, are in 21 00:01:25,396 --> 00:01:29,396 Speaker 1: fact closely connected.
Langer was trying to do something that 22 00:01:29,556 --> 00:01:33,636 Speaker 1: was deeply and profoundly different than what anybody had done before. 23 00:01:33,956 --> 00:01:37,276 Speaker 1: Almost nobody understood it. Almost nobody knew what to do 24 00:01:37,356 --> 00:01:41,076 Speaker 1: with him, and then when his work finally did succeed, 25 00:01:41,636 --> 00:01:44,956 Speaker 1: it was such a new, powerful discovery that people are 26 00:01:45,036 --> 00:01:54,076 Speaker 1: still building on it today half a century later. I'm 27 00:01:54,156 --> 00:01:56,916 Speaker 1: Jacob Goldstein and this is What's Your Problem, the show 28 00:01:56,956 --> 00:01:59,076 Speaker 1: where I talk to people who are trying to make 29 00:01:59,156 --> 00:02:04,156 Speaker 1: technological progress. Robert Langer is still working, still doing research, 30 00:02:04,276 --> 00:02:06,956 Speaker 1: still founding companies, and we talked about some of his 31 00:02:07,076 --> 00:02:10,076 Speaker 1: current work in the later part of our conversation. But 32 00:02:10,236 --> 00:02:13,476 Speaker 1: to start, we went back to the mid nineteen seventies 33 00:02:13,756 --> 00:02:17,316 Speaker 1: when Langer got his doctorate in chemical engineering and he 34 00:02:17,396 --> 00:02:21,196 Speaker 1: did something that at the time was really unusual. He 35 00:02:21,236 --> 00:02:24,676 Speaker 1: did a postdoc with a medical school professor, a pediatric 36 00:02:24,756 --> 00:02:30,956 Speaker 1: surgeon named Judah Folkman. Langer's field is bioengineering, basically bringing 37 00:02:30,996 --> 00:02:34,876 Speaker 1: the tools of engineering to the fields of biology and medicine. 
38 00:02:35,236 --> 00:02:39,276 Speaker 1: And bioengineering is a huge field today, but it barely 39 00:02:39,316 --> 00:02:43,276 Speaker 1: existed back when Langer started his postdoc with that doctor, 40 00:02:43,396 --> 00:02:48,956 Speaker 1: Judah Folkman, and bioengineering was exactly what Judah Folkman needed. 41 00:02:49,396 --> 00:02:52,116 Speaker 1: Folkman had an idea for a new kind of drug, 42 00:02:52,636 --> 00:02:54,916 Speaker 1: and this kind of drug was a molecule that was 43 00:02:54,996 --> 00:02:57,756 Speaker 1: too big and complex to be given as a pill. 44 00:02:58,316 --> 00:03:01,756 Speaker 1: So Folkman needed somebody who could figure out how to 45 00:03:01,876 --> 00:03:05,516 Speaker 1: deliver this new kind of drug to patients. As you'll hear, 46 00:03:05,996 --> 00:03:10,676 Speaker 1: that delivery problem was fundamentally an engineering problem, and when 47 00:03:10,796 --> 00:03:14,756 Speaker 1: Langer solved that problem, he created an entirely new way 48 00:03:14,836 --> 00:03:18,716 Speaker 1: to get medicine to patients, and it proved incredibly useful. 49 00:03:22,436 --> 00:03:24,916 Speaker 1: Tell me about being an engineer and going off to 50 00:03:24,956 --> 00:03:28,356 Speaker 1: work in the nineteen seventies in the lab of a physician. 51 00:03:29,396 --> 00:03:31,276 Speaker 2: On the one hand, for me, it was very hard 52 00:03:31,356 --> 00:03:34,556 Speaker 2: because I had to learn a lot about medical things 53 00:03:34,556 --> 00:03:39,236 Speaker 2: and I didn't know very much biology, so that was 54 00:03:39,396 --> 00:03:43,836 Speaker 2: difficult. But on the other hand, being an engineer, 55 00:03:43,876 --> 00:03:46,276 Speaker 2: I guess I had a different perspective, you know, in that 56 00:03:46,396 --> 00:03:51,156 Speaker 2: I didn't maybe think the same way as a clinician 57 00:03:51,276 --> 00:03:55,676 Speaker 2: or surgeon or a biologist.
You know, engineers, they solve problems, 58 00:03:55,716 --> 00:03:58,836 Speaker 2: and Judah Folkman, who was my boss at the time, 59 00:03:58,956 --> 00:04:01,116 Speaker 2: I mean, that's what he wanted. He wanted to see 60 00:04:01,116 --> 00:04:02,036 Speaker 2: a problem solved. 61 00:04:02,836 --> 00:04:06,476 Speaker 1: So let's talk specifically about that problem. What did you, 62 00:04:06,796 --> 00:04:08,996 Speaker 1: what did you go to doctor Folkman's lab to work on? 63 00:04:09,996 --> 00:04:12,636 Speaker 2: Doctor Folkman had this idea that if you could stop 64 00:04:12,676 --> 00:04:17,236 Speaker 2: blood vessels, you could stop cancer. Most people 65 00:04:17,236 --> 00:04:20,476 Speaker 2: didn't think he was right. In fact, he went further. 66 00:04:20,596 --> 00:04:24,956 Speaker 2: He said that the reason blood vessels come to the 67 00:04:24,996 --> 00:04:29,036 Speaker 2: tumor is that the tumor makes a chemical signal he 68 00:04:29,076 --> 00:04:32,916 Speaker 2: called the tumor angiogenesis factor, and he said that was 69 00:04:32,996 --> 00:04:36,996 Speaker 2: chemically mediated. And also the idea that he thought about 70 00:04:37,116 --> 00:04:40,796 Speaker 2: is, if that was chemically mediated, maybe stopping it could 71 00:04:40,836 --> 00:04:45,236 Speaker 2: also be chemically mediated. So my job really, in a 72 00:04:45,236 --> 00:04:47,876 Speaker 2: way, was to prove that he was right, because almost 73 00:04:47,916 --> 00:04:50,716 Speaker 2: everybody told him he was wrong, and in so doing 74 00:04:50,836 --> 00:04:55,116 Speaker 2: isolate the first, you know, blood vessel or angiogenesis inhibitor. 75 00:04:55,756 --> 00:04:56,156 Speaker 2: Uh huh.
76 00:04:56,196 --> 00:04:59,436 Speaker 1: And so it's basically that there is this theory that 77 00:04:59,476 --> 00:05:03,716 Speaker 1: he had that tumors stimulate the growth of new blood vessels, 78 00:05:04,276 --> 00:05:07,596 Speaker 1: and then if that's true, perhaps you could inhibit the 79 00:05:07,636 --> 00:05:11,356 Speaker 1: growth of new blood vessels and thereby inhibit the growth of tumors. Right. 80 00:05:13,116 --> 00:05:17,116 Speaker 1: And so you get there, and I'm interested in that 81 00:05:17,236 --> 00:05:20,316 Speaker 1: inhibition piece, right, because that seems like that's where you're really, 82 00:05:20,396 --> 00:05:24,996 Speaker 1: in a very direct way, bringing your engineering skills to 83 00:05:25,076 --> 00:05:27,796 Speaker 1: bear on this medical problem. So, like, talk about that 84 00:05:27,836 --> 00:05:29,556 Speaker 1: side of it and how you approached it. 85 00:05:30,476 --> 00:05:32,836 Speaker 2: So what we wanted to do was have a little 86 00:05:32,956 --> 00:05:37,876 Speaker 2: nanoparticle or microparticle that could deliver different molecules I was isolating, 87 00:05:39,116 --> 00:05:42,196 Speaker 2: and these were fairly large molecules, so that was really 88 00:05:42,236 --> 00:05:44,676 Speaker 2: the idea, and then see, could it stop the blood vessels? 89 00:05:45,036 --> 00:05:50,196 Speaker 1: So this core idea of developing a nanoparticle to deliver 90 00:05:50,956 --> 00:05:56,156 Speaker 1: a large molecule, basically a complicated drug, is an engineering problem, right, 91 00:05:56,196 --> 00:05:59,516 Speaker 1: this nanoparticle. It's like, we've got this drug, 92 00:05:59,676 --> 00:06:01,476 Speaker 1: call it, that we think might be able to stop 93 00:06:01,516 --> 00:06:04,316 Speaker 1: tumor growth, but how do we get it to the tumor? Right?
94 00:06:04,356 --> 00:06:07,196 Speaker 1: That is a basic problem that you were coming up 95 00:06:07,196 --> 00:06:11,476 Speaker 1: against early in your research, and that problem winds 96 00:06:11,556 --> 00:06:13,116 Speaker 1: up being a big deal, right, and the way you 97 00:06:13,156 --> 00:06:16,196 Speaker 1: go about solving that problem winds up being a big deal. 98 00:06:16,276 --> 00:06:18,236 Speaker 1: So tell me about that. 99 00:06:19,476 --> 00:06:27,276 Speaker 2: Well, the nanoparticles and microparticles, really it's taking molecules, drugs, encapsulating, 100 00:06:27,356 --> 00:06:32,076 Speaker 2: surrounding them with a lipid or polymer, and delivering it 101 00:06:32,196 --> 00:06:35,556 Speaker 2: to cells or a patient or an animal. 102 00:06:36,156 --> 00:06:38,436 Speaker 1: And a lipid or a polymer is basically some fat 103 00:06:38,556 --> 00:06:39,356 Speaker 1: or some plastic. 104 00:06:40,196 --> 00:06:43,516 Speaker 2: Yeah, yeah, a lipid is some fat and a polymer some plastic, 105 00:06:43,636 --> 00:06:48,236 Speaker 2: generally speaking. So, yeah, if you gave the drug by 106 00:06:48,276 --> 00:06:51,356 Speaker 2: itself and it wasn't packaged in those particles, it would 107 00:06:51,396 --> 00:06:55,996 Speaker 2: just get destroyed. I mean, so the number one reason 108 00:06:56,036 --> 00:06:59,716 Speaker 2: you do it is to protect it, you know, otherwise 109 00:06:59,756 --> 00:07:03,316 Speaker 2: it'll just get destroyed, probably almost immediately.
110 00:07:03,916 --> 00:07:07,716 Speaker 2: So you know, we asked people who are experts in 111 00:07:07,756 --> 00:07:10,316 Speaker 2: that area, Nobel prize winners and others who had 112 00:07:10,356 --> 00:07:15,996 Speaker 2: done work or at least helped on delivery of small molecules, 113 00:07:16,276 --> 00:07:18,636 Speaker 2: and we asked them about that, but they all told 114 00:07:18,716 --> 00:07:22,836 Speaker 2: us it wasn't possible. So I spent years in the 115 00:07:22,916 --> 00:07:27,156 Speaker 2: laboratory experimenting, finding hundreds of different ways, failing hundreds of 116 00:07:27,156 --> 00:07:32,636 Speaker 2: different times, but finally I was successful. And you know, 117 00:07:32,676 --> 00:07:36,276 Speaker 2: we published a paper in Nature in nineteen seventy six, 118 00:07:36,756 --> 00:07:40,276 Speaker 2: the journal Nature, and showed for the first time that 119 00:07:40,316 --> 00:07:43,476 Speaker 2: you could deliver large molecules this way. And we published 120 00:07:43,516 --> 00:07:46,756 Speaker 2: a paper in Science in nineteen seventy six showing for 121 00:07:46,836 --> 00:07:49,716 Speaker 2: the first time that you could stop blood vessels by 122 00:07:49,796 --> 00:07:50,956 Speaker 2: using approaches like this. 123 00:07:51,516 --> 00:07:54,996 Speaker 1: And as I understand it, even after you published those papers, 124 00:07:55,596 --> 00:07:56,916 Speaker 1: you met a lot of resistance. 125 00:07:57,916 --> 00:08:00,876 Speaker 2: Yeah, I did. I suppose I met a lot of 126 00:08:00,876 --> 00:08:04,756 Speaker 2: resistance in a couple of ways. First, different people didn't 127 00:08:04,796 --> 00:08:06,636 Speaker 2: agree with it or didn't believe it or didn't think 128 00:08:06,676 --> 00:08:11,076 Speaker 2: it was possible. Secondly, my background really wasn't right.
I 129 00:08:11,116 --> 00:08:15,756 Speaker 2: suppose, for the different review sections, I was an engineer, 130 00:08:15,796 --> 00:08:20,836 Speaker 2: and when we sent the grants to the National Institutes 131 00:08:20,836 --> 00:08:23,156 Speaker 2: of Health and places like that, you know, they had 132 00:08:23,196 --> 00:08:26,436 Speaker 2: medical people or biological people reviewing it, and they said, well, 133 00:08:26,756 --> 00:08:28,916 Speaker 2: what can an engineer do? You know, he doesn't know anything 134 00:08:28,956 --> 00:08:33,276 Speaker 2: about biology or oncology. Separately, I met a lot of 135 00:08:33,476 --> 00:08:36,676 Speaker 2: resistance when I tried to do this, get a job 136 00:08:36,716 --> 00:08:40,396 Speaker 2: in an engineering department, a chemical 137 00:08:40,476 --> 00:08:44,836 Speaker 2: engineering department. They said, well, engineers really don't do experimental biology. 138 00:08:44,876 --> 00:08:48,036 Speaker 2: So I didn't get any faculty positions in a chemical 139 00:08:48,036 --> 00:08:51,116 Speaker 2: engineering department for a very, very long time. I ended 140 00:08:51,196 --> 00:08:52,476 Speaker 2: up in a nutrition department. 141 00:08:52,716 --> 00:08:57,476 Speaker 1: And so, I mean, this idea of bioengineering, that is 142 00:08:57,516 --> 00:09:00,116 Speaker 1: a big deal now and was very novel then. It 143 00:09:00,156 --> 00:09:02,636 Speaker 1: feels like you're sort of coming up against this problem 144 00:09:02,716 --> 00:09:06,436 Speaker 1: of creating a field that doesn't quite exist yet, or 145 00:09:06,476 --> 00:09:08,236 Speaker 1: at least creating a part of a field that doesn't 146 00:09:08,276 --> 00:09:10,396 Speaker 1: exist yet, which seems like, on the one hand, the 147 00:09:10,436 --> 00:09:13,476 Speaker 1: opportunity to solve very large problems was clearly there.
On 148 00:09:13,516 --> 00:09:16,676 Speaker 1: the other hand, the kind of institutional structure to allow 149 00:09:16,716 --> 00:09:19,836 Speaker 1: that to happen was not on your side. 150 00:09:19,996 --> 00:09:22,796 Speaker 2: Yeah, you're right. I mean, there had been people 151 00:09:22,876 --> 00:09:27,236 Speaker 2: in chemical engineering doing work on what I'd call mathematical modeling, 152 00:09:27,276 --> 00:09:31,716 Speaker 2: you know, transport of molecules, but experimental stuff, inventing things, 153 00:09:31,756 --> 00:09:36,596 Speaker 2: and discovering, you know, new molecules, that certainly had 154 00:09:36,836 --> 00:09:40,036 Speaker 2: not been done, never been done, in chemical engineering up 155 00:09:40,116 --> 00:09:45,836 Speaker 2: till that time. So that ended up being hard. 156 00:09:46,196 --> 00:09:48,836 Speaker 1: Is it right that some of your early patent applications 157 00:09:48,836 --> 00:09:50,756 Speaker 1: around this technology were also rejected? 158 00:09:51,436 --> 00:09:55,636 Speaker 2: Yeah, they were. I mean the first, the main patent 159 00:09:55,676 --> 00:09:58,876 Speaker 2: on it got rejected five times in a row. But 160 00:09:59,276 --> 00:10:03,156 Speaker 2: you know, sometimes that happens. After that, I think 161 00:10:03,196 --> 00:10:05,276 Speaker 2: I had one, when we came up with the idea 162 00:10:05,276 --> 00:10:08,276 Speaker 2: of tissue engineering, I think that got rejected even more. 163 00:10:08,396 --> 00:10:10,596 Speaker 2: So those things happen. 164 00:10:10,676 --> 00:10:13,436 Speaker 1: And so you get the patents and you end up 165 00:10:13,476 --> 00:10:18,476 Speaker 1: licensing the technology initially to one or more big companies, right, 166 00:10:18,516 --> 00:10:20,796 Speaker 1: big pharmaceutical companies. What happens with that?
167 00:10:21,436 --> 00:10:24,396 Speaker 2: Yeah, well, actually the hospital did that, the license, because 168 00:10:24,556 --> 00:10:27,276 Speaker 2: I mean, the patent is in my name, but they licensed it, 169 00:10:27,716 --> 00:10:30,396 Speaker 2: you know. Well, I was very excited about that. 170 00:10:30,476 --> 00:10:34,236 Speaker 2: There were two multi-billion-dollar companies, one in animal 171 00:10:34,316 --> 00:10:37,916 Speaker 2: health, one in human health. You know, so they 172 00:10:37,956 --> 00:10:40,236 Speaker 2: gave me a consulting fee. They gave me actually a 173 00:10:40,316 --> 00:10:43,876 Speaker 2: very large grant, which for a young professor, you know, was terrific. 174 00:10:44,196 --> 00:10:46,476 Speaker 2: Most importantly, they were going to work on it, and 175 00:10:46,516 --> 00:10:48,996 Speaker 2: they did work on it for maybe up to a year, 176 00:10:49,116 --> 00:10:52,076 Speaker 2: but then they just gave up. So I got the 177 00:10:52,116 --> 00:10:54,956 Speaker 2: grant and the consulting fee, but I didn't get what 178 00:10:54,996 --> 00:10:57,396 Speaker 2: I wanted most, which was to see the work that 179 00:10:57,476 --> 00:10:59,076 Speaker 2: we did make a difference in the world. 180 00:10:59,476 --> 00:11:01,596 Speaker 1: Were you surprised when they gave up? What was your 181 00:11:01,636 --> 00:11:02,916 Speaker 1: response when they gave up? 182 00:11:03,916 --> 00:11:05,876 Speaker 2: I guess I was sad. I don't know that I 183 00:11:05,916 --> 00:11:09,596 Speaker 2: was surprised. I certainly have seen plenty of places give 184 00:11:09,676 --> 00:11:12,436 Speaker 2: up before, but it made me sad. I really thought 185 00:11:12,596 --> 00:11:16,636 Speaker 2: that the way of moving things forward was 186 00:11:17,196 --> 00:11:21,436 Speaker 2: having companies, you know, take what you published and what 187 00:11:21,476 --> 00:11:25,436 Speaker 2: you did and develop it. But I was mostly sad.
188 00:11:25,556 --> 00:11:27,596 Speaker 1: So how do you get from there to starting your 189 00:11:27,636 --> 00:11:28,276 Speaker 1: first company? 190 00:11:29,356 --> 00:11:31,876 Speaker 2: Yeah, well, a good friend of mine, Alex Klibanov, 191 00:11:31,876 --> 00:11:35,036 Speaker 2: he was a professor in that nutrition department; later he 192 00:11:35,116 --> 00:11:39,076 Speaker 2: was a professor in the chemistry department at MIT. He 193 00:11:39,196 --> 00:11:42,276 Speaker 2: said to me one day, after this happened, he said, well, Bob, 194 00:11:42,316 --> 00:11:45,596 Speaker 2: we should start our own company. So I thought, yeah, 195 00:11:46,476 --> 00:11:48,716 Speaker 2: if you're not your own champion, nobody else is going 196 00:11:48,756 --> 00:11:52,716 Speaker 2: to be. So we did, and I got a number 197 00:11:52,716 --> 00:11:55,596 Speaker 2: of my students to join that company, and they were 198 00:11:55,676 --> 00:11:59,556 Speaker 2: very excited about it, so that ended up, you know, 199 00:11:59,596 --> 00:12:01,356 Speaker 2: they weren't going to give up very easily. 200 00:12:02,236 --> 00:12:05,476 Speaker 1: And so you keep working on this original idea of 201 00:12:06,396 --> 00:12:10,356 Speaker 1: a particle that can deliver a drug, a large-molecule 202 00:12:10,436 --> 00:12:15,196 Speaker 1: drug basically, and when does it become clear to you 203 00:12:15,276 --> 00:12:16,356 Speaker 1: that it's going to work? 204 00:12:17,516 --> 00:12:20,516 Speaker 2: Well, actually, for me, it was pretty clear it was 205 00:12:20,556 --> 00:12:21,756 Speaker 2: going to work when we wrote that 206 00:12:21,756 --> 00:12:22,916 Speaker 1: early paper in Nature. 207 00:12:23,596 --> 00:12:25,876 Speaker 2: I mean, I'd seen it with my own eyes. 208 00:12:25,956 --> 00:12:28,596 Speaker 2: I put certain types of, well, I'm trying to think how 209 00:12:28,636 --> 00:12:31,396 Speaker 2: I'd explain it. I put certain enzymes.
Those are all large 210 00:12:31,396 --> 00:12:36,516 Speaker 2: molecules, in these materials, and I had this test that 211 00:12:36,596 --> 00:12:39,836 Speaker 2: would turn color if the enzymes were coming out, and 212 00:12:40,356 --> 00:12:43,716 Speaker 2: I got to see it not work many, many, many times, 213 00:12:43,836 --> 00:12:46,916 Speaker 2: hundreds of times. But finally I did see it work, 214 00:12:46,956 --> 00:12:51,556 Speaker 2: and so I didn't see how this couldn't work, you know, 215 00:12:51,596 --> 00:12:53,676 Speaker 2: since I saw it with my own eyes. But 216 00:12:53,756 --> 00:12:56,356 Speaker 2: that didn't mean that other people were going to necessarily 217 00:12:56,396 --> 00:13:00,356 Speaker 2: believe it. But I did, and you know, I had 218 00:13:00,396 --> 00:13:04,436 Speaker 2: people still, ten, fifteen years later, tell me it couldn't possibly 219 00:13:04,476 --> 00:13:08,436 Speaker 2: be right. I mean, very experienced people. But you know, 220 00:13:08,516 --> 00:13:11,476 Speaker 2: that's the world. I mean, a lot of times there's skepticism. 221 00:13:11,916 --> 00:13:14,836 Speaker 1: And what was the first drug from that idea that 222 00:13:15,556 --> 00:13:18,916 Speaker 1: made it to the market, that made it to patients? 223 00:13:19,436 --> 00:13:22,316 Speaker 2: You know, we had this collaboration with a company called 224 00:13:22,356 --> 00:13:26,516 Speaker 2: Takeda, a Japanese company, and they had sent people 225 00:13:26,556 --> 00:13:28,996 Speaker 2: to our lab every year and we got grants 226 00:13:28,996 --> 00:13:33,236 Speaker 2: from them, and they created what's called Lupron Depot, and 227 00:13:33,316 --> 00:13:39,076 Speaker 2: that ultimately did get approved, and versions 228 00:13:39,116 --> 00:13:40,916 Speaker 2: of it are still widely used today. 229 00:13:41,356 --> 00:13:43,196 Speaker 1: What kind of patients did that treat?
What did that 230 00:13:43,276 --> 00:13:43,676 Speaker 1: drug do? 231 00:13:44,676 --> 00:13:48,316 Speaker 2: It was a way to treat advanced prostate cancer and endometriosis. 232 00:13:48,556 --> 00:13:51,836 Speaker 1: And was it the anti-angiogenesis? Was it inhibiting the 233 00:13:51,876 --> 00:13:53,916 Speaker 1: formation of blood vessels, or was it something else? 234 00:13:53,876 --> 00:13:56,676 Speaker 2: No, no, it was affecting hormones. 235 00:13:56,756 --> 00:14:01,316 Speaker 2: It was a different hormonal thing. The angiogenesis ones, 236 00:14:01,436 --> 00:14:06,076 Speaker 2: there, you know, other people used the assays we developed 237 00:14:06,076 --> 00:14:08,396 Speaker 2: and other things that we did, and things that they 238 00:14:08,436 --> 00:14:12,556 Speaker 2: did themselves, and they would ultimately get many drugs approved, 239 00:14:13,516 --> 00:14:16,116 Speaker 2: but it took many, many years. That didn't take place 240 00:14:16,236 --> 00:14:17,596 Speaker 2: till two thousand and four. 241 00:14:18,236 --> 00:14:22,276 Speaker 1: Can you just list off some of the conditions, diseases, 242 00:14:22,356 --> 00:14:25,436 Speaker 1: that are treated with this, you know, technology, and the 243 00:14:25,436 --> 00:14:27,396 Speaker 1: offshoots of this technology that you came 244 00:14:27,316 --> 00:14:31,716 Speaker 2: up with. Well, prostate cancer and endometriosis. I mean, there 245 00:14:31,716 --> 00:14:42,236 Speaker 2: are treatments for heart diseases, different eye diseases, schizophrenia, opioid addiction, osteoarthritis, diabetes. 246 00:14:42,436 --> 00:14:45,156 Speaker 2: I mean, I'm sure I'm leaving out a lot, 247 00:14:45,916 --> 00:14:46,596 Speaker 2: but those are some. 248 00:14:49,116 --> 00:14:51,796 Speaker 1: Still to come on the show: how Robert Langer wound 249 00:14:51,836 --> 00:15:08,916 Speaker 1: up creating forty companies, also the research he's excited about today.
250 00:15:09,196 --> 00:15:13,116 Speaker 1: So after you started that one initial company, you wound 251 00:15:13,156 --> 00:15:16,436 Speaker 1: up starting or being a co-founder of a lot 252 00:15:16,516 --> 00:15:18,036 Speaker 1: of companies. I don't have the number in front of me. 253 00:15:18,076 --> 00:15:20,796 Speaker 1: Is dozens the right order of magnitude? 254 00:15:20,556 --> 00:15:22,756 Speaker 2: Yeah, forty, forty-one. 255 00:15:22,876 --> 00:15:28,796 Speaker 1: Yeah. Like, how did that happen? Well, 256 00:15:28,796 --> 00:15:29,956 Speaker 1: it's a lot of companies. 257 00:15:30,396 --> 00:15:32,916 Speaker 2: Yeah, but it's over, it's over close to a forty 258 00:15:32,956 --> 00:15:33,596 Speaker 2: year period. 259 00:15:33,796 --> 00:15:36,116 Speaker 1: Well, a company a year seems like a lot to me. 260 00:15:36,236 --> 00:15:36,596 Speaker 1: I don't know. 261 00:15:36,796 --> 00:15:39,196 Speaker 2: Yeah, well, I mean, I have a big lab, I 262 00:15:39,196 --> 00:15:41,716 Speaker 2: have a lot of graduate students. Some of the graduate 263 00:15:41,756 --> 00:15:45,236 Speaker 2: students and postdocs would see what I did, and they 264 00:15:45,276 --> 00:15:47,756 Speaker 2: wanted to start companies. So we did. I mean, you know, 265 00:15:47,796 --> 00:15:49,716 Speaker 2: we may have done work in the lab for five 266 00:15:49,796 --> 00:15:51,796 Speaker 2: or six years, and then when it got to a 267 00:15:51,796 --> 00:15:54,636 Speaker 2: certain stage, we spun it out with some people. And 268 00:15:54,836 --> 00:16:00,316 Speaker 2: other people, colleagues of mine, would see that, you know, 269 00:16:00,396 --> 00:16:01,956 Speaker 2: I had done this, and so they'd come to me 270 00:16:01,996 --> 00:16:05,196 Speaker 2: and talk to me about companies. So, you know, yeah, 271 00:16:05,236 --> 00:16:07,516 Speaker 2: we kept doing it.
I mean, to me, it's 272 00:16:07,556 --> 00:16:11,476 Speaker 2: been a great route for taking discoveries in the academic 273 00:16:11,556 --> 00:16:15,596 Speaker 2: lab and getting them out to the world. You know, 274 00:16:15,836 --> 00:16:19,996 Speaker 2: and as I mentioned, I had a hard time, maybe 275 00:16:20,076 --> 00:16:23,116 Speaker 2: given the stage of the work, getting large companies 276 00:16:23,956 --> 00:16:27,036 Speaker 2: that would do it. So we did it ourselves. 277 00:16:27,676 --> 00:16:31,836 Speaker 1: And when you started your first company, I feel like 278 00:16:31,876 --> 00:16:36,836 Speaker 1: it was much less common for professors to start companies 279 00:16:36,876 --> 00:16:40,356 Speaker 1: than it is now. I'm curious, sort of culturally, you know, 280 00:16:40,676 --> 00:16:44,836 Speaker 1: within MIT, within academia, what was that like? Did 281 00:16:44,836 --> 00:16:45,556 Speaker 1: you get pushback? 282 00:16:46,596 --> 00:16:49,836 Speaker 2: I think anytime money's involved, a lot of people will 283 00:16:49,836 --> 00:16:53,676 Speaker 2: tell you, and I think there's jealousy, you know, about it, 284 00:16:53,716 --> 00:16:56,876 Speaker 2: and people feel you shouldn't be spending your time doing that, 285 00:16:58,796 --> 00:17:02,396 Speaker 2: even at an MIT. So yeah, I ran 286 00:17:02,436 --> 00:17:05,796 Speaker 2: into problems when people were thinking about me for promotion. 287 00:17:07,516 --> 00:17:11,356 Speaker 2: You know, at one point I had a partial share 288 00:17:11,476 --> 00:17:15,356 Speaker 2: and they took that away from me. So yeah, it 289 00:17:15,396 --> 00:17:19,996 Speaker 2: was discouraging in the beginning.
In fact, I'd say when 290 00:17:20,036 --> 00:17:23,676 Speaker 2: I was in the nutrition department, a lot of people, 291 00:17:23,716 --> 00:17:26,636 Speaker 2: some people, told me that the drug delivery ideas 292 00:17:26,676 --> 00:17:28,556 Speaker 2: would never work and I should be looking for a 293 00:17:28,556 --> 00:17:29,036 Speaker 2: new job. 294 00:17:30,636 --> 00:17:35,796 Speaker 1: Do you feel like you have gained insight into 295 00:17:35,956 --> 00:17:39,636 Speaker 1: what that moment is, or particular elements of that moment, 296 00:17:39,676 --> 00:17:44,596 Speaker 1: when you take something that is basic research, academic research, 297 00:17:44,596 --> 00:17:46,436 Speaker 1: and decide, okay, this is the moment we're going to 298 00:17:46,476 --> 00:17:48,476 Speaker 1: take the leap, we're going to start a company, we're 299 00:17:48,476 --> 00:17:50,796 Speaker 1: going to try and commercialize it. How do you know? 300 00:17:51,876 --> 00:17:53,516 Speaker 2: Well, I don't think you ever know for sure, but 301 00:17:53,676 --> 00:17:57,196 Speaker 2: the kinds of high-level rules that I've used are, 302 00:17:57,596 --> 00:18:00,316 Speaker 2: generally you have what I'll call a platform technology, 303 00:18:00,356 --> 00:18:02,756 Speaker 2: meaning it's almost like a plug-and-play thing. Those 304 00:18:02,796 --> 00:18:05,396 Speaker 2: drug delivery systems are a good example, right, you could 305 00:18:05,476 --> 00:18:10,636 Speaker 2: use it for drug A, drug B, drug C. Then I 306 00:18:10,676 --> 00:18:13,596 Speaker 2: think the next thing is that you've taken it a 307 00:18:13,596 --> 00:18:19,236 Speaker 2: certain distance. Right, you have maybe animal data. You also 308 00:18:19,316 --> 00:18:23,196 Speaker 2: have a paper, in ideally a good journal, like, say, 309 00:18:23,236 --> 00:18:26,396 Speaker 2: Science or Nature.
You have a patent, or a high 310 00:18:26,556 --> 00:18:29,956 Speaker 2: likelihood of getting a patent, because you've advanced a certain distance. 311 00:18:30,636 --> 00:18:33,356 Speaker 2: And usually there are people in my lab that want 312 00:18:33,396 --> 00:18:37,116 Speaker 2: to be involved in it. So those 313 00:18:37,156 --> 00:18:42,756 Speaker 2: are the kinds of things that inform my thinking 314 00:18:42,876 --> 00:18:43,356 Speaker 2: about it. 315 00:18:43,996 --> 00:18:49,116 Speaker 1: So what is something right now on the basic research 316 00:18:49,156 --> 00:18:51,476 Speaker 1: side that you're excited about? What is a big idea 317 00:18:51,556 --> 00:18:53,796 Speaker 1: that is early that you think holds a lot of promise? 318 00:18:54,636 --> 00:18:57,796 Speaker 2: Well, I think the tissue engineering work we're doing holds 319 00:18:57,796 --> 00:18:59,996 Speaker 2: a lot of promise. I mean, an example of what we're 320 00:18:59,996 --> 00:19:03,716 Speaker 2: doing is we're working with Li-Huei Tsai, who's head 321 00:19:03,716 --> 00:19:06,756 Speaker 2: of MIT's Picower Institute, and I have a wonderful 322 00:19:06,836 --> 00:19:10,156 Speaker 2: postdoc, Alice Stanton. You know, we're actually creating a 323 00:19:10,156 --> 00:19:13,676 Speaker 2: brain on a chip. It's not been published yet, but 324 00:19:13,756 --> 00:19:18,476 Speaker 2: she's been able to convert, you know, like, say we 325 00:19:18,516 --> 00:19:21,836 Speaker 2: could take your cells and convert them first to iPS cells 326 00:19:21,876 --> 00:19:24,636 Speaker 2: and then convert each of those, depending on what we do, 327 00:19:24,676 --> 00:19:27,436 Speaker 2: to a different brain cell type, six different cell types. 328 00:19:27,476 --> 00:19:29,996 Speaker 2: She's found a matrix that she can put them on 329 00:19:30,036 --> 00:19:35,036 Speaker 2: and that really makes them grow and function.
And you know, 330 00:19:35,396 --> 00:19:38,076 Speaker 2: so that's something I'm excited about. 331 00:19:38,276 --> 00:19:41,556 Speaker 1: And what, when you say put it on a chip, 332 00:19:41,596 --> 00:19:43,036 Speaker 1: what does that mean? And then what do you do 333 00:19:43,156 --> 00:19:44,436 Speaker 1: with my brain on a chip? 334 00:19:44,836 --> 00:19:47,596 Speaker 2: Yeah, well, what I mean by on a chip is it's 335 00:19:47,676 --> 00:19:49,876 Speaker 2: in vitro. It's not in an animal, it's not in 336 00:19:49,916 --> 00:19:53,716 Speaker 2: a person. What it means is that you could, rather, 337 00:19:53,876 --> 00:19:55,196 Speaker 2: like, you could think about if you were going to 338 00:19:55,316 --> 00:19:57,796 Speaker 2: experiment on a person. I mean, of course there's a 339 00:19:57,836 --> 00:20:00,076 Speaker 2: lot you wouldn't be able to find out anyhow because 340 00:20:00,116 --> 00:20:02,516 Speaker 2: we'd have to take you apart, and we're obviously not 341 00:20:02,556 --> 00:20:03,076 Speaker 2: going to do that. 342 00:20:03,076 --> 00:20:03,836 Speaker 1: I appreciate that. 343 00:20:03,996 --> 00:20:07,236 Speaker 2: Yeah. And with animals, you know, it's a little bit 344 00:20:07,236 --> 00:20:11,876 Speaker 2: similar here. So what you do with it is you 345 00:20:11,916 --> 00:20:16,716 Speaker 2: could literally test thousands and 346 00:20:16,756 --> 00:20:20,316 Speaker 2: thousands of experiments and get readouts on them. So it 347 00:20:20,396 --> 00:20:25,316 Speaker 2: might someday reduce animal testing, hopefully also reduce human testing, 348 00:20:25,876 --> 00:20:28,796 Speaker 2: and may greatly speed up drug discovery.
I mean there's 349 00:20:28,796 --> 00:20:31,196 Speaker 2: so many drugs that you'd like to be able to 350 00:20:31,196 --> 00:20:34,076 Speaker 2: have for brain disease, right, like for Alzheimer's, for Lou 351 00:20:34,076 --> 00:20:37,436 Speaker 2: Gehrig's disease, ALS, for Parkinson's. So I hope so. 352 00:20:37,516 --> 00:20:41,436 Speaker 1: So brain disease has been famously difficult to treat with drugs, right. 353 00:20:41,476 --> 00:20:45,716 Speaker 1: It's a very, very hard set of diseases, right? Because. 354 00:20:45,476 --> 00:20:47,956 Speaker 2: We don't understand it well enough and the tests are 355 00:20:48,076 --> 00:20:51,676 Speaker 2: very, very hard to do. So something like this, if 356 00:20:51,716 --> 00:20:55,156 Speaker 2: it truly ends up working well, you know, could change 357 00:20:55,156 --> 00:20:58,676 Speaker 2: that someday. But that's an example of something I'm excited 358 00:20:58,716 --> 00:20:59,476 Speaker 2: about. 359 00:20:59,636 --> 00:21:02,236 Speaker 1: As you said, it's like it's a platform, right? Presumably 360 00:21:02,236 --> 00:21:04,316 Speaker 1: if you could do brain cells, you could do different 361 00:21:04,396 --> 00:21:06,236 Speaker 1: kinds of cells. It could be a way to do 362 00:21:06,316 --> 00:21:07,116 Speaker 1: lots of testing. 363 00:21:07,636 --> 00:21:10,556 Speaker 2: Well, yeah, in this case we 364 00:21:10,596 --> 00:21:13,956 Speaker 2: have six different brain cell types in vitro. We're 365 00:21:14,636 --> 00:21:16,996 Speaker 2: working on other cell types too. We have a 366 00:21:17,036 --> 00:21:19,596 Speaker 2: gastrointestinal tract on a chip. We've had a 367 00:21:19,596 --> 00:21:22,236 Speaker 2: heart on a chip. And of course it's not just 368 00:21:22,276 --> 00:21:24,916 Speaker 2: putting them on a chip.
Someday you could use it for 369 00:21:25,276 --> 00:21:28,716 Speaker 2: repairing tissues, you know. I mean, in fact, 370 00:21:28,996 --> 00:21:31,996 Speaker 2: Laura Niklason, one of my former postdocs, runs a 371 00:21:31,996 --> 00:21:35,436 Speaker 2: company that's making new blood vessels that have been used on 372 00:21:35,796 --> 00:21:39,876 Speaker 2: patients in Ukraine. Others have made artificial skin 373 00:21:39,996 --> 00:21:43,316 Speaker 2: for burn victims or patients with diabetic skin ulcers, and 374 00:21:43,396 --> 00:21:46,316 Speaker 2: people are trying to make new cartilage, all kinds of tissues. 375 00:21:46,596 --> 00:21:50,356 Speaker 2: So yeah, that's 376 00:21:50,356 --> 00:21:51,556 Speaker 2: an exciting area. 377 00:21:51,916 --> 00:21:54,396 Speaker 1: And that tissue engineering side, I mean, does that go 378 00:21:54,556 --> 00:21:57,756 Speaker 1: back to a kind of similar origin story? I 379 00:21:57,796 --> 00:22:00,316 Speaker 1: know there was sort of early tissue engineering work that 380 00:22:00,356 --> 00:22:02,036 Speaker 1: you did as well. What was that work? 381 00:22:02,516 --> 00:22:04,956 Speaker 2: Yeah, well, one of the people I got 382 00:22:04,956 --> 00:22:07,476 Speaker 2: to meet at Children's Hospital is Jay Vacanti. He was, 383 00:22:07,516 --> 00:22:11,756 Speaker 2: and is, a pediatric surgeon, and he was treating patients with 384 00:22:11,836 --> 00:22:14,716 Speaker 2: liver failure, and one day he came to see me and said, Bob, 385 00:22:15,276 --> 00:22:18,436 Speaker 2: you know I do all these transplants, would it ever 386 00:22:18,516 --> 00:22:20,996 Speaker 2: be possible to make a liver from scratch? And he 387 00:22:21,036 --> 00:22:24,836 Speaker 2: and I brainstormed and came up with a way that 388 00:22:24,956 --> 00:22:30,516 Speaker 2: we hoped might do that with polymer scaffolds and cells.
389 00:22:31,116 --> 00:22:34,876 Speaker 2: And so we've continued working together and separately in 390 00:22:34,956 --> 00:22:40,716 Speaker 2: different ways to make this happen. But that started probably 391 00:22:40,716 --> 00:22:44,196 Speaker 2: over forty years ago, and that certainly was the basis 392 00:22:44,196 --> 00:22:45,236 Speaker 2: for a lot of these things. 393 00:22:45,476 --> 00:22:48,876 Speaker 1: So we can't synthesize livers yet. But what are some 394 00:22:48,956 --> 00:22:51,836 Speaker 1: of the clinical applications that have been found for some 395 00:22:51,916 --> 00:22:53,196 Speaker 1: of the research you did there? 396 00:22:53,436 --> 00:22:56,796 Speaker 2: Well, you can make artificial skin for burn victims. It 397 00:22:57,276 --> 00:22:59,836 Speaker 2: looks like we'll be able to make blood vessels. I 398 00:22:59,836 --> 00:23:02,756 Speaker 2: mean there have been clinical trials on a variety of things, 399 00:23:02,916 --> 00:23:08,196 Speaker 2: ranging from spinal cord repair to repairing hearing loss, you know, 400 00:23:08,356 --> 00:23:11,996 Speaker 2: a lot of different things. But I think ultimately it's unlimited. 401 00:23:12,316 --> 00:23:15,876 Speaker 2: You know, you could theoretically use approaches like this if 402 00:23:15,916 --> 00:23:19,116 Speaker 2: you understand the right cells, the right signals, the right biology, 403 00:23:19,676 --> 00:23:22,876 Speaker 2: and the right engineering. I don't see that there's necessarily 404 00:23:22,876 --> 00:23:25,476 Speaker 2: any limit to what you could use it for, but 405 00:23:25,516 --> 00:23:26,796 Speaker 2: we need to understand it more. 406 00:23:30,436 --> 00:23:32,756 Speaker 1: We'll be back in a minute with the Lightning Round. 407 00:23:34,116 --> 00:23:45,956 Speaker 1: I want to finish. We're almost done. I 408 00:23:45,956 --> 00:23:48,076 Speaker 1: appreciate your time.
I want to finish with the Lightning Round, 409 00:23:48,076 --> 00:23:51,836 Speaker 1: which is just some quicker, kind of more random, 410 00:23:51,956 --> 00:23:57,716 Speaker 1: maybe occasionally silly questions. Who is one engineer from history 411 00:23:57,756 --> 00:23:59,916 Speaker 1: who you wish more people knew about? 412 00:24:01,036 --> 00:24:04,996 Speaker 2: Boy. Well, I suppose a lot of people don't realize 413 00:24:04,996 --> 00:24:08,316 Speaker 2: maybe that Leonardo da Vinci was a very good engineer. 414 00:24:09,356 --> 00:24:12,636 Speaker 1: What are some of your favorite engineering works of Leonardo's? 415 00:24:13,716 --> 00:24:16,236 Speaker 2: Well, I mean he did all kinds of things. He 416 00:24:16,276 --> 00:24:19,596 Speaker 2: looked at hearts, he looked at, you know, 417 00:24:20,196 --> 00:24:22,996 Speaker 2: water flow. I mean he did a lot, 418 00:24:23,076 --> 00:24:23,676 Speaker 2: not just art. 419 00:24:25,716 --> 00:24:27,396 Speaker 1: Who is the best teacher you ever had? 420 00:24:31,636 --> 00:24:33,436 Speaker 2: Maybe George Sieli at Cornell? 421 00:24:34,516 --> 00:24:36,356 Speaker 1: What about him made him such a good teacher? 422 00:24:36,996 --> 00:24:40,236 Speaker 2: Well, first, he cared a lot and he explained things well. 423 00:24:40,636 --> 00:24:43,116 Speaker 2: But I think caring a lot, that means a lot. 424 00:24:44,836 --> 00:24:47,316 Speaker 1: You're also a magician, and I'm curious if there 425 00:24:47,316 --> 00:24:49,716 Speaker 1: are any skills from close-up magic that have been 426 00:24:49,716 --> 00:24:51,796 Speaker 1: helpful to you in your day job. 427 00:24:52,276 --> 00:24:54,596 Speaker 2: You know, the one thing that does make a difference 428 00:24:55,156 --> 00:25:01,756 Speaker 2: with magic is presentation. So, you know, 429 00:25:01,916 --> 00:25:03,996 Speaker 2: here's what I learned in magic.
If I make 430 00:25:03,996 --> 00:25:07,436 Speaker 2: a mistake, sometimes of course you make it deliberately. But 431 00:25:08,516 --> 00:25:13,276 Speaker 2: if I made a mistake, you know, it's part of 432 00:25:13,316 --> 00:25:16,316 Speaker 2: the show. You don't get upset, you just, you know, 433 00:25:16,436 --> 00:25:19,556 Speaker 2: go with the flow. And what 434 00:25:19,596 --> 00:25:22,236 Speaker 2: I'd say is if I make a mistake in a talk, 435 00:25:22,716 --> 00:25:25,276 Speaker 2: same thing. You know, it's like you don't get flustered. 436 00:25:25,316 --> 00:25:28,676 Speaker 2: You just keep going, and that does 437 00:25:28,756 --> 00:25:29,516 Speaker 2: make a difference. 438 00:25:30,316 --> 00:25:34,516 Speaker 1: So your research also helped to create, as I understand it, 439 00:25:34,556 --> 00:25:38,836 Speaker 1: a line of hair care products called Living Proof. Jennifer Aniston, 440 00:25:39,476 --> 00:25:42,316 Speaker 1: who I will say had great hair before the company started, 441 00:25:42,436 --> 00:25:46,556 Speaker 1: is involved in that company. And so I'm curious, what's 442 00:25:46,556 --> 00:25:49,796 Speaker 1: your favorite Living Proof product? And are you using it 443 00:25:49,876 --> 00:25:50,316 Speaker 1: right now? 444 00:25:51,116 --> 00:25:54,116 Speaker 2: Well, so I would say, you know, one of the 445 00:25:54,156 --> 00:25:57,916 Speaker 2: Living Proof products is called PhD, which stands for Perfect 446 00:25:57,916 --> 00:25:58,396 Speaker 2: hair Day. 447 00:25:58,676 --> 00:26:02,716 Speaker 1: Oh, Perfect hair Day? Okay, are you using it right now? 448 00:26:02,996 --> 00:26:06,316 Speaker 2: So, I use the shampoos, okay, but gee, my wife and 449 00:26:06,356 --> 00:26:10,516 Speaker 2: my daughter and lots of people use lots of the products.
450 00:26:11,116 --> 00:26:14,076 Speaker 2: But I basically use the shampoo. Every so 451 00:26:14,196 --> 00:26:17,236 Speaker 2: often, when my hair gets longer, I have, you know, 452 00:26:17,316 --> 00:26:19,516 Speaker 2: just a spray that I put on that doesn't make 453 00:26:19,516 --> 00:26:20,516 Speaker 2: it frizz up so much. 454 00:26:22,036 --> 00:26:24,836 Speaker 1: Great. Is there anything else you think we should talk about? 455 00:26:27,236 --> 00:26:29,596 Speaker 2: Well, the only other thing I'd say that we've done 456 00:26:29,636 --> 00:26:34,516 Speaker 2: that we really didn't touch on is, you know, we're 457 00:26:34,516 --> 00:26:36,436 Speaker 2: doing a lot of work with the Gates Foundation to 458 00:26:36,476 --> 00:26:39,316 Speaker 2: help the developing world, you know, and I'm excited about 459 00:26:39,316 --> 00:26:41,956 Speaker 2: that as well. I mean, they've been a big supporter 460 00:26:42,116 --> 00:26:45,956 Speaker 2: of our lab and they've done a terrific job in 461 00:26:46,076 --> 00:26:48,956 Speaker 2: terms of helping, and I think the work is 462 00:26:49,316 --> 00:26:53,596 Speaker 2: leading to new kinds of nutrition, new kinds of oral 463 00:26:53,676 --> 00:26:56,956 Speaker 2: delivery that could last much longer than just a day. 464 00:26:57,556 --> 00:27:00,196 Speaker 2: It's also leading to what we call 465 00:27:00,196 --> 00:27:02,436 Speaker 2: self-boosting injections, so you wouldn't have to come back for 466 00:27:02,476 --> 00:27:04,716 Speaker 2: a second shot. So I think it's leading to a 467 00:27:04,756 --> 00:27:06,916 Speaker 2: lot of things that I hope will someday help a 468 00:27:06,916 --> 00:27:10,916 Speaker 2: lot of people, not only in the developing world, 469 00:27:10,996 --> 00:27:13,236 Speaker 2: but everyone in the world, period.
470 00:27:14,356 --> 00:27:17,716 Speaker 1: Of those technologies that you just listed, is any one 471 00:27:17,756 --> 00:27:22,156 Speaker 1: of them particularly, you know, farther along in development? 472 00:27:21,836 --> 00:27:25,316 Speaker 2: Well, several of them are already. I mean the pills 473 00:27:25,356 --> 00:27:28,756 Speaker 2: that you can swallow orally that last for 474 00:27:28,796 --> 00:27:32,436 Speaker 2: a week or a month, they're in phase three clinical trials. 475 00:27:32,476 --> 00:27:35,556 Speaker 2: There's a company, Lyndra, that Gio Traverso and I started 476 00:27:36,316 --> 00:27:37,956 Speaker 2: that's probably the most advanced. 477 00:27:38,476 --> 00:27:41,396 Speaker 1: Is that for antimalarials, or what is the first 478 00:27:41,396 --> 00:27:42,236 Speaker 1: application there? 479 00:27:42,276 --> 00:27:46,076 Speaker 2: The most advanced application is schizophrenia, in a phase three trial. 480 00:27:46,276 --> 00:27:49,316 Speaker 2: It is in clinical trials for malaria too, but 481 00:27:49,436 --> 00:27:50,796 Speaker 2: that's like at phase one. 482 00:27:50,836 --> 00:27:54,036 Speaker 1: So presumably that would be a big deal because drug 483 00:27:54,076 --> 00:27:56,796 Speaker 1: adherence is always a problem. People very often don't take 484 00:27:56,836 --> 00:27:59,476 Speaker 1: their drugs. Presumably people who are mentally ill might have 485 00:27:59,516 --> 00:28:01,676 Speaker 1: more trouble with adherence. So if you could have a 486 00:28:01,716 --> 00:28:03,596 Speaker 1: pill once a week instead of every day, that would 487 00:28:03,636 --> 00:28:04,876 Speaker 1: be a very large improvement. 488 00:28:04,676 --> 00:28:07,236 Speaker 2: Yeah, and also once a month.
You know, we've 489 00:28:07,236 --> 00:28:09,036 Speaker 2: been working on, you know, like a once-a-month 490 00:28:09,156 --> 00:28:12,596 Speaker 2: birth control pill, and yeah, so all those things, 491 00:28:12,836 --> 00:28:14,196 Speaker 2: you know, are moving forward. 492 00:28:17,996 --> 00:28:22,876 Speaker 1: Robert Langer is an institute professor at MIT. Today's show 493 00:28:23,036 --> 00:28:26,196 Speaker 1: was produced by Gabriel Hunter Chang. It was edited by 494 00:28:26,276 --> 00:28:30,036 Speaker 1: Lydia Jean Kott and engineered by Sarah Bruguere. You can 495 00:28:30,076 --> 00:28:33,956 Speaker 1: email us at problem at Pushkin dot fm. I'm Jacob 496 00:28:33,956 --> 00:28:36,636 Speaker 1: Goldstein and we'll be back next week with another episode 497 00:28:36,636 --> 00:28:47,676 Speaker 1: of What's Your Problem.