Speaker 1: Hey there, Odd Lots listeners. It is that time of the year again. We are going to be doing a call-in show on the podcast. You can ask us any of your burning questions.

Speaker 2: That's right. You want to ask us about finance, markets, and economics? Go for it. You want to ask us about the year in podcasting? Go for it. You want to ask about where Tracy is with raising chickens on her farm in Connecticut? Go for it. This is your chance to ask us anything.

Speaker 3: Joe's favorite cut of steak.

Speaker 2: Right, my favorite cut of steak, my favorite Chinese restaurant in the East Village. It's all fair game, all right.

Speaker 1: All you have to do is send a voice memo with your question, your name, your age, and your location to oddlots at Bloomberg dot net.

Speaker 2: Deadline to submit is December seventeenth, so get them in as soon as you can.

Speaker 1: We're looking forward to hearing what you have to ask, and yeah, that's coming up.

Speaker 4: Bloomberg Audio Studios, Podcasts, Radio News.

Speaker 1: Hello and welcome to another episode of the Odd Lots podcast. I'm Tracy Alloway.

Speaker 2: And I'm Joe Weisenthal.

Speaker 3: Joe, we were doing a Q and A this morning.

Speaker 2: That's right.

Speaker 3: It was a lot of fun doing a live Q and A.

Speaker 1: And someone asked a question about whether or not we're going to do more healthcare episodes, and no, we don't, and there's a reason for that. I personally am incredibly intimidated by the US healthcare system. I do not understand it at all. It is just a complete mystery to me. But I was very happy to say in response to that question that this same day, that's right, we're actually recording a healthcare episode with someone that we've wanted to speak to for a long time.
Speaker 2: I'm the same way, in the sense that I really do not know much about how the healthcare system works. I don't even know where to begin asking the right questions. There is a thing you do: you have to just start with a random episode, and that gives you the germs of the next question, the next episode, the next episode. But it seems so big and sprawling, et cetera, that what is the first question to ask? So we just have to plunge right in and pick one, which we're doing now, and then maybe that will lead to the string of healthcare episodes, which we should have done a long time ago.

Speaker 1: That's exactly right. There's also a lot of new stuff happening in healthcare at the moment, and we recorded an episode on Chinese biotechs a little while ago that was incredibly fascinating. I'm very curious to see what's going on on the US side of biotech and investing, and we do have another episode planned that's sort of tangentially related to that. But clearly there's a lot to talk about. The other very Odd Lots-y thing with this guest is we like people who have interesting career histories, right?

Speaker 2: That's right. How they got to where they are today is often a very interesting question. You know, I have nothing against people who just took the normal path, people who just sort of, you know, went to college and then got their MBA.

Speaker 3: And okay, you forgive them? I forgive them.

Speaker 2: That's totally fine. But it's also interesting to hear about the people who maybe walked in through the side door, so to speak.

Speaker 1: Absolutely. So we do, in fact, have the perfect guest. We have someone who has a lot of thoughts on US healthcare, who is also a biotech investor, and also formerly the lead singer of Chester French. So, D.A. Wallach, welcome to the show. Thanks so much for coming on.

Speaker 5: Thanks for having me, guys. I'm an Odd Lots junkie.
Speaker 5: So this is like going to the Grammys.

Speaker 2: Amazing, thank you. Have you ever won a Grammy? No, don't... sorry, sorry, I shouldn't have asked.

Speaker 5: "Not yet" is the right answer.

Speaker 3: Not yet, not yet, not yet.

Speaker 1: I guess my first question should be: can you talk to us about the through line between being a musician and healthcare and biotech, and how you got into the space? Because, you know, it's not a natural transition, to say the least.

Speaker 5: Yeah, well, I'll tell you how I ended up doing this, and then I'll try to connect them theoretically in some way. It might be a little tenuous. I've basically had three careers so far in my limited adult life. I was a professional rock musician with the band that you mentioned for several years, and then I kind of slipped into the venture capital world when I invested in Spotify about thirteen years ago. That was pretty much the only company in the private markets I was well positioned to understand as a musician, and through the success of that, I got turned on to how exciting venture capital was. I started doing other types of investments across different industries, was involved in SpaceX and Ripple and a bunch of other interesting startups, and then ultimately a guy I knew started an early-stage healthcare company. It was a telemedicine startup called Doctor on Demand, and telemedicine at the time was not a hot topic, because this was pre-COVID, so we still primarily went to the doctor in person. And when I made that investment, I started to learn more and more about our healthcare system and was just blown away by how screwed up and stupid it was. And that eventually evolved into learning more about biotechnology and the other sub-sectors of healthcare that are critical to medicine, and it's ended up being what I do. In terms of the connection between music and any of this stuff, there are a couple of ways I can think about it.
One is, I tell people now, my job is like being a record producer for scientists, so there's a little bit of a parallel there. But the other is that I think there's a unique challenge in music of combining art and commerce, and in healthcare there's a similar, parallel challenge, which is: how do you combine medicine and capitalism, which don't naturally go together very well?

Speaker 2: The producer analogy makes a ton of sense. And you know, there are probably a lot of musicians who are really brilliant, they're really great musicians, but for whatever reason the lightning doesn't strike where they are, or doesn't strike nearby, and they don't take off. There are probably many brilliant scientists, et cetera, but the path from brilliant science to commercial blockbuster can often, I assume, be tricky or dispiriting in many ways. Biotech specifically, of all the things in investing, strikes me as this whole different world from the rest of investing. You know, when I think of a software company, it's like, oh, okay, well, they've accumulated these clients and their churn is low, et cetera. This seems like a company that has traction and is going to grow. When it comes to biotech, it's like, okay, here's some patent on a sequence, and maybe ten years from now it'll get approved as something that'll be a therapy. It seems so much harder to figure out. Like, what are the heuristics that one would use to establish that this science is likely going to turn into a business?

Speaker 5: Oh, that's absolutely true. It's like a completely different paradigm.
As an investor, I think the typical biotech company is like a bag of options, and each one of the drugs that the company is working on could, in success, be worth billions of dollars. But that's often ten years away, minimum, and so you're trying to price things based on their ultimate potential scale times their probability of succeeding. And unfortunately, the base rates in terms of probability of success are very low. So if you take small molecules, which is one major area of drugs, the base case is like a five percent probability of success from the original idea to an FDA approval and a marketed drug. Now, you get to a higher sort of prior probability with antibodies, or so-called biologics, other classes of drugs that are intrinsically more likely to work than small molecules. But still, in every case you're dealing with very low probabilities of success, and the entire challenge as a biotech investor is how do you manage those low-probability events and build portfolios that are still likely to make money despite the fact that each individual project is relatively unlikely to work. I'd say in tech there's this well-described kind of power-law distribution of winners and losers, which is to say, a very small number of companies make all the money and pay for the huge number of losers. In biotech that's still true to a degree, but the magnitudes of the winners are lower, and so a really good biotech investor probably has a lower, sorry, a higher batting average than the typical tech investor, but the wins are not as big.
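The pricing logic here (potential scale times probability of success, pooled across many long-shot bets) can be made concrete with a minimal Python sketch. Every number below is hypothetical: the payoffs, the timelines, the 30x winner's multiple, and the ten percent discount rate are chosen only to illustrate the shape of the problem, not taken from the episode.

```python
import random

def pipeline_value(programs, discount_rate=0.10):
    """Risk-adjusted present value of a 'bag of options': for each program,
    probability of success times payoff, discounted from years-to-market."""
    return sum(p * payoff / (1 + discount_rate) ** years
               for p, payoff, years in programs)

# Hypothetical pipeline: (P(success), payoff in $bn if approved, years to approval)
pipeline = [
    (0.05, 3.0, 10),  # small molecule, ~5% base rate per the discussion
    (0.12, 2.0, 8),   # biologic, assumed higher prior probability
]
print(f"Risk-adjusted PV of pipeline: ${pipeline_value(pipeline):.2f}bn")

def prob_break_even(n_bets, p=0.05, multiple=30.0, trials=20_000):
    """Monte Carlo: a portfolio of n equal-sized bets, each winning with
    probability p and returning `multiple` times its cost; losers return 0.
    Returns the fraction of simulated portfolios that at least break even."""
    wins_needed = n_bets / multiple  # winners needed to recoup the whole portfolio
    hits = sum(
        sum(random.random() < p for _ in range(n_bets)) >= wins_needed
        for _ in range(trials)
    )
    return hits / trials

for n in (5, 20, 90):
    print(f"{n:>2} bets -> P(portfolio breaks even) ~ {prob_break_even(n):.2f}")
```

Because each bet's expected value in this sketch is above one (30 times 0.05 is 1.5), adding more independent bets raises the odds that the portfolio as a whole breaks even; with an expected value below one, diversification would do the opposite. That is the portfolio-construction problem described above.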
Speaker 1: So one thing I'm really curious about is how you source potential investments, and how you find, to use your analogy of the record producer, talent in the space, or how the talent kind of finds you, and whether or not it's different from, again, the sort of software or tech space that we usually talk about when it comes to venture capital.

Speaker 5: You know, when I started doing venture investing, it was, like I said, twelve or thirteen years ago. It was obviously a well-established part of the capital markets, but, you know, I cold-emailed Brian Armstrong from Coinbase and was meeting with him two days later. And it's hard to overstate how much money has rushed in over the past decades. So what was an established but still kind of marginal part of the capital markets is now all anyone thinks or talks about. And so in biotech, what I found getting into this area was that it was more like that venture market I had encountered. There was a scarcity of capital relative to the caliber of ideas that were out there, and so I'd say deal sourcing is much easier in a sense, because there's less money chasing a huge number of good ideas. And those ideas by and large do come out of our university and research infrastructure here in America. The same is also true in other parts of the world, in Europe, China, India, and so forth. But it's really the translation of those academic concepts into products that could make money that is the challenge. That's the so-called valley of death that people sometimes talk about in our industry. There is just an immense number of cool ideas if you go into any university in our country, but such a small number of them is ever going to cross that chasm. And part of that is that the expertise and the personnel required to do that translational work is not the same expertise that is required to do the inventing in the first place. So that is really where the large pharmaceutical companies have a specialized expertise, and they train people in this translational work: how do you go from early science to real products?
Speaker 2: When I go to a typical venture capitalist's website, or I see their Twitter bio or something like that, it'll say, like, "we back great founders." And I'm like, thanks, that's very helpful, because that distinguishes you from the venture capitalists who back crappy founders. So I'm glad I'm gonna invest with you. Instead: what's the biotech equivalent? What's the cliche in your industry that every VC says that ostensibly distinguishes them from all the others?

Speaker 5: Well, I'm not sure what the VCs say. I mean, they are kind of commoditized, in the sense that most of the firms look pretty similar. They employ thirty PhDs and physicians, and the value of those people is that they can make sense of the information that you have to process to invest intelligently in this space. In terms of what distinguishes the founders that they like to look at, I'd say, again, it's kind of the inverse of what you find in tech. There's a real premium on, quote, gray hair in the biotech industry, because the only way to learn this stuff is to do it over and over again and to have had a lot of failures. And if you think about a software company, the tropes you are familiar with are, you know, fail fast, pivot, right? You know, like, you launch something, it doesn't work, you tweak the product design, you go into a different market. You can adapt very readily to the market. In biotech, if you choose to embark upon a clinical program, you're in for thirty or forty million bucks, and it's not an easy door to walk back out of. And so there's a real premium on people with experience who have done it multiple times. That is a little bit at odds in recent years with a movement that people have, I think awkwardly, dubbed "tech bio" instead of biotech.
And really these are Silicon Valley tech investors, not totally unlike myself, who have gotten into biotech, and they think that what's about to change is it's going to go the way of the tech industry, and the next big companies are going to be started by really clever twenty-one-year-olds coming out of Stanford. And that hypothesis people have been testing now for a few years. I'd say it's a little too early to issue a verdict, but that's never really been our theory.

Speaker 1: Is that hypothesis just predicated on AI coming in and making, you know, drug development easier?

Speaker 3: Is that all it is?

Speaker 5: There's a lot of that. I'd say there are two parts of it. One of them is maybe more substantive than that, and this is a little nuanced, but I know Odd Lots people like nuance. One of the big transformations that really gave rise to the biotech industry, and when I use that term, biotech, I'm distinguishing it from big pharma; biotech really just means small drug companies, many of them public. What really gave rise to that industry was that the big pharmas, at the behest of Wall Street, deprioritized early-stage research, because Wall Street said: you're wasting a lot of money on this really risky early-stage discovery work. What we would rather you did was just let all these crazy guys like D.A. finance startups, and once they work, just buy them. You know, you're going to pay a higher price, but you won't be burning all this money on early stuff. What that led to was an exodus of very specialized technical experts from the pharma companies, and it created the so-called CRO, or contract research organization, ecosystem. So now, as a consequence of that, for the past twenty years, you have had a very proficient environment full of contract organizations that you can hire, as a little company, to outsource a lot of work that you couldn't in the past.
So the best analogy to tech would be sort of like virtual servers or cloud infrastructure. Like, you know, to have a startup, you used to have all these servers in your office, and then at some point you didn't need that, so the cost of new company formation went way down. So part of the argument for younger, more agile founders has been: look, we've got this whole new kind of infrastructure through which they can build companies in a really agile way. The other argument, exactly your question, is around AI, and that is basically: look, these old people don't understand AI. Let's get some young Silicon Valley computer-science-y types to do this, and they're gonna show them how it's done.

Speaker 2: I feel like that's probably a phenomenon that goes beyond biotech, where there's this fantasy, and maybe in some cases it's even correct, but there is this fantasy that every industry out there must be dominated by old dinosaurs who don't know how to use tech and who have been doing something the same way forever. And so, it's twenty twenty-five, it must be out of date by now, and they haven't figured this out.

Speaker 3: And if we could just, cough cough, journalism.

Speaker 2: Yeah, right. If we could just hire whiz kids, then we could reinvent the industry from first principles and just do a much better job than the legacy players. And I think, whether it's healthcare or whether it's the industrial stuff that we see Silicon Valley getting excited about right now, it just feels like the default assumption must be that the veterans are doing something wrong, and with pure brain power we could figure out what that thing is.

Speaker 5: I think that is a reasonable characterization of what people say today in a lot of different places, and I don't think it's true in my sector. But as with every conversation about AI, the challenge is balancing two ideas that can be true at the same time but seem contradictory.
And one is that this stuff is amazing. And it is; particularly in life sciences, it is responsible for some true breakthroughs, like the breakthrough that won Demis Hassabis at DeepMind the Nobel Prize last year with AlphaFold, which was this amazing discovery they made that, using machine learning models, you could solve a problem that had gone unsolved for decades, which was: can you predict, from the sequence of a protein's amino acids, what three-dimensional shape the protein is going to take in a physical environment? And I just threw around a bunch of terms of art, but this is fundamental to drug development and drug discovery. So it's like, on the one hand, you can't deny these breakthroughs that we're experiencing. You can't deny that when you talk to Gemini, it's staggering what this thing can do. I mean, I'm sitting there all day having it teach me about asset pricing models or whatever else I'm interested in. But at the same time, the religious movement that is powering all of the investment and a lot of the entrepreneurship here, across industries, is full of hot air and is making claims that are preposterous unless you are a zealot.

Speaker 2: Just real quickly: if we had been having this conversation a month ago, would you have said Gemini or would you have said ChatGPT? Because I switched from ChatGPT to Gemini in the last month, and I'm just curious what you would have said a month ago.

Speaker 5: A month ago, I was using all of them. Now I'm only using Gemini.

Speaker 2: That's interesting. All right, good data point.

Speaker 1: Okay, talk to us about the choke points when it comes to new drug development. Because I imagine, okay, maybe AI machine learning can speed up some of the research or discovery process, but even after that, you have to go through these really long clinical trials that in some cases take decades. What are the major, I guess, stumbling blocks to getting something to the market?
Speaker 5: Your question held the answer. The process of taking a drug from idea to the market you can think of as a funnel, just to use a visual analogy. Into the top of the funnel go all the millions of ideas that people have, and then, as you go down the funnel, you are spending progressively more and more money to prove two things. The first is that the drug is safe and won't harm or kill people, and the second is that the drug works and actually modifies the disease that you're trying to treat. And the tragedy of our moment is that the only way to figure out if drugs are safe and effective is to try them in human beings, living, breathing human beings, and that is extraordinarily time-consuming and incredibly expensive financially. So I wish for the day when AI is able to fully simulate an accurate human in the computer and we don't need to do clinical trials on real people. But until that moment, the vast majority of the cost and expense and time that is involved in drug discovery remains with us. So most of the AI technologies that people are excited about really would have the effect of putting more good ideas into the top of the funnel. But unfortunately, that doesn't solve a problem that we have. We already are drowning in good ideas, and the issue is exactly the choke point or bottleneck that you're referring to.

Speaker 2: There are actually two questions here. First of all, is there low-hanging fruit on the regulatory side to accelerate that process? People like to say, oh, the FDA must be super slow and do things one way, and we could speed this up. I don't know.
Is there somewhere along the process where, from a regulatory standpoint or some other thing, either the costs or the timelines could shrink? Or is it mostly still just the reality of: we have to test these things on humans, and that's costly and it takes time?

Speaker 5: Well, we don't need to do anything. We could have no FDA, and anyone who has a good drug idea just launches it commercially, and if some people die from that, or it doesn't do anything, that's fine. That's kind of like the supplement industry, by the way, and the way we deal with it. Milton Friedman famously thought that the FDA should only assess the safety of drugs, and if a drug was proven safe, put it on the market and let the market dictate whether people determine they should pay for it, based on their lived experience of whether it works or not. Now, I just personally prefer to live in a world where, if I've got something that's going wrong, I can more or less trust that the product my doctor gives me has been proven safe and effective. And that reflects that we have today a pretty high bar for approving drugs. But we could certainly lower that bar. We could change the type of data that the FDA requires. And that's what's happening in China, by the way. I know you mentioned this other episode you did with my friend Tim. In China, the regulatory environment has been moving pretty rapidly, and they've done that deliberately because they want to be more productive. They want to approve more drugs, and they're trying to strike that balance between being prolific and holding things to a high standard at the same time. So, you know, we'll see.

Speaker 2: And I just want to pick up on one other thing you said, because I think it seems important. Someone like Sam Altman, when he talks about the promise of AI, a lot of it is like, oh, we could find the next drug that cures cancer.
In the meantime, we're going to make this sort of slot machine that makes weird videos, et cetera, but really we're trying to find these wonder drugs in the long term. But from what it sounds like you said, candidates are not where the shortage is. The issue is not that we lack a sufficient number of sufficiently promising molecule combinations. The scarcity is not at that point.

Speaker 5: That's my view. I mean, I'll steelman the other argument. The other argument would be: well, look, D.A., you said ten minutes ago that these drugs have a five percent probability of working from the outset. You know, if we had better predictive models that told us certain candidates were much more likely to work than others, wouldn't that be great? And my rejoinder to that is yes, but how would we know that we've done that? Meaning, if the three of us tomorrow invented a black box that produced drug candidate concepts, and we were certain that our model doubled the prior probability from five percent to ten percent, that would be a truly revolutionary innovation on our part. But how many candidates from that model would we need to take all the way to an approval before we had statistically demonstrated that we had in fact increased the rate of success? So people may have already cracked that code. You know, Google may have already cracked that code. Sam Altman may have cracked that code. But someone's going to need to spend thirty billion dollars developing the drug ideas he has before we know whether he's done that. And until that money is spent, it's pure conjecture and salesmanship.
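The "how would we know" objection is, at bottom, a statistical power question, and a back-of-the-envelope calculation shows why the verdict is so expensive. The sketch below uses textbook assumptions that are not from the episode: a one-sided test against a known five percent base rate, eighty percent power, and a hypothetical round-number cost of $200 million to carry one program to a readout.

```python
from math import sqrt
from statistics import NormalDist

def candidates_needed(p0=0.05, p1=0.10, alpha=0.05, power=0.80):
    """Rough sample size (normal approximation, one-sided one-sample test)
    to demonstrate that a candidate generator raised the per-program
    success rate from p0 to p1."""
    z_alpha = NormalDist().inv_cdf(1 - alpha)  # ~1.645
    z_beta = NormalDist().inv_cdf(power)       # ~0.842
    n = ((z_alpha * sqrt(p0 * (1 - p0)) + z_beta * sqrt(p1 * (1 - p1)))
         / (p1 - p0)) ** 2
    return int(n) + 1

n = candidates_needed()
cost_per_program_bn = 0.2  # hypothetical: $200m to carry one program to a readout
print(f"~{n} programs to an approve/fail readout")
print(f"at $200m each, that is ~${n * cost_per_program_bn:.0f}bn of spend")
```

On those assumptions, roughly 150 candidates would have to be taken all the way to an approve-or-fail readout before the doubling was statistically demonstrated, which lands in the same ballpark as the thirty-billion-dollar figure above.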
Speaker 1: How are you actually evaluating opportunities in the US against China competition? Because, you know, if clinical trials are the major choke point, and if China seems to be trying to make that process as efficient as possible, it seems like maybe they have an advantage.

Speaker 5: I mean, they definitely have an advantage. And if I had to make a bet today on our sector, it would be that China is going to be the big story over the next decade or two. I think it's a fundamental structural shift in the global biotechnology market. And their advantages are multiple. I mean, their advantages are regulatory; they relate to the personnel: we have lost an amazing amount of talent who were educated here in our graduate schools and have now gone back to China. And furthermore, they are able to develop things in the clinic, which is to say, do clinical trials, a lot faster and at a much higher volume than our infrastructure can handle. So they've got big advantages. Now, how do I think about investing in the US versus China? I don't, that much, because I don't speak Mandarin, and I think it would be really difficult for me to invest in China today. But increasingly, companies in the US are starting to outsource certain parts of the research process to Chinese companies, and increasingly they're going to outsource parts of the clinical development process, the clinical trials, to China. That's going to make a huge impact.

Speaker 1: Yeah, this was actually my next question. I guess, how translatable is a successful clinical trial in China to a market like the US?

Speaker 5: Three or four years ago, what both investors and regulators in the US would have told you was that it's not that translatable, because they're liars and they make up all the data, and it's rampant with fraud. And there may have been some truth to that, but I think there was also a good amount of racism in it. What sort of woke everyone up in the past couple of years was that some very significant clinical trials were done in China. People were suspicious of the data. Then they replicated those trials in Europe or the United States and got very similar data, and folks thought, whoa, maybe they're not so bad at this.
So I think people are decreasingly skeptical, or, said less awkwardly, people are trusting more and more what's coming out of China. And it's incumbent upon the Chinese, to the extent that they want this to be a major strategy, to continue enhancing people's trust in the quality of their work and their data. If they can do that, I think it's a global industry. A lot of the companies are multinationals. They don't care if the drug comes out of the US or comes out of China.

Speaker 2: This is a question not really about private or VC-stage investing per se, but about biotech more broadly. You know, I've covered the stock market for a long time in various ways, but I've never spent any time really getting to know a publicly traded biotech stock. Are you insane to try to invest in biotech if you don't have a PhD-level understanding of biology? Like, can anyone have alpha in this industry if they don't actually know science?

Speaker 5: I think it's tough.

Speaker 2: Yeah, it seems very tough to me.

Speaker 5: Look, yeah, I mean, here's the thing. What's really interesting about biotech in the public markets is that it's abundantly clear that active investors can have alpha in biotech, whereas, as you guys know, that is not clear in the rest of the public equity landscape. And so, whereas there is very little, if not negative, persistence of performance among active equity managers broadly, in biotech you have a small number of firms that have been doing great for, sometimes, decades, and it is...

Speaker 2: And they all have real science expertise on staff?

Speaker 5: They do. And, you know, the dynamic between them and the generalists, so to speak, is that they do a lot of very detailed work to make sense of the information you need to process to value these companies and to assess their probability of success.
And then the generalists often follow those specialists into these names. And the fortunes of the industry in these cycles... we're coming out of a four-year great depression for biotech, I should just mention. A lot of those fortunes ride on the sector rotations of the generalists. So the specialists have to stick with biotech, because that's what they do. But whether or not companies can IPO, whether or not companies can fund their next clinical trial, is largely a function of whether the generalists are in the sector at that moment or not. And we're just in the midst of the early rotation of generalists.

Speaker 1: Back into... wait, the biotech investing downturn, was that just a function of higher interest rates, or was something else going on?

Speaker 5: It was a confluence of everything that could go wrong at the same time. It was higher interest rates, which really punished these biotech stocks relative to other companies, because, you know, it's no cash flows for ten years and then a big bolus of money, so these companies are very sensitive to discount rates. Add to that this dynamic where the generalists had gotten out of the sector; that, ultimately, is fatal. And then consider the fact that we had such a comedown after the sugar high of COVID. So obviously, during COVID, there was this moment of clarity where everyone, for a second, recognized that this sector is, for each of us at some point in our lives, the most important thing that happens in the global economy. Like, without the biotech industry, you know, we're all in trouble. And we kind of go through life pretending like we're never going to need this industry, and then you get cancer, your dad gets cancer, your kid gets some rare disease, and you go, holy cow, I wish I had thought about this before. Maybe all these people who are doing this with their lives are not evil bloodsuckers who Bernie
And that is I think 549 00:31:21,240 --> 00:31:24,600 Speaker 5: part of what dawned on people during COVID, when we 550 00:31:24,680 --> 00:31:29,600 Speaker 5: all were vulnerable and we all were yearning for a solution. 551 00:31:30,040 --> 00:31:33,440 Speaker 1: Talk a little bit more about, I guess, the financial 552 00:31:33,440 --> 00:31:38,280 Speaker 1: incentives about actually developing new drugs. So we all know 553 00:31:38,600 --> 00:31:41,800 Speaker 1: the story of if you're based in the US, you 554 00:31:41,840 --> 00:31:45,080 Speaker 1: can go to Mexico or wherever else and buy the 555 00:31:45,120 --> 00:31:48,040 Speaker 1: same medicine for like five bucks as opposed to five 556 00:31:48,120 --> 00:31:51,440 Speaker 1: hundred dollars or perhaps even more in the US. And 557 00:31:51,520 --> 00:31:54,719 Speaker 1: the argument for that seems to be that, well, you know, 558 00:31:54,800 --> 00:31:58,680 Speaker 1: the big pharma companies need to be rewarded for all 559 00:31:58,680 --> 00:32:01,160 Speaker 1: the research and the effort, the risk that they actually 560 00:32:01,200 --> 00:32:04,280 Speaker 1: take on and for some reason, the US seems to 561 00:32:04,320 --> 00:32:06,440 Speaker 1: be the designated place to do that. 562 00:32:07,120 --> 00:32:11,920 Speaker 3: But like, why why? Is my question? Why US drugs? 563 00:32:12,720 --> 00:32:18,480 Speaker 5: Well, the big bounty for a drug development company is 564 00:32:18,520 --> 00:32:22,920 Speaker 5: the United States market, and that's partly because we as 565 00:32:22,960 --> 00:32:26,200 Speaker 5: a society have decided that we want all the new, 566 00:32:26,320 --> 00:32:30,640 Speaker 5: most advanced drugs. We want them first, and we don't 567 00:32:30,760 --> 00:32:34,320 Speaker 5: want to deny them to people who could benefit from them. Now, 568 00:32:34,640 --> 00:32:38,440 Speaker 5: the price we pay for those commitments is that our 569 00:32:38,560 --> 00:32:41,800 Speaker 5: drug prices are higher than the prices in other countries. 570 00:32:42,600 --> 00:32:44,960 Speaker 5: And the reason their prices are lower is because their 571 00:32:45,000 --> 00:32:48,840 Speaker 5: governments choose which drugs their people will have access to, 572 00:32:49,760 --> 00:32:53,840 Speaker 5: and they make those choices and then negotiate the prices 573 00:32:53,880 --> 00:32:56,240 Speaker 5: with the companies, and they basically will say to Pfizer 574 00:32:56,720 --> 00:32:59,600 Speaker 5: or Astro Zeneca, look, if you want your drugs sold 575 00:32:59,640 --> 00:33:02,240 Speaker 5: here in Japan, you're going to take the price that 576 00:33:02,320 --> 00:33:05,240 Speaker 5: we give you, and then the pharma company decides whether 577 00:33:05,320 --> 00:33:08,120 Speaker 5: they want to accept that deal or not. Now, the 578 00:33:08,240 --> 00:33:13,160 Speaker 5: United States absolutely could choose as a civilization to negotiate 579 00:33:13,200 --> 00:33:16,840 Speaker 5: in that same manner. Our government could make the choice 580 00:33:16,960 --> 00:33:20,200 Speaker 5: for US as to exactly what we're willing to pay 581 00:33:20,240 --> 00:33:23,080 Speaker 5: for every drug. There would be two consequences to that. 582 00:33:23,440 --> 00:33:28,240 Speaker 5: One is that we would go without certain drugs. 
The 583 00:33:28,280 --> 00:33:30,880 Speaker 5: second is that a lot of drugs would not even 584 00:33:30,920 --> 00:33:34,520 Speaker 5: be developed in the first place, because the total pool 585 00:33:34,560 --> 00:33:37,760 Speaker 5: of profits available to drug companies would be much smaller. 586 00:33:38,480 --> 00:33:42,760 Speaker 5: And so I don't know that there is any perfect 587 00:33:42,960 --> 00:33:47,640 Speaker 5: answer to how much pharmaceutical innovation we should have in 588 00:33:47,680 --> 00:33:50,560 Speaker 5: the world. We get to choose how much innovation we 589 00:33:50,600 --> 00:33:53,719 Speaker 5: want to occur, and the way we choose that is 590 00:33:53,840 --> 00:33:57,280 Speaker 5: by determining the size of that bounty that exists. How 591 00:33:57,320 --> 00:34:01,200 Speaker 5: big is the profit pool we want to allow for 592 00:34:01,320 --> 00:34:04,080 Speaker 5: innovative drug development? And a lot of that is driven 593 00:34:04,120 --> 00:34:07,120 Speaker 5: by our patent law. Remember, a patent in this industry 594 00:34:07,240 --> 00:34:10,680 Speaker 5: is a legalized monopoly. So we give drug companies a 595 00:34:10,760 --> 00:34:13,759 Speaker 5: legal monopoly for a limited period of time, and that 596 00:34:13,800 --> 00:34:16,279 Speaker 5: dictates how much money they're able to make off of 597 00:34:16,320 --> 00:34:19,160 Speaker 5: a new drug. We could shorten the patent life, and 598 00:34:19,200 --> 00:34:21,640 Speaker 5: that would reduce the profit pool and you'd have less 599 00:34:21,680 --> 00:34:24,560 Speaker 5: drug development. We could remove the patent expiration, so you could 600 00:34:24,600 --> 00:34:27,520 Speaker 5: have a permanent monopoly, and believe me, the industry would 601 00:34:27,520 --> 00:34:30,799 Speaker 5: double or triple overnight. So it's a choice we have 602 00:34:30,840 --> 00:34:32,280 Speaker 5: to make, and it's a civic choice. 603 00:34:32,360 --> 00:34:35,520 Speaker 2: You mentioned the Bernie Sanders of the world, who 604 00:34:35,600 --> 00:34:37,600 Speaker 2: look at the profits of drug companies, or they look 605 00:34:37,640 --> 00:34:41,520 Speaker 2: at the prices of drugs, and, you know, perhaps 606 00:34:41,520 --> 00:34:43,680 Speaker 2: if they got their way, there would be less investment 607 00:34:43,719 --> 00:34:47,680 Speaker 2: in drug discovery, et cetera, maybe less profits. Going 608 00:34:47,680 --> 00:34:50,840 Speaker 2: back to COVID, however, there was also the backlash on 609 00:34:50,880 --> 00:34:54,640 Speaker 2: the other side, essentially just this deep skepticism towards the 610 00:34:54,680 --> 00:34:57,560 Speaker 2: premise of pharma: what are these scientists doing, 611 00:34:57,600 --> 00:34:59,880 Speaker 2: and why don't they tell you about this root that 612 00:35:00,080 --> 00:35:03,160 Speaker 2: people have used for thousands of years, that cured these 613 00:35:03,200 --> 00:35:05,359 Speaker 2: diseases, that they don't want you to know about so 614 00:35:05,400 --> 00:35:07,680 Speaker 2: that they can sell you stuff? Talk to us about, 615 00:35:07,719 --> 00:35:11,920 Speaker 2: like, investing in biotech 616 00:35:12,160 --> 00:35:15,120 Speaker 2: in a political environment where a growing number of people 617 00:35:15,640 --> 00:35:20,160 Speaker 2: frankly seem to distrust the premise of scientific expertise.
618 00:35:22,600 --> 00:35:26,719 Speaker 5: Look, it's tough, and some of the blame certainly belongs 619 00:35:26,800 --> 00:35:31,480 Speaker 5: with the scientific community, because, you know, in the early 620 00:35:31,480 --> 00:35:35,040 Speaker 5: days of COVID, say, communication with the public 621 00:35:35,120 --> 00:35:39,000 Speaker 5: about the value of masks was not clear, and 622 00:35:39,000 --> 00:35:41,920 Speaker 5: it was maybe even misleading. Some of the presentation of 623 00:35:42,040 --> 00:35:46,560 Speaker 5: data regarding the efficacy of the vaccines was not transparent, 624 00:35:47,080 --> 00:35:51,640 Speaker 5: and that eroded the public's trust in a very understandable way. Now, 625 00:35:52,200 --> 00:35:56,799 Speaker 5: I'm no apologist for medicine or science, because I don't 626 00:35:56,800 --> 00:36:01,160 Speaker 5: think these are privileged priesthoods. I think every person should 627 00:36:01,160 --> 00:36:06,200 Speaker 5: be able to be engaged in and understand science and medicine. 628 00:36:06,680 --> 00:36:12,960 Speaker 5: And unfortunately, the entire history of medicine began with medical 629 00:36:13,560 --> 00:36:18,640 Speaker 5: science as total witchcraft and sorcery. So if you go 630 00:36:18,719 --> 00:36:23,359 Speaker 5: back to antiquity, the first people calling themselves doctors objectively 631 00:36:23,440 --> 00:36:28,440 Speaker 5: understood nothing. So this was pure sophistry from the beginning. 632 00:36:29,040 --> 00:36:33,120 Speaker 5: And we are on this long journey through which medicine 633 00:36:33,200 --> 00:36:37,880 Speaker 5: is going from total BS and witchcraft to slowly turning 634 00:36:38,000 --> 00:36:41,760 Speaker 5: into a real science, something that deserves to be called science. 635 00:36:42,560 --> 00:36:48,200 Speaker 5: Medicine is filled with common practices that are not rigorously 636 00:36:48,239 --> 00:36:53,000 Speaker 5: based on evidence, and that is symptomatic of where we 637 00:36:53,080 --> 00:36:56,759 Speaker 5: are in that journey that I'm describing. So I'm an 638 00:36:56,760 --> 00:37:01,319 Speaker 5: advocate for medicine becoming always more and more scientific. I 639 00:37:01,360 --> 00:37:05,440 Speaker 5: believe that science policymakers, scientists, and academia need to do 640 00:37:05,440 --> 00:37:09,080 Speaker 5: a much better job communicating transparently, and that's the only 641 00:37:09,120 --> 00:37:11,680 Speaker 5: way to engender the kind of trust you're talking about, Joe, 642 00:37:12,160 --> 00:37:15,480 Speaker 5: and the trust is critical because it is what gives 643 00:37:15,520 --> 00:37:17,800 Speaker 5: permission for this industry's existence. 644 00:37:17,920 --> 00:37:21,600 Speaker 1: Wait, talk more about, I guess, autonomy when it comes 645 00:37:21,640 --> 00:37:25,040 Speaker 1: to medical decisions, because, you know, a big 646 00:37:25,160 --> 00:37:28,840 Speaker 1: culture shock for non-Americans who come to the US 647 00:37:29,080 --> 00:37:33,120 Speaker 1: is drug adverts on TV where, you know, here's 648 00:37:33,160 --> 00:37:35,719 Speaker 1: this great drug, and then they read off all the 649 00:37:36,120 --> 00:37:38,800 Speaker 1: risk factors really, really quickly, and one of the risks 650 00:37:38,880 --> 00:37:43,239 Speaker 1: is always death or, you know, brain damage
or something. Yeah, 651 00:37:43,280 --> 00:37:47,400 Speaker 1: and I'm always like, again, I've never asked for a 652 00:37:47,520 --> 00:37:51,200 Speaker 1: drug that I've seen on TV. I do remember when 653 00:37:51,239 --> 00:37:53,840 Speaker 1: I first came to the US as an adult, 654 00:37:54,000 --> 00:37:56,440 Speaker 1: I went to get a prescription. I found a new 655 00:37:56,520 --> 00:37:59,640 Speaker 1: doctor to do that, and I said I needed this 656 00:37:59,640 --> 00:38:01,520 Speaker 1: thing, and the doctor was like, oh, well, we have 657 00:38:01,560 --> 00:38:04,120 Speaker 1: to run all these medical tests before we can give 658 00:38:04,160 --> 00:38:07,520 Speaker 1: you that, and it ended up in a big argument 659 00:38:07,680 --> 00:38:10,759 Speaker 1: with my insurance provider. And I remember talking to people 660 00:38:10,760 --> 00:38:12,279 Speaker 1: about that and they were like, well, you should have 661 00:38:12,280 --> 00:38:15,680 Speaker 1: pushed back against the doctor about the testing, and I 662 00:38:15,719 --> 00:38:17,560 Speaker 1: was like, what do I know? I just do what 663 00:38:17,600 --> 00:38:21,280 Speaker 1: the doctor tells me, right? How much say should people 664 00:38:21,520 --> 00:38:22,000 Speaker 1: actually... 665 00:38:22,880 --> 00:38:25,759 Speaker 1: it sounds weird, but, you know, given the lack of experience, 666 00:38:25,760 --> 00:38:29,120 Speaker 1: and given the way other systems work around the world, 667 00:38:29,160 --> 00:38:32,400 Speaker 1: how much say should people have in their own medical treatment? 668 00:38:33,320 --> 00:38:36,320 Speaker 5: I think ultimately they should have almost all of the say. 669 00:38:36,640 --> 00:38:39,480 Speaker 5: It's your body. Ultimately, you have to make the best 670 00:38:39,560 --> 00:38:43,640 Speaker 5: decision you can make, and you should regard physicians, nurses, 671 00:38:43,800 --> 00:38:47,480 Speaker 5: others in the system as consultants who support you in 672 00:38:47,520 --> 00:38:52,919 Speaker 5: making wise decisions. The one caveat there, however, is that 673 00:38:53,000 --> 00:38:56,200 Speaker 5: we do socialize a lot of our medical costs, and 674 00:38:56,280 --> 00:39:00,120 Speaker 5: in many other countries they completely socialize medical costs. To 675 00:39:00,160 --> 00:39:02,840 Speaker 5: the extent that you want the rest of us to 676 00:39:02,880 --> 00:39:05,799 Speaker 5: pay for your medical care, I do believe we need 677 00:39:05,840 --> 00:39:09,839 Speaker 5: to have some standards around what it's appropriate to pay for. 678 00:39:10,960 --> 00:39:13,239 Speaker 1: Yeah, I mean, at the moment, it seems like most 679 00:39:13,280 --> 00:39:17,680 Speaker 1: of those decisions are left up to the insurers, which, again, 680 00:39:18,200 --> 00:39:20,880 Speaker 1: in other places in the world would be left 681 00:39:20,960 --> 00:39:26,480 Speaker 1: up to the governments. Are insurers 682 00:39:26,520 --> 00:39:28,919 Speaker 1: sort of another limiting factor here? 683 00:39:30,960 --> 00:39:34,400 Speaker 5: I believe they are. I believe the private insurance industry 684 00:39:34,480 --> 00:39:38,160 Speaker 5: adds zero value to the United States healthcare system, almost.
685 00:39:38,239 --> 00:39:40,920 Speaker 5: I mean, that may slightly overstate it, but it's close 686 00:39:40,960 --> 00:39:43,760 Speaker 5: to zero in my book, and I really don't believe 687 00:39:43,760 --> 00:39:46,480 Speaker 5: insurance companies ought to be the ones making decisions about 688 00:39:46,680 --> 00:39:48,120 Speaker 5: what medical care is appropriate. 689 00:39:48,960 --> 00:39:50,920 Speaker 2: I notice there in the video you have a really 690 00:39:51,000 --> 00:39:53,600 Speaker 2: nice-looking microphone. Is that a musical... is that a 691 00:39:53,640 --> 00:39:55,240 Speaker 2: microphone for recording music? 692 00:39:55,600 --> 00:39:57,400 Speaker 5: Yeah, this is the one I, uh, this is the 693 00:39:57,400 --> 00:39:58,000 Speaker 5: one I sing on. 694 00:39:58,160 --> 00:40:00,200 Speaker 2: It's... first of all, you sound good, but it 695 00:40:00,239 --> 00:40:04,279 Speaker 2: also looks a lot cooler than the typical microphones that 696 00:40:04,320 --> 00:40:06,560 Speaker 2: our guests use. Do you... 697 00:40:06,640 --> 00:40:08,640 Speaker 2: are you still playing much music? 698 00:40:09,120 --> 00:40:11,600 Speaker 5: I do, but thankfully now it's just for fun, 699 00:40:11,680 --> 00:40:15,600 Speaker 5: not for money, which is a much more comfortable place 700 00:40:15,600 --> 00:40:16,680 Speaker 5: for it to live in my life. 701 00:40:16,719 --> 00:40:21,320 Speaker 2: Do you think at all about AI-generated music 702 00:40:21,760 --> 00:40:24,759 Speaker 2: and, uh, the effect that that's going to have on musicians? 703 00:40:25,200 --> 00:40:27,279 Speaker 2: I feel like a lot of musicians, like the ones 704 00:40:27,400 --> 00:40:29,400 Speaker 2: that I follow on Instagram, have a lot 705 00:40:29,400 --> 00:40:30,359 Speaker 2: of anxiety about this. 706 00:40:32,000 --> 00:40:36,080 Speaker 5: There is anxiety. And look, I mean, it's really hard 707 00:40:36,239 --> 00:40:39,720 Speaker 5: to make a living as a musician now. It's always 708 00:40:39,719 --> 00:40:42,920 Speaker 5: been really hard, and, you know, I can't imagine what 709 00:40:42,960 --> 00:40:45,880 Speaker 5: the lifestyle was of a lute player in George the 710 00:40:45,920 --> 00:40:49,600 Speaker 5: Second's royal court or something. But, you know, it's a 711 00:40:49,719 --> 00:40:53,680 Speaker 5: tough business, and it is scary when new technology comes 712 00:40:53,680 --> 00:40:55,879 Speaker 5: on the scene that might change the way you make 713 00:40:55,960 --> 00:40:58,480 Speaker 5: money as an artist. I lived through that with Spotify. 714 00:40:58,520 --> 00:41:01,840 Speaker 5: People were terrified of it, and, you know, fortunately what 715 00:41:01,960 --> 00:41:02,920 Speaker 5: it did... 716 00:41:02,920 --> 00:41:06,200 Speaker 2: They could have done what you did: long Spotify and then 717 00:41:06,239 --> 00:41:08,200 Speaker 2: hedged their own risk to it. But keep going. 718 00:41:08,320 --> 00:41:13,680 Speaker 5: No, but look, Spotify increased the total 719 00:41:13,760 --> 00:41:16,479 Speaker 5: revenue of the recorded music business by multiples, which was the goal. 720 00:41:17,000 --> 00:41:21,880 Speaker 5: So mission accomplished.
Now, look, AI is going to make music, 721 00:41:22,440 --> 00:41:27,480 Speaker 5: and I think, like all creative people, like journalists, like investors, 722 00:41:27,520 --> 00:41:30,799 Speaker 5: everyone's going to think about how they can use it 723 00:41:30,920 --> 00:41:34,040 Speaker 5: to be more effective, have more leverage, have a cooler output. 724 00:41:34,320 --> 00:41:38,680 Speaker 5: I mean, I have very little doubt that artists are 725 00:41:38,680 --> 00:41:43,040 Speaker 5: going to do unbelievably cool and original stuff with AI tools, 726 00:41:43,040 --> 00:41:46,400 Speaker 5: and it's already happening. And for whatever reason, I have 727 00:41:46,560 --> 00:41:49,359 Speaker 5: very little trepidation that they're going to be put out 728 00:41:49,360 --> 00:41:54,200 Speaker 5: of business, because I think ultimately music is communication, and... 729 00:41:55,440 --> 00:41:57,439 Speaker 2: Real quickly on that, when you talk about, like, doing 730 00:41:57,520 --> 00:42:00,040 Speaker 2: unbelievably cool things with music. So I see in the 731 00:42:00,040 --> 00:42:03,840 Speaker 2: background you have a piano, for example, and one of 732 00:42:03,920 --> 00:42:06,799 Speaker 2: the things when I think about AI music is, and 733 00:42:06,920 --> 00:42:09,440 Speaker 2: actually I think, for example, the founder of Suno 734 00:42:09,440 --> 00:42:12,000 Speaker 2: and some of these other AI music companies have talked 735 00:42:12,000 --> 00:42:14,640 Speaker 2: about this, it's like, well, learning to play instruments 736 00:42:14,680 --> 00:42:18,200 Speaker 2: is really hard, and therefore can we separate in some 737 00:42:18,280 --> 00:42:21,399 Speaker 2: way the craft of music, the hours that someone has 738 00:42:21,440 --> 00:42:24,399 Speaker 2: to spend just doing scales on the piano before they 739 00:42:24,440 --> 00:42:27,360 Speaker 2: can compose something? Wouldn't it be 740 00:42:27,440 --> 00:42:30,799 Speaker 2: nice if we could just have amazing, beautiful piano sonatas 741 00:42:30,880 --> 00:42:34,760 Speaker 2: without ever having had to put in those thousands of hours, 742 00:42:34,800 --> 00:42:37,760 Speaker 2: you know, Mary Had a Little Lamb and then so forth? 743 00:42:38,000 --> 00:42:40,640 Speaker 2: But it does raise the question to my mind of 744 00:42:40,960 --> 00:42:45,279 Speaker 2: whether one can create great art if they never had 745 00:42:45,320 --> 00:42:47,560 Speaker 2: to learn the craft. 746 00:42:48,600 --> 00:42:54,560 Speaker 5: I think the nuance with which one can communicate through 747 00:42:54,719 --> 00:43:00,840 Speaker 5: music is a function of how many options you perceive. 748 00:43:02,480 --> 00:43:06,120 Speaker 5: In other words, if you know the piano inside out, 749 00:43:07,200 --> 00:43:11,040 Speaker 5: you're aware of so many creative choices that are at 750 00:43:11,120 --> 00:43:15,560 Speaker 5: your disposal at any given moment. And if your ability 751 00:43:15,560 --> 00:43:19,880 Speaker 5: to express yourself is squeezed down to what you can 752 00:43:19,920 --> 00:43:24,480 Speaker 5: put into a natural-language prompt, now those musical ideas 753 00:43:24,520 --> 00:43:28,080 Speaker 5: are having to pass through the medium of language to 754 00:43:28,160 --> 00:43:35,560 Speaker 5: be realized, and that inherently erodes the resolution and the 755 00:43:35,600 --> 00:43:38,120 Speaker 5: expansiveness with which you can express yourself.
756 00:43:38,680 --> 00:43:40,600 Speaker 1: I feel like there's a danger here that you go 757 00:43:40,719 --> 00:43:43,920 Speaker 1: off on a big orality tangent and whether ideas can 758 00:43:43,960 --> 00:43:45,879 Speaker 1: exist without words and things like that. 759 00:43:46,160 --> 00:43:49,400 Speaker 2: No, but I do think that answer was very insightful. Like, 760 00:43:49,440 --> 00:43:52,120 Speaker 2: can you actually create great piano music if you don't 761 00:43:52,120 --> 00:43:53,840 Speaker 2: know the limits of what the piano can do, and 762 00:43:53,840 --> 00:43:57,960 Speaker 2: if you're only trying to describe it in language, make this 763 00:43:58,040 --> 00:44:00,480 Speaker 2: beautiful sonata? I think that's very tough. But I thought 764 00:44:00,480 --> 00:44:01,480 Speaker 2: that answer nailed it. 765 00:44:02,080 --> 00:44:02,279 Speaker 3: DA. 766 00:44:02,360 --> 00:44:04,279 Speaker 1: We're gonna have to wrap it up soon. I have 767 00:44:04,400 --> 00:44:06,759 Speaker 1: one last question, and I'm gonna kind of put 768 00:44:06,760 --> 00:44:09,279 Speaker 1: you on the spot. Can you sing 769 00:44:09,320 --> 00:44:12,440 Speaker 1: a little Odd Lots song for us? Like, three bars 770 00:44:12,480 --> 00:44:14,560 Speaker 1: of an Odd Lots song? I don't care if you 771 00:44:14,640 --> 00:44:18,160 Speaker 1: generate it with, you know, I guess Gemini now, but... 772 00:44:18,480 --> 00:44:22,320 Speaker 5: You think I could? Let's see. I mean, oh wow, 773 00:44:23,600 --> 00:44:27,719 Speaker 5: I'm gonna turn this... let's see here. 774 00:44:27,719 --> 00:44:29,680 Speaker 2: This is really cool. Yeah, in case you are 775 00:44:29,680 --> 00:44:33,319 Speaker 2: watching the video: he's moving his microphone, he's moving 776 00:44:33,360 --> 00:44:35,160 Speaker 2: his microphone to his keyboard. 777 00:44:35,360 --> 00:44:37,239 Speaker 5: Okay, can you see? Great. 778 00:44:37,280 --> 00:44:38,160 Speaker 2: Yeah, go for it. 779 00:44:39,640 --> 00:44:41,000 Speaker 5: Okay, we're gonna try. 780 00:44:44,040 --> 00:44:50,480 Speaker 6: And it's all about, it's all about, it's all about Tracy, 781 00:44:51,840 --> 00:44:57,239 Speaker 6: it's all about, it's all about, it's all about Joe. 782 00:44:58,520 --> 00:44:59,759 Speaker 5: How's that? Pretty good? 783 00:45:00,239 --> 00:45:01,560 Speaker 2: You have a great voice. 784 00:45:01,680 --> 00:45:02,239 Speaker 3: Yeah, it is. 785 00:45:02,680 --> 00:45:05,640 Speaker 2: Do you ever want to compose an outro song for, uh... yeah, 786 00:45:05,680 --> 00:45:06,640 Speaker 2: something like that? 787 00:45:06,800 --> 00:45:08,680 Speaker 5: Oh, I would love to. I'm... I am the 788 00:45:08,719 --> 00:45:13,279 Speaker 5: composer of two or three podcast theme songs. And I 789 00:45:13,280 --> 00:45:15,719 Speaker 5: have to say, I love you guys' theme music. It 790 00:45:16,480 --> 00:45:18,719 Speaker 5: gets me excited. And I gotta end on this 791 00:45:18,960 --> 00:45:21,520 Speaker 5: for you guys. You know, in high school... the reason 792 00:45:21,560 --> 00:45:23,319 Speaker 5: I got into investing: in high school, I was an 793 00:45:23,360 --> 00:45:25,879 Speaker 5: economics nerd, and my... 794 00:45:25,800 --> 00:45:28,640 Speaker 1: Oh yeah, we heard that you actually wrote, like, a 795 00:45:28,719 --> 00:45:31,480 Speaker 1: paper that won, like, a prize from the Fed or 796 00:45:31,520 --> 00:45:32,319 Speaker 1: something like that.
797 00:45:32,719 --> 00:45:36,440 Speaker 5: So the Federal Reserve had this nerd competition they sponsored 798 00:45:36,480 --> 00:45:39,000 Speaker 5: called Fed Challenge. And I was the captain of my 799 00:45:39,080 --> 00:45:41,200 Speaker 5: high school team one year, and we got to DC 800 00:45:41,320 --> 00:45:44,440 Speaker 5: and we saw Greenspan walk out with his 801 00:45:44,600 --> 00:45:48,600 Speaker 5: wizened face and hands. And anyways, if I had had 802 00:45:48,680 --> 00:45:50,680 Speaker 5: Odd Lots to listen to in high school, man, I 803 00:45:50,680 --> 00:45:53,040 Speaker 5: would have been in heaven, because you guys touch on 804 00:45:53,560 --> 00:45:56,880 Speaker 5: so much interesting stuff, and this just has to be 805 00:45:56,920 --> 00:46:00,520 Speaker 5: the most exciting thing for young people to experience in 806 00:46:00,640 --> 00:46:03,800 Speaker 5: order to get turned on to business and economics and finance 807 00:46:03,880 --> 00:46:08,320 Speaker 5: and recognize these aren't just boring, you know, staid topics. 808 00:46:08,320 --> 00:46:09,160 Speaker 5: They're fascinating. 809 00:46:09,280 --> 00:46:11,080 Speaker 1: Thank you for saying that. I really appreciate it, and 810 00:46:11,080 --> 00:46:13,040 Speaker 1: also thank you for singing for us. I think that 811 00:46:13,080 --> 00:46:16,640 Speaker 1: was an Odd Lots first. Yeah, yeah... well, on the spot. 812 00:46:16,680 --> 00:46:21,879 Speaker 1: I know we've had Merle Hazard, the country-singing economist, 813 00:46:21,920 --> 00:46:24,080 Speaker 1: on before, but that was fantastic. DA 814 00:46:24,120 --> 00:46:25,879 Speaker 3: Wallack, thank you so much for coming on the show. 815 00:46:25,960 --> 00:46:28,399 Speaker 3: Really appreciate it. Thank you, guys. That was great. 816 00:46:40,719 --> 00:46:41,680 Speaker 3: That was really interesting, Joe. 817 00:46:41,719 --> 00:46:43,600 Speaker 2: That was super fun. He was great. 818 00:46:43,800 --> 00:46:46,920 Speaker 1: He's also pretty good at, you know... I know, again, 819 00:46:46,960 --> 00:46:49,440 Speaker 1: he said it was tenuous, but the through line from 820 00:46:49,719 --> 00:46:51,839 Speaker 1: music to biotech kind of makes sense. 821 00:46:52,080 --> 00:46:53,960 Speaker 2: I think it makes a lot of sense. And 822 00:46:54,160 --> 00:46:57,919 Speaker 2: especially the fact that, you know, these are extreme... this 823 00:46:57,920 --> 00:47:01,600 Speaker 2: is all startup investing. We know, you know, there's this 824 00:47:01,680 --> 00:47:05,360 Speaker 2: power law phenomenon where one of your twenty portfolio companies 825 00:47:05,400 --> 00:47:06,160 Speaker 2: is going to make all the money. 826 00:47:06,239 --> 00:47:08,239 Speaker 3: Yeah, the lottery ticket. 827 00:47:08,200 --> 00:47:11,000 Speaker 2: But, you know, like, biotech is like lottery tickets upon lottery tickets. 828 00:47:11,040 --> 00:47:13,800 Speaker 2: There's so much success uncertainty. 829 00:47:14,200 --> 00:47:16,439 Speaker 1: But with lower payouts. 830 00:47:16,400 --> 00:47:19,719 Speaker 2: Lower payouts. There's so much success uncertainty. There's so much 831 00:47:19,760 --> 00:47:23,800 Speaker 2: time that elapses between the initial work and where you 832 00:47:23,840 --> 00:47:26,600 Speaker 2: see if there's any signals of traction.
It does feel 833 00:47:26,600 --> 00:47:29,600 Speaker 2: a lot like the uncertainty that exists in the music 834 00:47:29,640 --> 00:47:33,279 Speaker 2: industry, in selecting which of these hundred bands, that all 835 00:47:33,320 --> 00:47:36,040 Speaker 2: sound great and are all really talented, actually has what 836 00:47:36,120 --> 00:47:38,960 Speaker 2: it takes to be a commercial hit. A lot of parallels. 837 00:47:39,040 --> 00:47:42,520 Speaker 1: Yeah, I thought that the dinosaur bias point was an 838 00:47:42,520 --> 00:47:45,520 Speaker 1: interesting one as well, because you can imagine, again, 839 00:47:45,600 --> 00:47:48,799 Speaker 1: to the timeline point, you kind of have to be 840 00:47:48,880 --> 00:47:52,160 Speaker 1: old to have any success in the industry historically, just 841 00:47:52,160 --> 00:47:54,680 Speaker 1: because it can take, you know, a decade to get 842 00:47:54,960 --> 00:47:57,600 Speaker 1: a particular drug to market, so you don't have that 843 00:47:57,719 --> 00:48:01,400 Speaker 1: much opportunity to have, you know, those wins unless you 844 00:48:01,440 --> 00:48:02,680 Speaker 1: get old and... 845 00:48:02,640 --> 00:48:05,560 Speaker 2: There's no shortcut. There's no, you know... there may be 846 00:48:05,680 --> 00:48:08,560 Speaker 2: regulatory things that can be done, but fundamentally, if you 847 00:48:08,600 --> 00:48:10,520 Speaker 2: want to know whether something works, and if you want 848 00:48:10,520 --> 00:48:12,520 Speaker 2: to know whether this drug is going to kill people 849 00:48:12,520 --> 00:48:14,920 Speaker 2: who take it or not, and whether it's safe or not, 850 00:48:15,200 --> 00:48:17,760 Speaker 2: there is no substitute for doing a test and seeing 851 00:48:17,920 --> 00:48:21,880 Speaker 2: what happens. And to your point, or to your observation 852 00:48:21,880 --> 00:48:24,719 Speaker 2: about the dinosaurs, I do think that lots of 853 00:48:24,800 --> 00:48:28,280 Speaker 2: people have this fantasy that anytime there is a legacy 854 00:48:28,320 --> 00:48:30,720 Speaker 2: industry of any sort, that if you just got 855 00:48:30,760 --> 00:48:33,000 Speaker 2: twenty-one-year-olds from Stanford in the same room... 856 00:48:32,960 --> 00:48:34,800 Speaker 3: Gave them a garage to work out of. 857 00:48:34,680 --> 00:48:36,480 Speaker 2: A garage, that they would do it a lot better than 858 00:48:36,520 --> 00:48:41,000 Speaker 2: the veterans. That was the Doge premise, and Doge doesn't 859 00:48:41,040 --> 00:48:41,800 Speaker 2: exist anymore. 860 00:48:41,880 --> 00:48:44,000 Speaker 3: So yeah, shall we leave it there? 861 00:48:44,160 --> 00:48:44,919 Speaker 2: Let's leave it there. 862 00:48:45,120 --> 00:48:47,400 Speaker 1: This has been another episode of the Odd Lots podcast. 863 00:48:47,480 --> 00:48:50,760 Speaker 1: I'm Tracy Alloway. You can follow me at Tracy Alloway. 864 00:48:50,400 --> 00:48:52,400 Speaker 2: And I'm Joe Weisenthal. You can follow me at 865 00:48:52,400 --> 00:48:54,719 Speaker 2: The Stalwart. Follow our guest, DA Wallack. He's at 866 00:48:54,800 --> 00:48:58,280 Speaker 2: DA Wallack. Follow our producers: Carmen Rodriguez at Carman 867 00:48:58,400 --> 00:49:01,920 Speaker 2: Armann, Dashiell Bennett at Dashbot, and Kel Brooks at Kel Brooks.
868 00:49:02,120 --> 00:49:04,840 Speaker 2: For more Odd Lots content, go to Bloomberg dot com slash 869 00:49:04,920 --> 00:49:07,719 Speaker 2: odd lots, where we have the daily newsletter and all of our episodes, 870 00:49:08,000 --> 00:49:09,799 Speaker 2: and you can chat about all of these topics 871 00:49:09,880 --> 00:49:13,640 Speaker 2: twenty-four seven in our Discord: Discord dot gg slash odd lots. 872 00:49:13,760 --> 00:49:16,120 Speaker 1: And if you enjoy Odd Lots, if you want us 873 00:49:16,160 --> 00:49:18,520 Speaker 1: to do more healthcare episodes, then please leave us a 874 00:49:18,600 --> 00:49:22,200 Speaker 1: positive review on your favorite podcast platform. And remember, if 875 00:49:22,239 --> 00:49:24,960 Speaker 1: you are a Bloomberg subscriber, you can listen to all 876 00:49:25,000 --> 00:49:28,080 Speaker 1: of our episodes absolutely ad-free. All you need to 877 00:49:28,120 --> 00:49:31,000 Speaker 1: do is find the Bloomberg channel on Apple Podcasts and 878 00:49:31,120 --> 00:49:32,319 Speaker 1: follow the instructions there. 879 00:49:32,760 --> 00:50:00,160 Speaker 3: Thanks for listening.