1 00:00:00,080 --> 00:00:03,040 Speaker 1: Moderna out earlier reporting a narrower-than-expected first 2 00:00:03,080 --> 00:00:06,240 Speaker 1: quarter loss, cost cutting offsetting a steep decline in 3 00:00:06,280 --> 00:00:09,480 Speaker 1: its COVID business. Moderna expecting to receive US approval for 4 00:00:09,560 --> 00:00:12,959 Speaker 1: its second product, an RSV vaccine, in the coming days. 5 00:00:13,039 --> 00:00:15,880 Speaker 1: Joining us now is the Moderna CEO, Stéphane Bancel. Stéphane, 6 00:00:15,920 --> 00:00:17,840 Speaker 1: wonderful to catch up with you, sir. The stock is 7 00:00:17,920 --> 00:00:20,120 Speaker 1: just about positive in the premarket. Can you talk 8 00:00:20,120 --> 00:00:23,280 Speaker 1: to me about how you're balancing cost cutting with investing 9 00:00:23,280 --> 00:00:25,680 Speaker 1: in innovation, given what's in the pipeline? 10 00:00:26,600 --> 00:00:29,040 Speaker 2: Sure. Well, good morning. Thank you for having me. So 11 00:00:29,280 --> 00:00:32,519 Speaker 2: very pleased with the quarter. We basically try to focus 12 00:00:32,760 --> 00:00:35,199 Speaker 2: on how do we drive sales, how do we drive 13 00:00:35,360 --> 00:00:38,040 Speaker 2: R&D, how do we prioritize opportunities, which is 14 00:00:38,080 --> 00:00:40,760 Speaker 2: why, for example, we announced that we are stopping 15 00:00:40,800 --> 00:00:44,519 Speaker 2: the partnership with Metagenomi on gene-editing research. 16 00:00:44,960 --> 00:00:45,760 Speaker 3: Same thing if you look 17 00:00:45,640 --> 00:00:48,879 Speaker 2: at the portfolio, we're looking very carefully at all investments. 18 00:00:49,360 --> 00:00:53,400 Speaker 2: And the good thing about those vaccines, like respiratory vaccines, 19 00:00:53,479 --> 00:00:56,280 Speaker 2: is you only pay for the Phase 3 study. 
And so 20 00:00:56,280 --> 00:00:58,600 Speaker 2: if you think about COVID, we still have sales from COVID, 21 00:00:59,040 --> 00:01:01,279 Speaker 2: but the investment in the R&D of COVID has come 22 00:01:01,320 --> 00:01:04,040 Speaker 2: down a lot. As you said, RSV, we are 23 00:01:04,080 --> 00:01:07,360 Speaker 2: anticipating a launch this spring, but we're not going to 24 00:01:07,400 --> 00:01:09,520 Speaker 2: do another Phase 3 for RSV, so you don't 25 00:01:09,720 --> 00:01:13,520 Speaker 2: basically have a lot of new studies going on. We're 26 00:01:13,600 --> 00:01:15,760 Speaker 2: using the capital we used to put in those 27 00:01:15,880 --> 00:01:18,679 Speaker 2: products before. And then if you look at oncology, as 28 00:01:18,720 --> 00:01:20,720 Speaker 2: you know, we're in a fifty-fifty profit share 29 00:01:20,760 --> 00:01:23,200 Speaker 2: with Merck, so Merck is paying half of the Phase 30 00:01:23,280 --> 00:01:25,640 Speaker 2: 3 study. So that's how we're managing. Then we're investing 31 00:01:25,640 --> 00:01:28,600 Speaker 2: a lot in technology. You might have seen last week 32 00:01:28,880 --> 00:01:32,000 Speaker 2: an announcement with OpenAI. We actually have more than seven 33 00:01:32,040 --> 00:01:34,800 Speaker 2: hundred fifty GPTs going, and that is helping us 34 00:01:34,840 --> 00:01:37,800 Speaker 2: a lot scale the company across not only science but 35 00:01:37,920 --> 00:01:41,480 Speaker 2: also having a lot of productivity in manufacturing, in commercial, 36 00:01:41,600 --> 00:01:43,680 Speaker 2: in legal. So that's kind of how we're doing it. 
37 00:01:43,920 --> 00:01:45,840 Speaker 1: So Stéphane, let's talk about something that our colleagues here 38 00:01:45,880 --> 00:01:49,240 Speaker 1: at Bloomberg are extremely focused on, and that's your RSV shot, 39 00:01:49,640 --> 00:01:52,080 Speaker 1: which, according to our colleagues, some data is showing that 40 00:01:52,160 --> 00:01:54,600 Speaker 1: maybe it doesn't last as long as others in the market. 41 00:01:54,760 --> 00:01:56,320 Speaker 1: What we all want to know here at Bloomberg is 42 00:01:56,320 --> 00:01:59,200 Speaker 1: whether that raises questions about the promise of your technology 43 00:01:59,240 --> 00:02:01,760 Speaker 1: in treating other diseases. How would you answer that? 44 00:02:03,160 --> 00:02:04,800 Speaker 2: So I will first say that if you look at the 45 00:02:04,920 --> 00:02:08,240 Speaker 2: data, the duration of the other vaccines, they are very similar. 46 00:02:08,480 --> 00:02:11,519 Speaker 2: So I don't think it is scientifically correct to say 47 00:02:11,520 --> 00:02:13,960 Speaker 2: that one of the vaccines doesn't last as long as 48 00:02:14,000 --> 00:02:17,480 Speaker 2: the other ones of the two that are approved and ours. 49 00:02:17,760 --> 00:02:20,720 Speaker 2: Look at the data. This will be debated at the 50 00:02:20,760 --> 00:02:24,840 Speaker 2: CDC meeting at the end of June that votes on recommendations. 51 00:02:25,520 --> 00:02:27,960 Speaker 2: So this doesn't worry me. If you look at duration, 52 00:02:28,639 --> 00:02:32,680 Speaker 2: the duration of vaccination is induced by T cells. If 53 00:02:32,720 --> 00:02:35,919 Speaker 2: you look at the cancer product, the only reason it works 54 00:02:36,400 --> 00:02:39,240 Speaker 2: is T cells, not antibodies. Antibodies don't have a role 55 00:02:39,360 --> 00:02:42,960 Speaker 2: in cancer. It's about T cells going and attacking your cancer. 
56 00:02:43,440 --> 00:02:46,519 Speaker 2: If our vaccine technology didn't have a good T cell response, 57 00:02:46,919 --> 00:02:49,040 Speaker 2: the cancer product would not look as good as it does. 58 00:02:49,400 --> 00:02:51,520 Speaker 2: So I'm not worried at all about duration. 59 00:02:52,280 --> 00:02:54,960 Speaker 4: Pretty much every time we speak, Stéphane, I ask you, basically, 60 00:02:54,960 --> 00:02:56,640 Speaker 4: have we cured cancer yet? So I'm glad that you 61 00:02:56,680 --> 00:02:58,400 Speaker 4: went there, because that's been sort of one of the 62 00:02:58,400 --> 00:03:01,120 Speaker 4: big questions and the hope for a lot of the 63 00:03:01,240 --> 00:03:04,680 Speaker 4: mRNA vaccines. You have this melanoma vaccine in 64 00:03:04,720 --> 00:03:06,760 Speaker 4: the works. What more do you have to do to 65 00:03:06,800 --> 00:03:09,640 Speaker 4: get it sort of set up for the approval process, 66 00:03:09,680 --> 00:03:12,720 Speaker 4: to apply for that? And are you using artificial intelligence 67 00:03:12,720 --> 00:03:13,800 Speaker 4: to expedite it? 68 00:03:13,360 --> 00:03:15,320 Speaker 3: That's a great question. 69 00:03:15,639 --> 00:03:19,680 Speaker 2: So if you look at the cancer treatment in melanoma, we've 70 00:03:19,720 --> 00:03:23,080 Speaker 2: said that we need to achieve three things to be 71 00:03:23,160 --> 00:03:27,400 Speaker 2: able to talk to regulators about accelerated approval. So the 72 00:03:27,440 --> 00:03:29,720 Speaker 2: Phase 2 data, the data we shared on the show 73 00:03:30,080 --> 00:03:34,240 Speaker 2: several times, we see duration. If you remember, in December 74 00:03:34,240 --> 00:03:37,000 Speaker 2: we shared the three-year survival. It was better than 75 00:03:37,040 --> 00:03:39,720 Speaker 2: the two-year survival. 
So the difference between people on 76 00:03:39,840 --> 00:03:43,560 Speaker 2: our treatment and people that are just getting Keytruda is 77 00:03:43,560 --> 00:03:47,360 Speaker 2: getting wider. So there's very strong evidence that the 78 00:03:47,440 --> 00:03:48,360 Speaker 2: drug is working. 79 00:03:48,400 --> 00:03:49,320 Speaker 3: So that's number one. 80 00:03:49,760 --> 00:03:53,280 Speaker 2: Number two is we need the Phase 3 study to 81 00:03:53,400 --> 00:03:57,560 Speaker 2: be substantially enrolled, and so we are working very actively. 82 00:03:57,920 --> 00:04:01,600 Speaker 2: The Phase 3 study started two months earlier than planned last summer, 83 00:04:02,200 --> 00:04:05,120 Speaker 2: and so when we are substantially enrolled, we will meet 84 00:04:05,120 --> 00:04:08,280 Speaker 2: that criterion, and it could be late this year. And 85 00:04:08,320 --> 00:04:10,800 Speaker 2: the third one is the plant, because of course we 86 00:04:10,880 --> 00:04:14,560 Speaker 2: need to file in the registration dossier all the information 87 00:04:14,640 --> 00:04:19,000 Speaker 2: about the manufacturing process. The FDA, the day you file, is 88 00:04:19,080 --> 00:04:22,159 Speaker 2: allowed to go, of course, audit your plant. That plant 89 00:04:22,200 --> 00:04:23,720 Speaker 2: is being built. I had a chance to go there 90 00:04:23,760 --> 00:04:27,160 Speaker 2: two weeks ago. 
The team is working nonstop, scheduling literally 91 00:04:27,160 --> 00:04:29,479 Speaker 2: by the day, a bit like we did during COVID, 92 00:04:29,560 --> 00:04:34,120 Speaker 2: during the pandemic, and so I anticipate that potentially sometime 93 00:04:34,200 --> 00:04:37,520 Speaker 2: next year, you know, if the regulator was willing 94 00:04:37,560 --> 00:04:40,839 Speaker 2: to look at the accelerated approval file, we should have 95 00:04:40,880 --> 00:04:42,960 Speaker 2: this product available to help a lot of people, because 96 00:04:43,040 --> 00:04:46,440 Speaker 2: one in two people benefit, with no disease coming back 97 00:04:46,600 --> 00:04:49,560 Speaker 2: or deaths, compared to the best drug available to them 98 00:04:49,560 --> 00:04:50,480 Speaker 2: on the market today. 99 00:04:50,680 --> 00:04:52,479 Speaker 4: Stéphane, can you just give us a sense of, you 100 00:04:52,520 --> 00:04:56,120 Speaker 4: talk about artificial intelligence, everyone's talking about artificial intelligence, could 101 00:04:56,120 --> 00:04:58,720 Speaker 4: you just talk about how much that could expedite, generally, 102 00:04:59,160 --> 00:05:01,640 Speaker 4: some of the drug production that we're seeing? Just how 103 00:05:01,760 --> 00:05:05,039 Speaker 4: much that could really get us to achieve, you know, 104 00:05:05,080 --> 00:05:09,200 Speaker 4: that cure for cancer, that cure for ALS, cure for Alzheimer's? 105 00:05:09,400 --> 00:05:11,160 Speaker 4: You know, it's funny, you were talking about Sex and the City. 106 00:05:11,200 --> 00:05:12,720 Speaker 4: I sit around and I worry about these things, you know, 107 00:05:12,760 --> 00:05:14,880 Speaker 4: when are we going to cure these things? So I'm just wondering, 108 00:05:14,920 --> 00:05:17,200 Speaker 4: you know, is this going to be in our lifetime, 109 00:05:17,200 --> 00:05:19,600 Speaker 4: in the next couple of years, because of some of 110 00:05:19,600 --> 00:05:20,960 Speaker 4: the machine learning? 
111 00:05:22,320 --> 00:05:24,360 Speaker 2: Yes. So I think there's a few things to tease 112 00:05:24,400 --> 00:05:27,279 Speaker 2: apart in your great question. First is, I 113 00:05:27,320 --> 00:05:31,760 Speaker 2: think machine learning in academic labs, in research labs, in 114 00:05:31,760 --> 00:05:36,360 Speaker 2: industry is helping accelerate the understanding of the human body. 115 00:05:36,880 --> 00:05:39,440 Speaker 2: If you think about, you know, ALS, Alzheimer's, and 116 00:05:39,480 --> 00:05:42,320 Speaker 2: all those complicated diseases that we do not have solutions 117 00:05:42,320 --> 00:05:46,000 Speaker 2: for yet as a society, it's because we do not 118 00:05:46,240 --> 00:05:49,920 Speaker 2: understand the biology. We do not understand how the disease happens, 119 00:05:50,000 --> 00:05:52,599 Speaker 2: how the disease evolves, and so we are just trying 120 00:05:52,680 --> 00:05:56,960 Speaker 2: things, and some work, but very few work. Most of 121 00:05:56,960 --> 00:05:59,680 Speaker 2: them don't work because we're just trying and guessing. If 122 00:05:59,680 --> 00:06:02,240 Speaker 2: you look at the biology, once we understand how something works, 123 00:06:02,560 --> 00:06:06,240 Speaker 2: then the industry can come with very, very good actions 124 00:06:06,520 --> 00:06:08,520 Speaker 2: to deal with those. So I think AI will accelerate 125 00:06:08,600 --> 00:06:11,479 Speaker 2: the understanding of biology, which will be fundamental to bringing 126 00:06:11,520 --> 00:06:15,320 Speaker 2: new drugs. Then AI is already used to accelerate discovery 127 00:06:15,400 --> 00:06:17,320 Speaker 2: in terms of what tool do you use to go after a 128 00:06:17,360 --> 00:06:20,719 Speaker 2: disease once you understand it. 
At Moderna already, we have 129 00:06:20,839 --> 00:06:24,320 Speaker 2: different chemical matters that are generated by our AI systems 130 00:06:24,640 --> 00:06:27,400 Speaker 2: that are helping us to accelerate the work that humans 131 00:06:27,440 --> 00:06:29,839 Speaker 2: are doing. So it's an accelerator to the teams. And 132 00:06:29,880 --> 00:06:33,160 Speaker 2: then there's a huge chapter on productivity. If you think 133 00:06:33,200 --> 00:06:37,279 Speaker 2: about clinical development, Phase 1, 2, and 3, it's basically 134 00:06:37,320 --> 00:06:41,960 Speaker 2: doing experiments in humans, getting the data, finding the dose, 135 00:06:42,520 --> 00:06:45,560 Speaker 2: doing more experiments, and when you have all the studies done, 136 00:06:45,720 --> 00:06:48,120 Speaker 2: you get all the data and you submit to the regulator. 137 00:06:48,520 --> 00:06:49,960 Speaker 3: My point is it's all about data. 138 00:06:50,000 --> 00:06:53,520 Speaker 2: There are literally hundreds of business processes that need to happen, 139 00:06:53,880 --> 00:06:56,760 Speaker 2: and I think many of those, if not most of those, 140 00:06:56,839 --> 00:06:59,280 Speaker 2: you've got to be able to apply AI to, to shrink 141 00:06:59,520 --> 00:07:03,160 Speaker 2: time, to go faster. An example we shared in March, 142 00:07:03,240 --> 00:07:06,839 Speaker 2: in a vaccine: the team wrote a GPT to 143 00:07:06,880 --> 00:07:09,440 Speaker 2: help us to do dose selection. When you do a clinical 144 00:07:09,480 --> 00:07:12,720 Speaker 2: study, your Phase 1, you try several doses, and then, 145 00:07:12,720 --> 00:07:14,560 Speaker 2: based on the data you get in the clinic, you 146 00:07:14,640 --> 00:07:17,840 Speaker 2: decide which dose goes into your Phase 3. 
Well, it 147 00:07:18,000 --> 00:07:20,320 Speaker 2: used to take around a month to do that, by 148 00:07:20,360 --> 00:07:23,200 Speaker 2: having people in meetings and experts looking at the data. 149 00:07:23,360 --> 00:07:26,880 Speaker 2: Well, we developed a GPT that basically gets all the 150 00:07:26,960 --> 00:07:30,680 Speaker 2: data from the clinical study and suggests to us a 151 00:07:30,760 --> 00:07:33,720 Speaker 2: dose in literally a minute or two. That is already 152 00:07:33,880 --> 00:07:36,720 Speaker 2: a tool that has been developed that I've seen used 153 00:07:36,760 --> 00:07:37,440 Speaker 2: at the company. 154 00:07:37,680 --> 00:07:39,559 Speaker 3: That's just one example. So here you go, you shrink 155 00:07:39,560 --> 00:07:39,920 Speaker 3: a month off. 156 00:07:40,160 --> 00:07:42,120 Speaker 2: And if you do that on the hundreds of business 157 00:07:42,200 --> 00:07:44,880 Speaker 2: processes that have to happen in preparing the drug for 158 00:07:44,920 --> 00:07:48,280 Speaker 2: the clinic, the clinical testing, the analyzing of the data, 159 00:07:48,640 --> 00:07:51,360 Speaker 2: the communication with the FDA, I think you can save 160 00:07:51,440 --> 00:07:53,800 Speaker 2: a lot of time. I don't know yet, because only 161 00:07:53,920 --> 00:07:56,160 Speaker 2: history will show us in the next few years, can 162 00:07:56,200 --> 00:07:58,920 Speaker 2: you shave thirty percent, forty percent, fifty percent off how 163 00:07:58,920 --> 00:08:01,239 Speaker 2: many years it takes you to develop a drug? 164 00:08:01,280 --> 00:08:02,800 Speaker 3: I think it's going to be very significant. 165 00:08:03,800 --> 00:08:05,600 Speaker 1: Stéphane, we've got to leave you there. It's fantastic to 166 00:08:05,640 --> 00:08:07,640 Speaker 1: catch up with you, so amazing to listen to you 167 00:08:07,800 --> 00:08:10,640 Speaker 1: talk about the efforts taking place at Moderna. 
Moderna CEO 168 00:08:10,680 --> 00:08:11,520 Speaker 1: Stéphane Bancel.