Overall, the FDA has had a number of spectacular hits and pretty notable failures, and it is worth asking: how good of a job is the agency doing, and are they living up to their mandate to protect the public health?

My boss at the time said, Richard, we don't have to solve problems. All we have to do is appear to solve problems.

Welcome to Calling Bullshit, the podcast about purpose washing: the gap between what an organization says they stand for and what they actually do, and what they would need to change to practice what they preach. I'm your host, Ty Montague, and I've spent over a decade helping organizations define what they stand for, their purpose, and then helping them use that purpose to drive transformation throughout their business. Unfortunately, at a lot of institutions today, there's still a pretty wide gap between word and deed. That gap has a name: bullshit. But, and this is important, we believe that bullshit is a treatable condition. So when our bullshit detector lights up, we're going to explore everything the organization should do to fix it.
There was a time in the not too distant past when cocaine was marketed to children and literal snake oil was sold as medicine, a time when sawdust was put into food as filler, and when rats were ground up with the beef, and it was all perfectly legal. Food and drugs were sold in a completely unregulated market until President Theodore Roosevelt signed into law the Food and Drugs Act of 1906, giving the executive branch the power to regulate food and drugs and thus creating what we now call the FDA. Today, the Food and Drug Administration has an expansive jurisdiction, regulating things ranging from microwave ovens to catnip to ibuprofen and beyond. Way beyond. Chew on this: the FDA regulates twenty cents of every dollar that you spend. To cover such a broad waterfront, they employ eighteen thousand people who work across several centers, including the centers for drugs, biologics, devices, veterinary medicine, food safety and nutrition, and tobacco.
Their formal purpose is a few paragraphs long, but on the homepage of their website they have a phrase that nicely captures the spirit of it, and it reads: Today, as in the past, the FDA strives above all else to safeguard the health and well-being of the American people.

That is a monumentally important purpose, one in which the stakes for all of us are high. However, if you've been listening to the news lately, it seems like the FDA is falling pretty short of living up to it. Speaking about the epidemic of youth use of e-cigarettes: in retrospect, the FDA should have acted sooner. Nationwide, baby formula stock is gone. Part of the blame for a lot of these shortages rests at the feet of the FDA. The FDA first approved OxyContin. Since then, more than five hundred thousand people have died from opioid-related overdoses.

We've actually encountered the FDA twice before, first when discussing their role regulating Juul, and again while researching Purdue Pharma for a potential episode.
The FDA was certainly a character in the first Juul episode, but when we checked in on Juul again this season, it was clear that the FDA shared a lot more of the blame for the damage caused by Juul's unregulated product. And so we decided to aim our BS detectors directly at the FDA, starting with picking up the strange case of Big Tobacco, vaping, and Juul.

Okay, folks, I am very excited to introduce Lauren Etter for her second time on the show. Lauren, thank you for being here, and welcome back to Calling Bullshit. Thanks for having me, Ty. Great to be here. Lauren is the author of The Devil's Playbook: Big Tobacco, Juul, and the Addiction of a New Generation. But before we continue the conversation with Lauren, let's catch you up on the escalating story of the regulator and the company with a BS Roundtable Recap.

Okay. So, Juul Labs launched its Juul e-cigarette as a smoking cessation device. It was actually effective and had the potential to help smokers get off cigarettes.
But Juul was funded by venture capitalists with a growth-at-all-costs mindset, which led them to target kids, the demographic most susceptible to picking up a smoking habit, thus sparking a new youth nicotine epidemic. Total BS. But at the time, e-cigarettes were totally unregulated. Finally, the FDA began regulating e-cigarettes as tobacco products and requiring manufacturers to submit applications to be on the market. And guess what? Juul Labs didn't get around to submitting their application to the FDA until July, all while they were still selling Juul. More BS. You would think the story ends here, but it doesn't. It then took the FDA two more years to review Juul's application, all the while Juuls were still being sold. Finally, in June of twenty twenty-two, the FDA announced its decision to deny Juul Labs' application, essentially banning the e-cigarette. Their reasoning: Juul did not provide enough scientific data to show that their products were not harmful. Not surprisingly, Juul Labs immediately sued, and within a few days was granted an emergency stay, allowing them to keep their products on the market.
And now the FDA has walked back their decision and is currently re-reviewing the application, leaving Juul, you guessed it, still on the market. And that's your Roundtable Recap.

The FDA website says: Today, as in the past, the FDA strives above all else to safeguard the health and well-being of the American people. So first, I just wanted to ask you, how do you think they're doing at meeting their mission, or achieving their purpose? I would say, just in general, that with tobacco, for many years, until the early two thousands, the agency didn't have any jurisdiction over the tobacco industry. It was a completely unregulated market. And it was only after the Master Settlement Agreement, and after the tobacco companies had gotten crushed in that whole ordeal. For years, you know, Congress tried to figure out how to regulate tobacco. The Supreme Court even at one point said they didn't have jurisdiction.
So this has been an issue: where should the tobacco industry sit? Does it make sense for the FDA to regulate this industry? But for years this industry was totally unregulated. So I think that there are a lot of questions worth asking about the FDA, and certainly about their role in the regulation of tobacco.

You may remember that we score organizations on their level of BS. Would you be comfortable giving the FDA a score, even as it relates to the Juul situation, in terms of the gap between word and deed? So if you look at what the agency is doing in the overall tobacco space, and their efforts to implement a harm reduction framework, they're actually making strides. My issue is, I don't feel like the FDA has enough resources to police the illicit sale of the product, to keep it out of the hands of the youth. They're just a very underfunded agency, especially on tobacco products.
And so if you're looking at the twin problems of adult smoking and youth nicotine addiction, I feel like they are focusing quite a bit on the adult smoking issue. And then the youth nicotine addiction issue: I don't know if I have enough confidence that they're going to be able to ultimately keep this highly addictive product out of the hands of kids, which is ultimately the goal, to stop a new generation from becoming addicted. So can I give them a score? I just don't know if I feel comfortable settling on a number. Yeah, that's okay, but it's definitely not the best number, right? They're bumbling their way through this, and, let's just say, with the best of intentions, they have failed to protect America's youth so far from vaping, which is, at the end of the day, an extremely efficient delivery system for what ought to be a controlled substance, in my view. Right, but the FDA is constantly barraged with this flood of new products, and they're constantly on their heels responding to these, like, rapidly innovating markets.
That makes it very difficult for them to contain this, again, highly addictive product that's being sold at 7-Elevens and, you know, corner stores around the country, and not all of them have the greatest ID-checking abilities. So I admit it's a very difficult task for them to do. I think it should be one of the most important tasks that they undertake, and I know that they're taking it seriously. I'm just concerned that they might not have enough resources to adequately keep these highly addictive products out of the hands of young people.

Is the FDA's lack of resources the reason they fumbled through regulating e-cigarettes, and Juul in particular? Lauren thinks so, or at least she thinks it's a significant part of it. And although she deferred on giving them a BS score, she does believe the FDA is making strides. No doubt, it is an incredibly challenging undertaking. But if your purpose is to safeguard the health of the American people, and a new generation gets addicted to nicotine on your watch, it's definitely a harbinger of bullshit.
But maybe the Center for Tobacco Products is an outlier. Considering the FDA covers so much ground, I'm going to reserve judgment and see what's cooking at their second biggest center, the Center for Food Safety and Applied Nutrition. With food specifically, we're trying to do a little more, because we don't want to just make food safe. We want to give consumers information that helps them to choose a healthier diet. Richard Williams worked at the FDA for nearly three decades, and he left the Center for Food Safety and Applied Nutrition with a bad taste in his mouth. I think the big thing was, as I approached the end of my twenty-seven years, it occurred to me that in all those years, most of what we had done had not succeeded. And so the more I thought about it, the more I thought, well, what actually happened? What went wrong? And I thought maybe if I could write a book about it, we can start changing things and actually solving things for consumers.
Richard's book, Fixing Food: An FDA Insider Unravels the Myths and the Solutions, is an eye-opening account of his tenure at the organization, beginning when he was hired as their very first economic analyst. You can blame it on President Carter. President Carter was the first one who said we need some economic thinking in these regulations, particularly social regulations. More and more presidents began insisting that we do these things and that we take them into account so we can make better decisions. Let's talk about your job and some of the eye-opening parts of your book, because it sounds like you walked into that job with the best of intentions. You wanted to help the FDA achieve its mission of making people safer. One of the first assignments that you had was to do a cost-benefit analysis on lead acetate in men's hair dye. I wondered if you would tell us that story. Sure. That was the first one just assigned to me. All I had to go by was the executive order and what I knew from being an economist.
So I jumped into it and found out that this was basically the thing that you put in your hair, and over time it takes the gray away. Richard wanted to know if using lead acetate could cause skin cancer, and after a deep dive into the toxicology, he found that the risk was near zero; thus there was no benefit to banning the drug. There was, however, a high cost to taking it off the market. It was the only product of its kind available, so without it, gray-haired men would have no alternative, I mean, other than living with the fact that their hair was gray. I turned in my analysis. We didn't think a thing about it until a couple of weeks later, when a young woman came down from the center director's office and she said, this is great. Now we need you to do the other one. And I said, what other one? She said, well, you said that the benefits were lower than the costs, and if we decide not to do anything about lead acetate, that will work.
But we want you to do one that shows the exact opposite, that the benefits exceed the costs, that it does cause cancer. And I'm like, well, no, that wouldn't be honest. I'm not going to do that. And she said, you don't understand. This is an order, you know, from the sixth floor. And I said, no, I'm not going to do that. I didn't hear anything more about it. A few weeks later, I was in my first training class, where I was getting my introduction to the FDA, and the deputy center director was in there. At lunch, I went up and I said, you know, a funny thing happened to me, and I explained the situation. And he said, that order came from me, and you're going to do it or you're going to be fired. And I said something really stupid. I just made this up on the spot. I said, I'm not an economic prostitute. And he said, then you are fired, and you are going to be leaving this agency. And so I went back to my office. I'm like, should I start packing up my stuff? I didn't know. And then nothing ever happened.
He apparently decided it would be a bad idea to fire me over that kind of thing. But that was just the first time I was threatened with being fired. I mean, good on you that you did not cave to that pressure. But it sounds like this was not an isolated occurrence. Were there other instances where you were asked to essentially compromise your integrity? For twenty-seven years there were other instances. It never stopped. That is incomprehensible to me, and it's terrible. And so, for the first time in my life, I decided to become political with a small p, like a bureaucrat, politician, operator. Which you have. Yeah, sadly. Well, so let's follow that thread a little bit, because, and let's assume the best of people, I assume that the people above you in the organization who were applying the pressure were feeling it from other entities. You know, in your book, you go into the FDA spending a lot of energy responding to influence from just an incredible variety of stakeholders. There's, you know, the executive branch, there's Congress, there's the courts, there's the press, private industry, activists, academics.
Can you help us understand the FDA's relationship with some of those stakeholders? Sure. I mean, you know, first of all, you have to start by understanding one thing. The FDA has been around since 1906, and they have continued to accumulate power, such that one author said that in the twentieth century, there was no regulatory agency in the world that was more powerful than the FDA, and that's because they're very good at what they do. So if you think about it, they work for the president, and the presidents are supposed to oversee them, but every president, Republican and Democrat, has said it's nearly impossible to control the administrative state, that is, all the agencies that they oversee. There are hundreds of thousands of employees. They all have their own little agendas, and it's very, very difficult for the presidents to do anything. So that's a problem. Congress, they just don't oversee the FDA. They're afraid of the FDA, and so there's not a lot of oversight there. Interesting. Why is Congress afraid of the FDA? Well, the FDA has very complex issues, very hard for Congress.
People are really busy, you know, with lots and lots of things, mostly getting reelected and raising money. They don't have a lot of time, and their staffs aren't very good at overseeing these complex issues. I get it, but it means that the FDA is kind of more free to do what they want. So they hold Congress at bay. The courts have, up until recently, generally given them deference in how they interpret their statutes. Then they have other people: academics, they generally give grants to, and that kind of buys them off. And then when they go into Congress to ask, which they do every year, for more money, they actually have large food firms going and testifying and saying, yeah, the FDA should get more money. There's a reason for that. Large firms usually get regulations that are easy for them to comply with but hard for small firms, so it puts their small competitors at a disadvantage. I want to go into that issue a little bit later, the issue of big companies versus small companies, because that feels like an important topic.
But before we go there, let's talk about one other constituency, consumers, and where they rank in that hierarchy. I think consumers and small businesses are both at the bottom. When I first came into the FDA, you know, I truly believed in what we were trying to do. I thought these were important issues. And over time I found out, well, initially when I got there, I was told this, and this actually happened with the lead acetate rule. I said, this rule won't do anything. Why are we doing it? And my boss at the time said, Richard, we don't have to solve problems. All we have to do is appear to solve problems. We have to do something about it. And that never left me, and I noticed more and more that that was the case. Once we passed the regulation, no matter what the regulation did, it didn't matter. It looked like we had addressed the problem. And that leaves consumers and public health in last place, if you will. What a mess.
Okay. So let's talk about some of the, in your view, biggest and most dangerous problems that face consumers today that the FDA is responsible for. And we can live within the domain of food, but feel free also to talk about stuff outside of food. Yeah. Well, I do think it's obesity. I think that is probably the number one, and it's going to get worse. You know, there was a Harvard study that projected that by the end of this decade, which is only eight years away, half of this country will be obese. The FDA plays a role in that with food labeling. Food labeling has not worked. It's not going to work. It's really not going to be the answer.
Putting calories on foods hasn't 325 00:20:33,440 --> 00:20:36,240 Speaker 1: really changed anyone's behavior as far as we can tell. 326 00:20:37,960 --> 00:20:42,119 Speaker 1: Sticking with obesity, and I don't think you made 327 00:20:42,160 --> 00:20:45,080 Speaker 1: this connection overtly in the book, but one of the 328 00:20:45,119 --> 00:20:48,720 Speaker 1: impressions that I've gotten, both from, you know, what we've 329 00:20:48,800 --> 00:20:52,400 Speaker 1: learned from your book and also other research that we've done, 330 00:20:52,520 --> 00:20:57,480 Speaker 1: is that, intended or not, the combination of the industrialized 331 00:20:57,520 --> 00:21:02,680 Speaker 1: food system that shapes the diet people eat 332 00:21:02,760 --> 00:21:06,639 Speaker 1: with a for-profit healthcare system that then profits from 333 00:21:06,680 --> 00:21:10,639 Speaker 1: the multiple diseases that result from things like obesity is 334 00:21:10,800 --> 00:21:16,560 Speaker 1: kind of the perfect dystopian partnership. What do you think 335 00:21:16,600 --> 00:21:20,440 Speaker 1: about that first? And do you think the FDA 336 00:21:20,560 --> 00:21:24,640 Speaker 1: has played a role in that? I don't personally think 337 00:21:24,720 --> 00:21:28,320 Speaker 1: that is the exact issue. I think it is an issue. 338 00:21:28,400 --> 00:21:31,880 Speaker 1: I think certainly food companies, like every other company in 339 00:21:32,000 --> 00:21:34,480 Speaker 1: this country, are there to make a profit.
The problem 340 00:21:34,560 --> 00:21:37,880 Speaker 1: is we have no idea what a healthy diet looks 341 00:21:37,960 --> 00:21:41,880 Speaker 1: like, because nutrition science is the worst science that we have, 342 00:21:42,600 --> 00:21:45,520 Speaker 1: and so we've got people out there hawking books, you know, 343 00:21:45,680 --> 00:21:48,240 Speaker 1: on diets. Some say, well, you gotta have a high 344 00:21:48,240 --> 00:21:49,760 Speaker 1: carb diet, others say you have to have a low 345 00:21:49,800 --> 00:21:53,159 Speaker 1: carb diet, high fat diet, low fat diet, on and 346 00:21:53,200 --> 00:21:55,280 Speaker 1: on and on. And the truth of the matter is, 347 00:21:55,359 --> 00:21:58,560 Speaker 1: one, we don't know, and two, what we're beginning to 348 00:21:58,600 --> 00:22:01,200 Speaker 1: find out is that one diet probably isn't right for everyone. 349 00:22:01,800 --> 00:22:06,040 Speaker 1: Different people with different genetic backgrounds, different underlying health conditions 350 00:22:06,080 --> 00:22:08,960 Speaker 1: probably need different diets. So I think, in my mind, 351 00:22:09,359 --> 00:22:13,159 Speaker 1: a bigger problem, and this is why basically companies can 352 00:22:13,200 --> 00:22:15,359 Speaker 1: sell whatever they want, is because we just don't know 353 00:22:15,520 --> 00:22:19,280 Speaker 1: enough yet, because the science is so bad. Well, 354 00:22:19,280 --> 00:22:21,720 Speaker 1: that's a good point. I think, to pivot to 355 00:22:22,280 --> 00:22:25,679 Speaker 1: another aspect that you touch on in your book, you 356 00:22:25,720 --> 00:22:29,480 Speaker 1: believe that entrepreneurs are solving many problems that the FDA 357 00:22:29,640 --> 00:22:33,680 Speaker 1: can't or won't. Can you talk more about that? FDA 358 00:22:33,760 --> 00:22:37,400 Speaker 1: has recently announced they're going to start looking at technologies.
359 00:22:37,880 --> 00:22:39,479 Speaker 1: So the first thing that they're gonna do, and this 360 00:22:39,560 --> 00:22:42,400 Speaker 1: is gonna be tremendously helpful if we can get it done, 361 00:22:42,840 --> 00:22:46,120 Speaker 1: is using blockchain, the same thing that you use for cryptocurrency, 362 00:22:46,800 --> 00:22:51,679 Speaker 1: in order to start recalling things faster. For example, this 363 00:22:51,760 --> 00:22:55,800 Speaker 1: technology could, in theory, reduce the time it takes to 364 00:22:55,880 --> 00:22:59,280 Speaker 1: trace the origin of a contaminated food outbreak from two 365 00:22:59,280 --> 00:23:05,080 Speaker 1: weeks to two seconds. So that does several things. One, 366 00:23:05,280 --> 00:23:08,200 Speaker 1: if you can trace things back very quickly, you can 367 00:23:08,240 --> 00:23:10,399 Speaker 1: find out what the root cause is, in other words, what 368 00:23:10,720 --> 00:23:14,960 Speaker 1: actually went wrong that caused this problem. More importantly, it 369 00:23:14,960 --> 00:23:18,480 Speaker 1: gets bad products off the market more quickly. And it 370 00:23:18,600 --> 00:23:24,159 Speaker 1: also doesn't basically indict everybody, right? Yes. Which sort of 371 00:23:24,240 --> 00:23:27,480 Speaker 1: leads to something that you've hinted at already in this interview, 372 00:23:27,520 --> 00:23:30,720 Speaker 1: which is that the playing field is not necessarily level 373 00:23:31,119 --> 00:23:35,440 Speaker 1: between small companies and big companies. Can you talk more 374 00:23:35,480 --> 00:23:39,159 Speaker 1: about that? So there's a law, the Regulatory Flexibility 375 00:23:39,200 --> 00:23:41,919 Speaker 1: Act, that we have to take into account.
You know, what 376 00:23:42,000 --> 00:23:44,239 Speaker 1: the impacts of these things are on small producers, and, if 377 00:23:44,280 --> 00:23:47,600 Speaker 1: we can, give them some kind of relief. And 378 00:23:47,720 --> 00:23:50,280 Speaker 1: given the fact that the benefits weren't that great, I said, well, 379 00:23:50,320 --> 00:23:52,440 Speaker 1: at least let's exempt some of these small firms and 380 00:23:52,600 --> 00:23:55,760 Speaker 1: make the requirements easier on them. We didn't do that. 381 00:23:56,359 --> 00:23:58,280 Speaker 1: Of course, all the large firms didn't want to give 382 00:23:58,400 --> 00:24:01,119 Speaker 1: small firms a break. But as they say, it's an 383 00:24:01,160 --> 00:24:05,080 Speaker 1: unlevel playing field. They have much more influence over FDA, 384 00:24:05,200 --> 00:24:07,600 Speaker 1: and this is true in any regulatory agency, than the 385 00:24:07,640 --> 00:24:10,640 Speaker 1: small firms do. So the small firms get driven out 386 00:24:10,640 --> 00:24:13,320 Speaker 1: of business, and I think it's a shame that we 387 00:24:13,359 --> 00:24:15,720 Speaker 1: have two different laws that are supposed to protect small 388 00:24:15,760 --> 00:24:18,680 Speaker 1: firms and we're still not doing enough for them. One 389 00:24:18,680 --> 00:24:21,919 Speaker 1: of the other topics that you touch on, and 390 00:24:21,960 --> 00:24:25,040 Speaker 1: that I've encountered in other environments, is the topic 391 00:24:25,080 --> 00:24:27,960 Speaker 1: of conflict of interest. One of the things that I 392 00:24:28,040 --> 00:24:32,359 Speaker 1: just have a hard time getting over is government employees 393 00:24:32,440 --> 00:24:35,320 Speaker 1: leaving the FDA and going to work for the corporations 394 00:24:35,359 --> 00:24:38,760 Speaker 1: that they're trying to regulate.
And you give a very 395 00:24:38,800 --> 00:24:43,000 Speaker 1: poignant example of a person who wrote a regulation just 396 00:24:43,280 --> 00:24:46,280 Speaker 1: so that they could jump out and get a job 397 00:24:46,359 --> 00:24:50,600 Speaker 1: consulting on that same regulation, to help corporations figure out 398 00:24:50,640 --> 00:24:54,320 Speaker 1: how to wend their way through it. Is that common? 399 00:24:55,200 --> 00:24:57,000 Speaker 1: It's hard for me to know. I don't have data 400 00:24:57,080 --> 00:24:59,640 Speaker 1: on how common it is. I certainly saw it often enough. 401 00:25:00,240 --> 00:25:02,480 Speaker 1: The law allows it. What they do say is that 402 00:25:02,680 --> 00:25:04,720 Speaker 1: it has to be a number of years before you 403 00:25:04,760 --> 00:25:08,200 Speaker 1: can come back and lobby the FDA for that industry. 404 00:25:08,280 --> 00:25:09,760 Speaker 1: But you can go out and work for them. You 405 00:25:09,760 --> 00:25:12,280 Speaker 1: can tell them how to comply, and you can make money. 406 00:25:13,080 --> 00:25:14,720 Speaker 1: I could be wrong about this, but I think a 407 00:25:14,760 --> 00:25:16,359 Speaker 1: lot of people feel, you know what, I work in 408 00:25:16,400 --> 00:25:21,040 Speaker 1: a quote unquote low wage government job. I should be 409 00:25:21,080 --> 00:25:23,440 Speaker 1: allowed to go out and make money like everybody else. 410 00:25:23,720 --> 00:25:25,920 Speaker 1: I sort of served my country in this job, and 411 00:25:26,200 --> 00:25:29,440 Speaker 1: I have some sympathy for that. But again, like you say, 412 00:25:29,840 --> 00:25:31,840 Speaker 1: there seems to be something wrong if you can be 413 00:25:31,920 --> 00:25:34,480 Speaker 1: a part of writing a rule and then, you know, 414 00:25:34,600 --> 00:25:36,080 Speaker 1: write it in such a way that you can go 415 00:25:36,080 --> 00:25:39,040 Speaker 1: out and profit from it. Yeah.
I mean, this is 416 00:25:39,040 --> 00:25:42,960 Speaker 1: a really hard question, so I don't expect an answer, honestly, 417 00:25:43,080 --> 00:25:45,960 Speaker 1: but how do you solve that problem? You know, it's 418 00:25:46,000 --> 00:25:49,240 Speaker 1: just a great question. I wish I did know. But 419 00:25:49,400 --> 00:25:52,200 Speaker 1: I'm pretty sure it's not the worst problem at the f d A. 420 00:25:52,680 --> 00:25:55,320 Speaker 1: What's the worst problem? The fact that they're not solving 421 00:25:55,359 --> 00:25:59,200 Speaker 1: any problems, the fact that decades and decades go by, 422 00:25:59,440 --> 00:26:02,560 Speaker 1: their budget keeps going up, they get more people, 423 00:26:03,040 --> 00:26:06,280 Speaker 1: people believe in them, they believe that they're keeping us safe, 424 00:26:06,800 --> 00:26:10,199 Speaker 1: and it just keeps going like that. And the 425 00:26:10,240 --> 00:26:12,960 Speaker 1: fact that you can go to Congress every single year 426 00:26:13,400 --> 00:26:15,359 Speaker 1: and say we've got to do something, one out of 427 00:26:15,400 --> 00:26:18,320 Speaker 1: six people in this country is getting sick from food poisoning 428 00:26:18,720 --> 00:26:21,000 Speaker 1: every year. For thirty or forty years you're saying the 429 00:26:21,040 --> 00:26:23,280 Speaker 1: same thing. Congress goes, oh my god, that's terrible, we've 430 00:26:23,320 --> 00:26:26,800 Speaker 1: got to give you more money. Well, sooner or 431 00:26:26,920 --> 00:26:31,240 Speaker 1: later you gotta start saying, okay, you gotta do something different. So, 432 00:26:31,400 --> 00:26:34,720 Speaker 1: just to be fair, are there any big wins that 433 00:26:34,800 --> 00:26:36,679 Speaker 1: you would point to, times that the f d A 434 00:26:36,800 --> 00:26:41,040 Speaker 1: has gotten it really right? Oh, absolutely. And first of all, 435 00:26:41,119 --> 00:26:43,480 Speaker 1: let me say, a lot of it is the system.
436 00:26:43,480 --> 00:26:45,200 Speaker 1: It's not the people. There are a lot of great 437 00:26:45,240 --> 00:26:48,719 Speaker 1: people that work for FDA. They're very smart, they're very dedicated, 438 00:26:48,760 --> 00:26:53,120 Speaker 1: they believe in it, but unfortunately they've run 439 00:26:53,119 --> 00:26:55,520 Speaker 1: out of ideas. But one thing I think where we 440 00:26:55,560 --> 00:26:59,240 Speaker 1: got it right was trans fatty acids. This was a case 441 00:26:59,359 --> 00:27:03,040 Speaker 1: where initially we were just going to ask firms 442 00:27:03,040 --> 00:27:05,800 Speaker 1: if they wanted to voluntarily label it. Trans fatty acids 443 00:27:05,840 --> 00:27:09,160 Speaker 1: are worse for you than saturated fats, so that 444 00:27:09,240 --> 00:27:11,359 Speaker 1: was the bottom line. We really needed to do something, 445 00:27:11,960 --> 00:27:13,680 Speaker 1: so we kind of went round and round. There's a 446 00:27:13,680 --> 00:27:16,720 Speaker 1: long story about it, but we ended up with mandatory 447 00:27:16,800 --> 00:27:19,760 Speaker 1: labeling, and companies sort of got the message and they 448 00:27:19,760 --> 00:27:23,320 Speaker 1: began pulling trans fatty acids. Because they're added to most foods, 449 00:27:23,520 --> 00:27:25,520 Speaker 1: if they're added, that means you can easily take 450 00:27:25,520 --> 00:27:27,960 Speaker 1: them out, as opposed to saturated fat, which is just 451 00:27:28,040 --> 00:27:30,960 Speaker 1: a part of the food. So they started taking them 452 00:27:30,960 --> 00:27:35,119 Speaker 1: out, and then eventually adding them became illegal. I 453 00:27:35,160 --> 00:27:37,120 Speaker 1: think that's probably one of the best things that we did. 454 00:27:37,640 --> 00:27:40,400 Speaker 1: Is there anything that I haven't asked you about that 455 00:27:40,480 --> 00:27:43,600 Speaker 1: you think people should know about the f d A?
456 00:27:44,320 --> 00:27:47,080 Speaker 1: I do think right now, with the Commissioner saying our 457 00:27:47,119 --> 00:27:49,960 Speaker 1: food safety system is broken, with what has been happening 458 00:27:50,000 --> 00:27:52,560 Speaker 1: with infant formula, which f d A incidentally played a 459 00:27:52,680 --> 00:27:55,760 Speaker 1: huge role in why that happened. For forty years they've 460 00:27:55,760 --> 00:27:57,760 Speaker 1: been telling firms who want to come and start making 461 00:27:57,840 --> 00:28:01,360 Speaker 1: infant formula, they've said no. They've kept it at six firms. 462 00:28:02,440 --> 00:28:05,640 Speaker 1: No kidding. So they created essentially the monopoly that then, 463 00:28:06,160 --> 00:28:09,720 Speaker 1: right, hurt the whole country. Exactly, exactly. So when 464 00:28:09,760 --> 00:28:12,359 Speaker 1: you had one manufacturer drop out, of course there's a 465 00:28:12,440 --> 00:28:15,320 Speaker 1: huge problem. So with these problems coming up, I 466 00:28:15,359 --> 00:28:17,560 Speaker 1: think the curtain has fallen down and we see what's 467 00:28:17,560 --> 00:28:20,439 Speaker 1: going on behind the curtain. Yeah, there couldn't be a 468 00:28:20,480 --> 00:28:22,920 Speaker 1: better time to start thinking about doing things in a new 469 00:28:22,920 --> 00:28:25,879 Speaker 1: way and actually trying to solve problems rather than just appearing 470 00:28:25,960 --> 00:28:28,520 Speaker 1: to do something. So I'm hoping now is 471 00:28:28,560 --> 00:28:31,280 Speaker 1: the time. That's a nice note to wrap up on. 472 00:28:31,400 --> 00:28:34,520 Speaker 1: So let's just say that you're the FDA commissioner for 473 00:28:34,560 --> 00:28:38,000 Speaker 1: a day, or a month, or a year, for whatever 474 00:28:38,080 --> 00:28:42,280 Speaker 1: period of time is necessary to make real change.
What 475 00:28:42,320 --> 00:28:46,760 Speaker 1: would you do to help the f d A achieve 476 00:28:47,080 --> 00:28:50,000 Speaker 1: its stated mission, which, as it says on the website: 477 00:28:50,000 --> 00:28:52,480 Speaker 1: as in the past, the FDA strives above 478 00:28:52,600 --> 00:28:55,920 Speaker 1: all else to safeguard the health and well-being of 479 00:28:55,920 --> 00:28:59,000 Speaker 1: the American people. What's the one thing you would change 480 00:28:59,000 --> 00:29:02,800 Speaker 1: about the FDA to help them achieve that goal? I would 481 00:29:02,960 --> 00:29:05,440 Speaker 1: retask them. I would say, look, we're not 482 00:29:05,480 --> 00:29:08,120 Speaker 1: going to pass all these regulations. Let's start looking at 483 00:29:08,120 --> 00:29:11,680 Speaker 1: these new technologies. If we think they're not completely safe, 484 00:29:11,680 --> 00:29:14,160 Speaker 1: that they need some sort of adjustment, let's focus on those, 485 00:29:14,480 --> 00:29:17,880 Speaker 1: let's promote them. I'll just give you quickly a list 486 00:29:17,920 --> 00:29:22,760 Speaker 1: of things. Precision fermentation, genetic engineering, three-D printers are 487 00:29:22,760 --> 00:29:25,760 Speaker 1: coming along, consumer nutrition devices. They're going to have to 488 00:29:25,800 --> 00:29:29,560 Speaker 1: go through the medical device, basically, preapproval thing. We've got 489 00:29:29,560 --> 00:29:32,520 Speaker 1: to get those through faster. Those are gonna help consumers 490 00:29:32,600 --> 00:29:36,880 Speaker 1: eat better. Nanopackaging, where we have smart packaging that tells consumers when 491 00:29:36,880 --> 00:29:40,120 Speaker 1: their food is becoming spoiled. All of these things, I 492 00:29:40,120 --> 00:29:43,040 Speaker 1: would say, let's start looking there. That's the future. Those 493 00:29:43,080 --> 00:29:46,120 Speaker 1: are the solutions.
Let's start looking at real solutions and 494 00:29:46,160 --> 00:29:50,080 Speaker 1: stop trying to pass these regulations. They were fine a 495 00:29:50,120 --> 00:29:52,520 Speaker 1: hundred years ago, they're not fine now, so I would 496 00:29:52,600 --> 00:29:58,080 Speaker 1: retask them. I love that. Okay, Richard. On this show, 497 00:29:58,400 --> 00:30:00,800 Speaker 1: we have a tool that we call the BS 498 00:30:00,920 --> 00:30:05,080 Speaker 1: scale that we use to measure the gap between word 499 00:30:05,120 --> 00:30:07,800 Speaker 1: and deed. And our scale goes from zero to one 500 00:30:08,520 --> 00:30:12,880 Speaker 1: hundred, zero being the best, zero BS, and a hundred being 501 00:30:12,880 --> 00:30:16,760 Speaker 1: the worst, total BS. So what score would you give 502 00:30:17,160 --> 00:30:20,040 Speaker 1: the f D A? Well, mostly I'm qualified to talk 503 00:30:20,080 --> 00:30:22,240 Speaker 1: about the foods part of the FDA, so I'll just focus 504 00:30:22,280 --> 00:30:26,280 Speaker 1: on that on that scale, because you know, they say 505 00:30:26,320 --> 00:30:29,440 Speaker 1: that they're protecting consumers. They say all these things, and 506 00:30:29,480 --> 00:30:31,400 Speaker 1: they're not doing it. I would give them about a 507 00:30:31,440 --> 00:30:35,840 Speaker 1: seventy five. So it's seventy five percent of what they do in foods 508 00:30:35,920 --> 00:30:41,400 Speaker 1: is bullshit. Okay, room for improvement. All right, Richard, thank 509 00:30:41,440 --> 00:30:43,120 Speaker 1: you so much for being here today. This was a 510 00:30:43,120 --> 00:30:45,080 Speaker 1: great conversation, and I also want to thank you for 511 00:30:45,120 --> 00:30:47,080 Speaker 1: writing this book and doing the work that you're doing 512 00:30:47,320 --> 00:30:49,960 Speaker 1: post your time at the FDA. It's incredibly important and 513 00:30:50,040 --> 00:30:51,880 Speaker 1: we thank you for it.
Well, thank you again for 514 00:30:51,920 --> 00:30:56,560 Speaker 1: having me on. It's been great. Well, that's interesting, if 515 00:30:56,600 --> 00:31:01,720 Speaker 1: not totally disconcerting. Earlier in the episode, Lauren Adder posited 516 00:31:01,760 --> 00:31:04,280 Speaker 1: that the FDA had fallen short due to a lack 517 00:31:04,320 --> 00:31:09,719 Speaker 1: of resources, but Richard, a former FDA employee, says that 518 00:31:09,760 --> 00:31:14,560 Speaker 1: the organization floundered in spite of a consistently increasing budget. 519 00:31:15,320 --> 00:31:18,080 Speaker 1: If there was a harbinger of bullshit before, there now 520 00:31:18,120 --> 00:31:22,240 Speaker 1: seems to be a flashing neon arrow. But to be fair, 521 00:31:22,680 --> 00:31:27,880 Speaker 1: we've yet to explore the biggest inter agency. Buckle up, folks, 522 00:31:27,920 --> 00:31:31,480 Speaker 1: because next we're looking into the f d a's Center 523 00:31:31,560 --> 00:31:50,520 Speaker 1: for Drug Evaluation and Research. Right after this. Okay, folks, 524 00:31:51,080 --> 00:31:54,840 Speaker 1: it is my great pleasure to introduce Dr Gail Van 525 00:31:54,920 --> 00:31:58,480 Speaker 1: Norman to the show. Gail, welcome to Calling Bullshit. Well, 526 00:31:58,520 --> 00:32:01,640 Speaker 1: thank you, Ti, it's a pleasure to be here. Gail 527 00:32:01,760 --> 00:32:04,600 Speaker 1: is a clinician and professor at the University of Washington. 528 00:32:05,000 --> 00:32:08,880 Speaker 1: She writes and teaches about the medical research process, everything 529 00:32:08,960 --> 00:32:13,160 Speaker 1: from the FDA to commercialization to how animal testing works.
530 00:32:15,080 --> 00:32:18,520 Speaker 1: So the reason that we're doing this episode on calling 531 00:32:18,560 --> 00:32:21,120 Speaker 1: BS is that the f d A has come up 532 00:32:21,520 --> 00:32:24,280 Speaker 1: a couple of times on this show, and our research 533 00:32:24,400 --> 00:32:26,560 Speaker 1: left me with the impression that the f d A 534 00:32:26,840 --> 00:32:31,440 Speaker 1: sometimes gets things very right and sometimes it gets things 535 00:32:31,720 --> 00:32:34,560 Speaker 1: very wrong. Would you agree with that? I would. I 536 00:32:34,600 --> 00:32:39,040 Speaker 1: think the FDA operates under constraints that are imperfect, 537 00:32:39,160 --> 00:32:43,000 Speaker 1: and even the best of organizations tasked with such a 538 00:32:43,040 --> 00:32:47,520 Speaker 1: complex mission are going to have misses and hits. Overall, 539 00:32:47,520 --> 00:32:50,760 Speaker 1: the FDA has had a number of spectacular hits and 540 00:32:51,040 --> 00:32:54,640 Speaker 1: pretty notable failures. Yeah, so let's get into some 541 00:32:54,680 --> 00:32:57,280 Speaker 1: of those, because I'd love to have you start out providing, 542 00:32:57,520 --> 00:32:59,920 Speaker 1: you know, an example or two of times when the 543 00:33:00,040 --> 00:33:02,840 Speaker 1: FDA has truly lived up to their mission and really 544 00:33:02,880 --> 00:33:05,960 Speaker 1: gotten it right. Okay, well, let's start with thalidomide, 545 00:33:05,960 --> 00:33:09,640 Speaker 1: because it's sort of a classic historical example of 546 00:33:09,640 --> 00:33:12,320 Speaker 1: what can go right and wrong with medical research, as 547 00:33:12,320 --> 00:33:16,040 Speaker 1: well as what happens at the FDA.
Thalidomide was an 548 00:33:16,080 --> 00:33:20,080 Speaker 1: anti-nausea drug introduced in Germany in the nineteen fifties, 549 00:33:20,400 --> 00:33:23,520 Speaker 1: and it was considered one of the safest consumer drugs 550 00:33:23,560 --> 00:33:27,120 Speaker 1: ever to hit the market. Because it was so safe, 551 00:33:27,760 --> 00:33:32,360 Speaker 1: practitioners really picked up on its use in pregnancy, because nausea 552 00:33:32,440 --> 00:33:34,840 Speaker 1: during pregnancy is not only a misery to women, it 553 00:33:34,880 --> 00:33:37,120 Speaker 1: can be dangerous to the health of the mother and 554 00:33:37,160 --> 00:33:40,000 Speaker 1: the fetus in utero, and so having something that controls 555 00:33:40,080 --> 00:33:43,360 Speaker 1: nausea and vomiting is very important. The drug had been 556 00:33:43,440 --> 00:33:47,080 Speaker 1: tested in animals prior to human tests, 557 00:33:47,080 --> 00:33:50,720 Speaker 1: and I believe the number is something like fifty or 558 00:33:50,720 --> 00:33:53,720 Speaker 1: a hundred different species of animals, including rats and mice 559 00:33:53,800 --> 00:33:57,720 Speaker 1: and dogs and cats and armadillos and ferrets and rabbits. 560 00:33:58,120 --> 00:33:59,920 Speaker 1: It was considered such a safe drug that it was 561 00:34:00,040 --> 00:34:02,880 Speaker 1: not required that you have a prescription to use it. 562 00:34:02,920 --> 00:34:06,560 Speaker 1: The drug manufacturer gave it away free to its 563 00:34:06,560 --> 00:34:11,440 Speaker 1: pregnant factory workers to use during pregnancy. Wanting to 564 00:34:11,480 --> 00:34:14,880 Speaker 1: expand the market into the US, the manufacturer submitted the 565 00:34:14,960 --> 00:34:17,880 Speaker 1: drug for review with the f d A.
The committee 566 00:34:17,920 --> 00:34:22,440 Speaker 1: reviewing thalidomide was headed by Frances Oldham Kelsey, who just 567 00:34:22,520 --> 00:34:25,719 Speaker 1: so happened to be the first woman to hold the position. 568 00:34:27,640 --> 00:34:29,799 Speaker 1: She later joked that they gave her what they thought 569 00:34:29,840 --> 00:34:32,000 Speaker 1: would be the easiest one they could possibly give, and 570 00:34:32,040 --> 00:34:33,920 Speaker 1: whether they did that because she was a woman or 571 00:34:33,960 --> 00:34:37,200 Speaker 1: because she was new, we'll just leave to speculation. Anyway, she 572 00:34:37,239 --> 00:34:40,080 Speaker 1: read the data and something about it didn't ring true 573 00:34:40,120 --> 00:34:42,160 Speaker 1: with her. She didn't like it, and she said, I'm 574 00:34:42,200 --> 00:34:44,440 Speaker 1: not going to approve this. I'm gonna stop you, and 575 00:34:44,480 --> 00:34:48,040 Speaker 1: I want to see a few more studies. And just 576 00:34:48,160 --> 00:34:51,480 Speaker 1: a few months later, on Christmas Day, the first 577 00:34:51,480 --> 00:34:55,560 Speaker 1: baby was born in Germany without ears, a little baby girl. 578 00:34:55,960 --> 00:34:59,279 Speaker 1: And that was followed by over ten thousand cases of 579 00:34:59,400 --> 00:35:03,440 Speaker 1: severely deformed infants that were born, and probably twenty cases 580 00:35:03,440 --> 00:35:07,960 Speaker 1: of in utero deaths. And the US saw exactly seventeen 581 00:35:08,080 --> 00:35:14,480 Speaker 1: cases of thalidomide deformities, presumably in the children of mothers 582 00:35:14,520 --> 00:35:16,920 Speaker 1: who brought the drug in from out of the country. We 583 00:35:16,920 --> 00:35:19,640 Speaker 1: were saved that plague because we never approved the drug 584 00:35:19,680 --> 00:35:23,440 Speaker 1: in the United States for that use.
I mean, actually, 585 00:35:23,480 --> 00:35:25,920 Speaker 1: I had a galvanic response to that story, like it 586 00:35:25,960 --> 00:35:27,680 Speaker 1: makes the hair on the back of my neck stand 587 00:35:27,760 --> 00:35:29,719 Speaker 1: up to think how close we came to 588 00:35:29,760 --> 00:35:35,280 Speaker 1: a total unmitigated disaster. You know, yeah, it really is true. 589 00:35:35,840 --> 00:35:39,200 Speaker 1: And let's talk about COVID also, because that's fresh 590 00:35:39,239 --> 00:35:42,560 Speaker 1: in everyone's mind, and that's clearly a case where, you know, 591 00:35:42,760 --> 00:35:46,880 Speaker 1: I mean, seemingly the normal process takes forever, and 592 00:35:46,920 --> 00:35:50,960 Speaker 1: it just feels like COVID vaccines just magically appeared. 593 00:35:51,120 --> 00:35:54,560 Speaker 1: How did that happen? It was very incredible, right. Yeah. 594 00:35:55,080 --> 00:35:57,279 Speaker 1: I'm thinking to myself, two and a half years ago 595 00:35:57,440 --> 00:36:01,920 Speaker 1: when COVID hit, in my distant medical background, 596 00:36:01,960 --> 00:36:05,360 Speaker 1: I have a background in immunology, and I remember people 597 00:36:05,400 --> 00:36:07,319 Speaker 1: coming and asking me, well, how long will it take 598 00:36:07,320 --> 00:36:09,480 Speaker 1: for us to have a vaccine? And the average time 599 00:36:09,520 --> 00:36:12,560 Speaker 1: to get a vaccine created for a new disease is 600 00:36:12,600 --> 00:36:17,480 Speaker 1: fifteen years. So right away 601 00:36:17,480 --> 00:36:21,160 Speaker 1: what was happening was the government was saying we'll 602 00:36:21,160 --> 00:36:23,319 Speaker 1: have a vaccine for you in a year, and I 603 00:36:23,400 --> 00:36:25,719 Speaker 1: was going, not on your life, you won't. It's just 604 00:36:25,800 --> 00:36:30,120 Speaker 1: not going to happen. No chance.
But of course there 605 00:36:30,320 --> 00:36:36,800 Speaker 1: was a chance due to several important factors all occurring simultaneously. First, 606 00:36:37,080 --> 00:36:40,959 Speaker 1: the US government created a public-private partnership offering ten 607 00:36:41,120 --> 00:36:45,359 Speaker 1: billion dollars to pharmaceutical companies to start immediately making and 608 00:36:45,400 --> 00:36:52,640 Speaker 1: testing vaccines. Second, the mRNA vaccine had become available. This 609 00:36:52,719 --> 00:36:55,520 Speaker 1: new technology created a way for the body to show 610 00:36:55,520 --> 00:36:59,920 Speaker 1: a facsimile of the virus to itself, making testing on humans 611 00:37:00,280 --> 00:37:04,880 Speaker 1: much easier. And finally, the twenty-first Century Cures Act allowed for 612 00:37:05,000 --> 00:37:11,000 Speaker 1: emergency authorization and the quick release of the vaccines. All 613 00:37:11,040 --> 00:37:15,560 Speaker 1: those things came together. It's just amazing. Yeah, it is amazing. 614 00:37:16,600 --> 00:37:20,720 Speaker 1: So thalidomide and the COVID vaccines, wins for the f d A truly, 615 00:37:21,040 --> 00:37:23,319 Speaker 1: and they do so many more things right as well. 616 00:37:23,360 --> 00:37:26,240 Speaker 1: Those are just a couple of examples. But let's 617 00:37:26,280 --> 00:37:29,719 Speaker 1: pivot to where they get it wrong. For me, 618 00:37:30,200 --> 00:37:33,080 Speaker 1: because of the research we did, Purdue Pharma comes 619 00:37:33,120 --> 00:37:38,720 Speaker 1: to mind, and the OxyContin crisis. So the thing 620 00:37:38,719 --> 00:37:41,839 Speaker 1: that shocked me about the Purdue situation, and I did 621 00:37:41,840 --> 00:37:46,040 Speaker 1: not understand this prior to doing this research.
622 00:37:46,200 --> 00:37:48,920 Speaker 1: But there are people at the FDA who have to 623 00:37:49,000 --> 00:37:53,000 Speaker 1: write the language that goes on warning labels on drugs, 624 00:37:53,520 --> 00:37:57,480 Speaker 1: and there are different kinds of warning labels. And the 625 00:37:57,560 --> 00:38:04,360 Speaker 1: big thing, in the case of OxyContin, was addictiveness. Any 626 00:38:04,440 --> 00:38:08,359 Speaker 1: opioid heretofore had been deemed to be addictive and had 627 00:38:08,400 --> 00:38:11,120 Speaker 1: to carry a label that said it was addictive and 628 00:38:11,280 --> 00:38:14,759 Speaker 1: would be prescribed in that same way, i.e., very sparingly. 629 00:38:15,760 --> 00:38:18,280 Speaker 1: And Purdue figured out a way to get the warning 630 00:38:18,360 --> 00:38:22,439 Speaker 1: label written in such a way as to avoid any 631 00:38:22,520 --> 00:38:27,480 Speaker 1: language of addiction, so that more doctors would be inclined 632 00:38:27,719 --> 00:38:32,440 Speaker 1: to write prescriptions for it. And my understanding of this 633 00:38:32,520 --> 00:38:35,320 Speaker 1: story is that the person who wrote the language actually 634 00:38:36,040 --> 00:38:39,040 Speaker 1: holed up in a hotel room with the executives from 635 00:38:39,040 --> 00:38:44,120 Speaker 1: Purdue and they all crafted that language together. And so 636 00:38:44,280 --> 00:38:49,120 Speaker 1: this gets to a larger issue obviously, which is sort 637 00:38:49,120 --> 00:38:54,080 Speaker 1: of the revolving door, or, I would say, the fraught 638 00:38:54,120 --> 00:39:00,400 Speaker 1: relationship between giant for-profit industries and low-paid government workers.
639 00:39:01,280 --> 00:39:05,640 Speaker 1: And further, to sort of add insult to injury, once the 640 00:39:05,680 --> 00:39:09,560 Speaker 1: language had been crafted, once OxyContin was approved and being 641 00:39:09,640 --> 00:39:13,399 Speaker 1: sold across the world, that person left the f d 642 00:39:13,440 --> 00:39:17,040 Speaker 1: A and took a job at Purdue Pharma, you know, 643 00:39:17,160 --> 00:39:21,359 Speaker 1: kind of closing the circle as it were. And you know, 644 00:39:21,719 --> 00:39:26,640 Speaker 1: that story really disturbed me. It outraged me, honestly, and 645 00:39:26,800 --> 00:39:29,000 Speaker 1: I would love to hear your take on that and 646 00:39:29,239 --> 00:39:33,120 Speaker 1: how widespread a problem like that really is. I think 647 00:39:33,160 --> 00:39:36,200 Speaker 1: what you're getting at is a real problem, in that 648 00:39:36,360 --> 00:39:40,600 Speaker 1: there is a very porous interface between the kinds of 649 00:39:40,680 --> 00:39:44,480 Speaker 1: researchers who work within the f d A and 650 00:39:44,560 --> 00:39:48,719 Speaker 1: those that work in commercial industry, and so people do 651 00:39:49,080 --> 00:39:53,160 Speaker 1: switch teams, and that's perfectly legal right now, is it not? 652 00:39:53,520 --> 00:39:57,040 Speaker 1: Oh, sure. I mean the current Commissioner of the FDA, 653 00:39:57,239 --> 00:40:00,640 Speaker 1: Robert Califf, was an executive for, I can't remember 654 00:40:00,680 --> 00:40:04,279 Speaker 1: the name of the research company that's a subsidiary of 655 00:40:04,320 --> 00:40:06,920 Speaker 1: Alphabet. Anyway, he made two point seven million dollars a 656 00:40:07,040 --> 00:40:10,759 Speaker 1: year in salary, and he used that. And I'm not 657 00:40:10,760 --> 00:40:12,759 Speaker 1: criticizing him, I don't want to make it sound like 658 00:40:12,760 --> 00:40:15,919 Speaker 1: it was nefarious.
That was actually a selling point for 659 00:40:16,040 --> 00:40:18,719 Speaker 1: his appointment as Commissioner of the FDA, because he 660 00:40:18,760 --> 00:40:21,960 Speaker 1: knew the ins and outs of commercial research companies. So 661 00:40:22,000 --> 00:40:24,799 Speaker 1: you can make an argument that it's helpful. And by 662 00:40:24,800 --> 00:40:27,920 Speaker 1: the way, his salary there right now is a few hundred thousand dollars, 663 00:40:27,920 --> 00:40:30,879 Speaker 1: which is considerably less. And he put in a lot 664 00:40:30,880 --> 00:40:35,000 Speaker 1: of agreements to say he would not participate in owning, 665 00:40:35,040 --> 00:40:38,359 Speaker 1: selling, talking to, you know, he self-restricted to say, 666 00:40:38,560 --> 00:40:40,279 Speaker 1: I'm going to make sure you know who I 667 00:40:40,320 --> 00:40:43,759 Speaker 1: work for, you know who my boss is. But other 668 00:40:43,800 --> 00:40:46,200 Speaker 1: people have not been quite so clear about it. And 669 00:40:46,280 --> 00:40:49,680 Speaker 1: so you have somebody who works on an FDA committee 670 00:40:49,840 --> 00:40:52,920 Speaker 1: and they have a drug presented to them, and the 671 00:40:52,960 --> 00:40:55,799 Speaker 1: people who are presenting the drug whisper in their ear, wow, 672 00:40:55,840 --> 00:40:58,879 Speaker 1: we really think your help has been really valuable, and 673 00:40:59,040 --> 00:41:01,239 Speaker 1: we'd like you to think about coming on as a 674 00:41:01,360 --> 00:41:04,520 Speaker 1: researcher or a head of marketing in our company. And 675 00:41:04,560 --> 00:41:07,040 Speaker 1: when they offer you a two-and-a-half-million-dollar raise, 676 00:41:07,480 --> 00:41:10,080 Speaker 1: it can be a little hard to turn down.
677 00:41:11,960 --> 00:41:15,560 Speaker 1: And so there are connections between the FDA and 678 00:41:15,719 --> 00:41:19,880 Speaker 1: the commercial world which, if properly aligned, are helpful, because 679 00:41:19,880 --> 00:41:22,040 Speaker 1: it can give the FDA insights as to what the 680 00:41:22,040 --> 00:41:25,239 Speaker 1: company is doing, but if they're not properly aligned, can 681 00:41:25,320 --> 00:41:27,799 Speaker 1: lead to conflicts of interest, to the detriment of the 682 00:41:27,840 --> 00:41:30,239 Speaker 1: safety and health of the American public. Right. So the 683 00:41:30,280 --> 00:41:32,920 Speaker 1: FDA is a government organization with a mission to protect 684 00:41:33,000 --> 00:41:36,440 Speaker 1: us all from the effects, intended or not, of private, 685 00:41:36,600 --> 00:41:41,040 Speaker 1: for-profit interests. So let's talk a little bit 686 00:41:41,080 --> 00:41:48,919 Speaker 1: about money. How does the FDA receive its funding? Well, 687 00:41:48,960 --> 00:41:54,000 Speaker 1: the FDA used to be, back in the nineteen twenties, 688 00:41:54,080 --> 00:41:56,520 Speaker 1: all government funded. It just came out of 689 00:41:56,520 --> 00:41:59,960 Speaker 1: the General Treasury Fund for the FDA. Now, about forty- 690 00:42:00,080 --> 00:42:03,600 Speaker 1: five percent of the FDA's funds come from something called 691 00:42:03,719 --> 00:42:06,680 Speaker 1: user fees and application fees that are paid by the 692 00:42:06,760 --> 00:42:10,680 Speaker 1: drug companies to get their drugs reviewed. When that was 693 00:42:10,760 --> 00:42:13,600 Speaker 1: first proposed, there was a lot of concern that this 694 00:42:13,640 --> 00:42:17,120 Speaker 1: would create a conflict of interest for the FDA, because 695 00:42:17,200 --> 00:42:18,759 Speaker 1: a lot of their funding would come from the very 696 00:42:18,800 --> 00:42:23,200 Speaker 1: people they're trying to regulate.
But it became so popular 697 00:42:23,320 --> 00:42:26,560 Speaker 1: because those user fees were enacted when the FDA was 698 00:42:26,640 --> 00:42:32,120 Speaker 1: woefully short-staffed. Congress enacted this specifically as a way 699 00:42:32,160 --> 00:42:34,200 Speaker 1: for the FDA to hire the people it needed to 700 00:42:34,239 --> 00:42:37,560 Speaker 1: do the work, and it led to a big reduction 701 00:42:37,880 --> 00:42:41,239 Speaker 1: in throughput times, and it accomplished exactly what it was 702 00:42:41,280 --> 00:42:43,480 Speaker 1: supposed to do. So it's still popular to this day. 703 00:42:43,880 --> 00:42:45,440 Speaker 1: And how much do you think the FDA 704 00:42:45,840 --> 00:42:52,160 Speaker 1: is affected by lobbying, either by big drug companies lobbying 705 00:42:52,480 --> 00:42:56,600 Speaker 1: on the Hill to change laws and regulations, or patient 706 00:42:56,640 --> 00:43:01,359 Speaker 1: advocacy groups lobbying to influence the way the organization 707 00:43:02,120 --> 00:43:07,719 Speaker 1: is funded or regulated itself? Well, both affect the 708 00:43:07,840 --> 00:43:10,400 Speaker 1: FDA. I mean, it's not immune to them. The 709 00:43:10,480 --> 00:43:15,640 Speaker 1: FDA is funded, as I mentioned, from congressional funding, 710 00:43:16,320 --> 00:43:21,880 Speaker 1: and Congress is affected very heavily by patient advocacy. You 711 00:43:21,960 --> 00:43:25,760 Speaker 1: get patient advocacy groups getting Congress to write both good 712 00:43:25,840 --> 00:43:29,040 Speaker 1: and cockamamie laws all the time with regard to health, 713 00:43:29,600 --> 00:43:32,080 Speaker 1: and so the FDA is not immune from 714 00:43:32,080 --> 00:43:36,239 Speaker 1: those effects.
The other thing I would say is that 715 00:43:36,280 --> 00:43:39,800 Speaker 1: it's hard to say that there's a direct effect, 716 00:43:39,920 --> 00:43:43,439 Speaker 1: like, you know, GlaxoSmithKline pays their two-million- 717 00:43:43,480 --> 00:43:46,040 Speaker 1: dollar drug fee and that makes the FDA approve the drug. 718 00:43:46,760 --> 00:43:51,759 Speaker 1: That doesn't happen. But the other kind of effect is 719 00:43:51,800 --> 00:43:55,480 Speaker 1: hard to regulate against: the individual who sits in an 720 00:43:55,520 --> 00:44:01,600 Speaker 1: influential part of a committee, who then is favorably impressed 721 00:44:01,600 --> 00:44:04,560 Speaker 1: by a drug company about their drug in various ways 722 00:44:05,200 --> 00:44:09,600 Speaker 1: and advocates for it. Those do happen, and individuals that 723 00:44:09,640 --> 00:44:13,960 Speaker 1: sit on these committees can make big differences in what 724 00:44:14,160 --> 00:44:18,480 Speaker 1: the FDA decides to do. One of the things that 725 00:44:18,760 --> 00:44:21,320 Speaker 1: is just true is that our health care system is 726 00:44:21,400 --> 00:44:26,560 Speaker 1: run largely as a for-profit enterprise. And is that 727 00:44:26,719 --> 00:44:32,480 Speaker 1: part of the problem here? Oh, sure. Or is that 728 00:44:32,520 --> 00:44:35,319 Speaker 1: the whole problem here? That might even be the 729 00:44:35,320 --> 00:44:38,040 Speaker 1: whole problem, for all we know. I mean, I suppose 730 00:44:38,120 --> 00:44:42,200 Speaker 1: there are good aspects to commercialization, because it does, in an 731 00:44:42,239 --> 00:44:48,840 Speaker 1: ideal world, promote competition and innovation, but it also promotes 732 00:44:48,920 --> 00:44:53,080 Speaker 1: manipulation and profit-taking, and neither of those serve patients 733 00:44:53,120 --> 00:44:56,080 Speaker 1: at all.
They serve the companies that make the drugs. 734 00:44:56,400 --> 00:44:58,719 Speaker 1: Well, that's right. I mean, we're 735 00:44:58,719 --> 00:45:02,839 Speaker 1: not just talking about the company making a profit. We're 736 00:45:02,880 --> 00:45:07,880 Speaker 1: talking about unimaginable amounts of money. We're talking about a 737 00:45:07,960 --> 00:45:13,480 Speaker 1: drug generating a hundred billion dollars in profits. A hundred 738 00:45:13,719 --> 00:45:18,319 Speaker 1: billion dollars. So this is money that will buy anybody's soul, right? 739 00:45:19,040 --> 00:45:22,560 Speaker 1: Maybe even mine? I don't think so, but you never know. Well, 740 00:45:22,760 --> 00:45:27,000 Speaker 1: you know, like, it's an interesting question to ask yourself, right, 741 00:45:27,120 --> 00:45:30,120 Speaker 1: like, what's my number? But in the current system, 742 00:45:30,480 --> 00:45:33,919 Speaker 1: the consequences for breaking the law often take the form 743 00:45:34,000 --> 00:45:39,319 Speaker 1: of fines. But with profits of that size, you know, 744 00:45:39,320 --> 00:45:41,200 Speaker 1: would it be fair to say the fines are just 745 00:45:41,239 --> 00:45:45,040 Speaker 1: a cost of doing business? Oh, absolutely. It's amazing. 746 00:45:45,080 --> 00:45:47,040 Speaker 1: I mean, what you see in the media, what the 747 00:45:47,040 --> 00:45:50,480 Speaker 1: American public sees, is the Sacklers paid, what was it, 748 00:45:52,719 --> 00:45:58,120 Speaker 1: billions of dollars? Well, weighed against what did they make? Even 749 00:45:58,160 --> 00:46:02,200 Speaker 1: if it's a hundred billion, over a hundred billion, the fine is 750 00:46:02,200 --> 00:46:05,719 Speaker 1: a small fraction of their whole profit margin. And remember, it's not 751 00:46:05,800 --> 00:46:08,200 Speaker 1: just the profit, it's not just the actual dollars that 752 00:46:08,280 --> 00:46:11,279 Speaker 1: go in the bank. It's how much their stock is worth.
Right. 753 00:46:11,719 --> 00:46:14,520 Speaker 1: So if you're making a hundred billion dollars in profits, 754 00:46:14,520 --> 00:46:18,680 Speaker 1: your stock becomes worth a huge amount more. And these 755 00:46:18,719 --> 00:46:22,880 Speaker 1: companies are run by individuals who have heavy stock interests, 756 00:46:23,400 --> 00:46:26,000 Speaker 1: who may have individual conflicts of interest with doing the 757 00:46:26,120 --> 00:46:30,000 Speaker 1: right thing. And more and more, we're talking about dollar 758 00:46:30,080 --> 00:46:34,799 Speaker 1: amounts that seem insurmountable in their ability to bribe and 759 00:46:34,880 --> 00:46:38,080 Speaker 1: attract people into behaviors that we would hope we wouldn't 760 00:46:38,080 --> 00:46:41,560 Speaker 1: see in this industry. Right. And, you know, just 761 00:46:41,600 --> 00:46:45,319 Speaker 1: to contrast that, to make this really clear, how much 762 00:46:45,440 --> 00:46:49,919 Speaker 1: money would a typical FDA employee take home, 763 00:46:50,160 --> 00:46:53,160 Speaker 1: if there is such a thing? And how much can 764 00:46:53,200 --> 00:46:59,359 Speaker 1: we contrast that with an average pharma executive, even just roughly? Well, 765 00:46:59,360 --> 00:47:04,480 Speaker 1: the pharma executives take home millions of dollars. 766 00:47:04,520 --> 00:47:07,720 Speaker 1: The average FDA salary is a hundred and ten thousand dollars 767 00:47:07,719 --> 00:47:11,839 Speaker 1: a year, and their top executive makes a few hundred thousand dollars a year. 768 00:47:11,880 --> 00:47:16,960 Speaker 1: So it's considerably less than anything the commercial environment could offer. Right. Yeah, 769 00:47:17,000 --> 00:47:20,520 Speaker 1: and it just seems like there's a massive incentive for 770 00:47:20,560 --> 00:47:22,960 Speaker 1: these big companies to try to figure out how to 771 00:47:23,000 --> 00:47:27,080 Speaker 1: game the system in one way or another.
Well, I 772 00:47:27,120 --> 00:47:30,640 Speaker 1: think it's not just that there's this incentive. There's virtually no 773 00:47:30,880 --> 00:47:33,840 Speaker 1: disincentive to do it, because if you're that rich, you 774 00:47:33,880 --> 00:47:38,440 Speaker 1: can just pay the fine and move on. It doesn't matter. 775 00:47:38,560 --> 00:47:41,600 Speaker 1: You actually factor that into the cost of developing the drug, 776 00:47:41,760 --> 00:47:44,359 Speaker 1: the cost of putting the drug out there. Yeah, well, 777 00:47:44,360 --> 00:47:47,600 Speaker 1: that's terrifying. So, Gail, another way to look at our 778 00:47:47,640 --> 00:47:52,880 Speaker 1: show is that fundamentally it's about trust, and it's just incredibly 779 00:47:52,920 --> 00:47:56,160 Speaker 1: important that we can trust institutions like the FDA; 780 00:47:56,239 --> 00:48:00,000 Speaker 1: literally, lives are on the line. And honestly, I can understand 781 00:48:00,040 --> 00:48:03,200 Speaker 1: someone who has lost trust in politicians today, or 782 00:48:03,239 --> 00:48:05,840 Speaker 1: in big pharma, or even in the FDA. 783 00:48:06,760 --> 00:48:10,839 Speaker 1: You know, this is the same institution that approved Oxy. 784 00:48:10,920 --> 00:48:14,200 Speaker 1: So what are some things that you think the FDA 785 00:48:14,320 --> 00:48:18,960 Speaker 1: should do to try to rebuild trust with people? I think, 786 00:48:19,160 --> 00:48:22,520 Speaker 1: first of all, we need to start with Congress and say, 787 00:48:22,640 --> 00:48:26,120 Speaker 1: who do we have in Congress sitting on the committees 788 00:48:26,200 --> 00:48:29,880 Speaker 1: that give the FDA its marching orders? Frankly, I can't. 789 00:48:30,840 --> 00:48:32,800 Speaker 1: I can't tell you who those people are.
I should 790 00:49:32,800 --> 00:49:35,200 Speaker 1: be able to, but I can't right now. And do 791 00:49:35,239 --> 00:49:39,080 Speaker 1: I trust them, particularly when I well know that many 792 00:49:39,160 --> 00:49:42,839 Speaker 1: of our congressional representatives are, for lack of a better 793 00:49:42,880 --> 00:49:47,520 Speaker 1: word, ignorant of science and how it works, and don't 794 00:49:47,560 --> 00:49:51,400 Speaker 1: care to learn it, and then pander to sort of 795 00:49:51,440 --> 00:49:54,759 Speaker 1: the conspiracy theorists who want to think that we're all 796 00:49:54,760 --> 00:49:57,560 Speaker 1: out to get them? So I think we need to 797 00:49:57,600 --> 00:50:00,160 Speaker 1: look at that and ask, should there be special 798 00:50:00,239 --> 00:50:04,640 Speaker 1: qualifications for people who determine how this agency works? I 799 00:50:04,680 --> 00:50:07,880 Speaker 1: think overall the agency does a remarkable job given the 800 00:50:07,880 --> 00:50:11,160 Speaker 1: mission that it has and the number of opportunities for 801 00:50:11,239 --> 00:50:16,640 Speaker 1: failure that it has, how relatively few it really has experienced. 802 00:50:17,400 --> 00:50:19,239 Speaker 1: I think we need to look at how we can 803 00:50:19,280 --> 00:50:22,040 Speaker 1: reduce the conflicts of interest that we've talked about within the 804 00:50:22,080 --> 00:50:25,280 Speaker 1: agency, so that we don't have people who are pretending 805 00:50:25,320 --> 00:50:27,880 Speaker 1: to serve the FDA but are really serving a commercial 806 00:50:27,920 --> 00:50:31,360 Speaker 1: interest, or serving themselves so that they can position themselves 807 00:50:31,400 --> 00:50:35,160 Speaker 1: for a well-paying job with the commercial companies. 808 00:50:35,200 --> 00:50:37,360 Speaker 1: I think that would be helpful.
And I think we 809 00:49:37,400 --> 00:49:41,680 Speaker 1: need to have real penalties for pharmaceutical companies that openly 810 00:49:42,520 --> 00:49:45,439 Speaker 1: commit criminal acts. This is in the criminal code now: 811 00:49:46,440 --> 00:49:50,880 Speaker 1: Purdue did things that were criminal, and perhaps money is 812 00:49:50,920 --> 00:49:53,480 Speaker 1: not the price those people should be paying; 813 00:49:54,200 --> 00:49:58,040 Speaker 1: there should be real prison time assigned when those 814 00:49:58,040 --> 00:50:02,040 Speaker 1: sorts of things happen, because a CEO who knows 815 00:50:02,200 --> 00:50:04,480 Speaker 1: that their signature on a piece of paper might put 816 00:50:04,520 --> 00:50:07,240 Speaker 1: them in jail one day may think twice before signing 817 00:50:07,239 --> 00:50:12,240 Speaker 1: it and may have better oversight. Yeah. Okay, all right, Gail, 818 00:50:12,560 --> 00:50:16,080 Speaker 1: this is really my last question on this show. We 819 00:50:16,200 --> 00:50:20,480 Speaker 1: have a tool called the BS scale, and the 820 00:50:20,520 --> 00:50:24,480 Speaker 1: BS scale goes from zero to one, zero being 821 00:50:24,520 --> 00:50:29,120 Speaker 1: the best score, meaning zero BS, one being the worst, 822 00:50:29,400 --> 00:50:34,040 Speaker 1: total BS. So on that scale, what score would you 823 00:50:34,160 --> 00:50:38,280 Speaker 1: give the FDA in achieving its stated mission? 824 00:50:39,360 --> 00:50:41,759 Speaker 1: I would give it a really good score. I think 825 00:50:41,800 --> 00:50:45,600 Speaker 1: that to do the complicated job it does, to 826 00:50:45,719 --> 00:50:48,439 Speaker 1: do it with the high degree of success it's 827 00:50:48,520 --> 00:50:52,680 Speaker 1: had in protecting the American public for well over a hundred 828 00:50:52,760 --> 00:50:55,920 Speaker 1: years now.
I think that they deserve a 829 00:50:57,080 --> 00:51:01,279 Speaker 1: very low BS score, and that they've done really, really well. Gail, I want 830 00:51:01,280 --> 00:51:04,040 Speaker 1: to thank you for being with us today. Thanks for 831 00:51:04,400 --> 00:51:07,560 Speaker 1: calling bullshit. Well, thank you for having me. It's been fun. 832 00:51:07,600 --> 00:51:11,359 Speaker 1: It really has been. Folks, it is time to make 833 00:51:11,360 --> 00:51:14,799 Speaker 1: the call. The FDA is a complicated institution 834 00:51:14,960 --> 00:51:19,319 Speaker 1: with huge responsibilities. We couldn't possibly cover the entirety of 835 00:51:19,360 --> 00:51:22,239 Speaker 1: what they do in a single episode, but with the 836 00:51:22,239 --> 00:51:24,560 Speaker 1: help of our three experts, we were still able to get 837 00:51:24,680 --> 00:51:29,360 Speaker 1: enough insight to answer the fundamental question: does the 838 00:51:29,480 --> 00:51:33,680 Speaker 1: FDA strive above all else to safeguard the health 839 00:51:33,920 --> 00:51:39,480 Speaker 1: of the American people? Although they've had some notable successes, 840 00:51:40,080 --> 00:51:45,560 Speaker 1: we're calling bullshit on the FDA. But as always, 841 00:51:45,760 --> 00:51:48,319 Speaker 1: we're not here to just curse the darkness. When we 842 00:51:48,440 --> 00:51:51,520 Speaker 1: come back, we'll speak with an expert in 843 00:51:51,680 --> 00:51:54,399 Speaker 1: FDA conflicts of interest to see if we can light 844 00:51:54,440 --> 00:51:57,760 Speaker 1: a few candles in the halls of this bureaucratic behemoth. 845 00:52:14,840 --> 00:52:18,280 Speaker 1: My name is Genevieve Kanter. I am an assistant professor 846 00:52:18,520 --> 00:52:22,000 Speaker 1: at the Perelman School of Medicine at the University of Pennsylvania.
847 00:52:22,160 --> 00:52:27,040 Speaker 1: I'm trained as an economist, and I study regulation of 848 00:52:27,120 --> 00:52:31,440 Speaker 1: biomedical technologies, the FDA, and conflicts of interest. 849 00:52:32,040 --> 00:52:34,600 Speaker 1: Thank you for being here, and welcome to Calling Bullshit. 850 00:52:35,520 --> 00:52:38,760 Speaker 1: Thanks, happy to be here. What we're here to do 851 00:52:39,120 --> 00:52:42,759 Speaker 1: is talk about ideas for helping the FDA 852 00:52:42,920 --> 00:52:48,360 Speaker 1: better live their purpose. And before we get into those ideas, 853 00:52:48,680 --> 00:52:50,640 Speaker 1: I don't do this in every show, but I 854 00:52:50,680 --> 00:52:54,640 Speaker 1: wanted to actually say something to our listeners, because I've 855 00:52:54,680 --> 00:52:58,160 Speaker 1: been feeling some of this stuff myself as I prepped 856 00:52:58,160 --> 00:53:02,080 Speaker 1: for this episode. It's really easy, and I would say 857 00:53:02,080 --> 00:53:06,000 Speaker 1: even understandable, when faced with a problem as complex as 858 00:53:06,040 --> 00:53:08,799 Speaker 1: the FDA, to basically just shrug and give up. 859 00:53:08,840 --> 00:53:12,200 Speaker 1: It's easy to just declare that the problem is 860 00:53:12,239 --> 00:53:15,719 Speaker 1: impossible and to kind of move on. And I want 861 00:53:15,719 --> 00:53:18,720 Speaker 1: to ask our listeners to suspend their disbelief for this section, 862 00:53:18,840 --> 00:53:22,000 Speaker 1: because we really do want to explore actions that the 863 00:53:22,080 --> 00:53:25,320 Speaker 1: FDA, or the executive branch to whom the FDA reports, 864 00:53:25,360 --> 00:53:29,560 Speaker 1: could actually enact to help the FDA build 865 00:53:29,840 --> 00:53:33,720 Speaker 1: better trust with people and really deliver on the promise 866 00:53:33,760 --> 00:53:36,880 Speaker 1: of keeping all of us safer.
It's such an important purpose, 867 00:53:37,640 --> 00:53:40,359 Speaker 1: and I really believe we need to take an optimistic 868 00:53:40,400 --> 00:53:45,880 Speaker 1: point of view here, because giving up on it is unthinkable. Sorry, 869 00:53:45,920 --> 00:53:48,759 Speaker 1: I had to get that off my chest initially. So 870 00:53:48,880 --> 00:53:51,960 Speaker 1: let's get into some ideas. Genevieve, I'm gonna ask you 871 00:53:52,040 --> 00:53:55,839 Speaker 1: to go first. In two minutes, can you tell us 872 00:53:55,920 --> 00:53:59,640 Speaker 1: the one thing that you would do to change the 873 00:53:59,719 --> 00:54:04,600 Speaker 1: FDA? So, if I were emperor of the United States, 874 00:54:04,840 --> 00:54:08,000 Speaker 1: the one thing I would do is to replace our 875 00:54:08,040 --> 00:54:11,239 Speaker 1: current approval process with a system where the firms are 876 00:54:11,320 --> 00:54:14,560 Speaker 1: given conditional approval of their products and they have to 877 00:54:14,600 --> 00:54:18,880 Speaker 1: seek renewal of approval every, say, ten years. So in 878 00:54:18,920 --> 00:54:22,200 Speaker 1: the current system, a firm applies for approval of a 879 00:54:22,280 --> 00:54:25,759 Speaker 1: product and receives that approval until basically the end of 880 00:54:25,800 --> 00:54:28,560 Speaker 1: time, or the sun burns itself out, or something. So 881 00:54:29,040 --> 00:54:32,600 Speaker 1: with this ten-year sort of renewable approval, you could 882 00:54:32,600 --> 00:54:36,000 Speaker 1: incentivize the monitoring of how well drugs are working, because 883 00:54:36,000 --> 00:54:38,840 Speaker 1: that will be required to get your renewal of the approval.
884 00:54:39,200 --> 00:54:42,440 Speaker 1: You incentivize the monitoring of how safe drugs are, because 885 00:54:42,480 --> 00:54:46,480 Speaker 1: again, that's part of the renewal process. That will allow 886 00:54:46,600 --> 00:54:49,040 Speaker 1: us, the government, to pull drugs off the market that 887 00:54:49,160 --> 00:54:51,239 Speaker 1: turn out to be not effective or that turn out 888 00:54:51,239 --> 00:54:53,719 Speaker 1: to be unsafe, because sometimes you don't see a lot 889 00:54:53,719 --> 00:54:56,799 Speaker 1: of safety events in the small clinical trial populations and 890 00:54:56,840 --> 00:55:00,560 Speaker 1: you only see them in the broader population. I love 891 00:55:00,640 --> 00:55:05,160 Speaker 1: that idea. That is an incredibly smart idea. Not surprisingly, 892 00:55:05,560 --> 00:55:09,440 Speaker 1: you are the expert, so well done. I think that's 893 00:55:09,440 --> 00:55:12,399 Speaker 1: a fantastic idea. We will return to that, I'm 894 00:55:12,400 --> 00:55:15,520 Speaker 1: sure, throughout the conversation. So here's my idea. The 895 00:55:15,680 --> 00:55:19,719 Speaker 1: FDA is much more important to all of us 896 00:55:19,880 --> 00:55:23,240 Speaker 1: than is currently reflected in the salaries of the people 897 00:55:23,239 --> 00:55:27,080 Speaker 1: who work there. Their job is literally to save lives, 898 00:55:27,080 --> 00:55:31,240 Speaker 1: to protect us from harm. But there's a pretty massive 899 00:55:31,320 --> 00:55:33,920 Speaker 1: salary gap between the average FDA worker and 900 00:55:33,960 --> 00:55:38,680 Speaker 1: the average pharma exec or food company exec, which has 901 00:55:38,719 --> 00:55:42,400 Speaker 1: resulted in this revolving door where regulators from the FDA 902 00:55:42,640 --> 00:55:46,680 Speaker 1: move to higher-paying jobs at food and drug companies.
903 00:55:47,520 --> 00:55:50,920 Speaker 1: And I think that the knowledge that that reward is 904 00:55:50,960 --> 00:55:54,439 Speaker 1: there waiting for them if they play ball while they're 905 00:55:54,440 --> 00:55:56,880 Speaker 1: at the FDA has the effect of creating 906 00:55:56,920 --> 00:56:00,279 Speaker 1: huge conflicts of interest. And so my idea is to 907 00:56:00,480 --> 00:56:05,680 Speaker 1: change the FDA through compensation reform: pay salaries competitive with 908 00:56:05,719 --> 00:56:08,839 Speaker 1: the private sector. Because right now the FDA is kind 909 00:56:08,840 --> 00:56:13,040 Speaker 1: of a drab government bureaucracy, and it attracts people who 910 00:56:13,120 --> 00:56:16,719 Speaker 1: are up for working in that kind of job. Closing 911 00:56:16,760 --> 00:56:20,000 Speaker 1: the salary gap would begin to level the playing field. 912 00:56:20,160 --> 00:56:23,040 Speaker 1: When, you know, you pay people more money, you attract 913 00:56:23,120 --> 00:56:27,040 Speaker 1: better people, and their job is so incredibly important that 914 00:56:27,040 --> 00:56:30,960 Speaker 1: that would be better for all of us. So salary 915 00:56:31,040 --> 00:56:33,320 Speaker 1: reform is the idea I want to put on the table. 916 00:56:33,360 --> 00:56:35,719 Speaker 1: But we can get back to that. 917 00:56:35,760 --> 00:56:37,880 Speaker 1: I want to talk more about your idea, 918 00:56:37,960 --> 00:56:42,200 Speaker 1: because I think it's so smart. Because a lot goes 919 00:56:42,200 --> 00:56:44,520 Speaker 1: right at the FDA, right, but 920 00:56:44,520 --> 00:56:48,560 Speaker 1: when things go wrong, it's often not on day 921 00:56:48,600 --> 00:56:51,800 Speaker 1: one. You don't know that a problem is a problem 922 00:56:51,960 --> 00:56:54,520 Speaker 1: right out of the gate.
And yet once something is approved, 923 00:56:54,560 --> 00:56:57,200 Speaker 1: it's been released into the wild, and you almost can't 924 00:56:57,280 --> 00:57:00,400 Speaker 1: get it back in the current system. And so 925 00:57:00,480 --> 00:57:04,640 Speaker 1: I think that would, you know, really increase people's safety. 926 00:57:04,719 --> 00:57:08,000 Speaker 1: What barriers do you think we would encounter if we 927 00:57:08,080 --> 00:57:11,280 Speaker 1: decided to try to actually enact that today? Who would 928 00:57:11,320 --> 00:57:14,640 Speaker 1: have an issue with that idea? Probably the same parties 929 00:57:14,680 --> 00:57:17,760 Speaker 1: that have blocked a lot of reforms in this space: 930 00:57:17,920 --> 00:57:21,520 Speaker 1: the pharmaceutical companies. I suspect that some of the arguments 931 00:57:21,560 --> 00:57:24,400 Speaker 1: that would be presented might be that it would be 932 00:57:24,880 --> 00:57:27,680 Speaker 1: even more costly and time-consuming for firms than the 933 00:57:27,680 --> 00:57:33,840 Speaker 1: existing approval processes, and might delay access to some products. 934 00:57:34,200 --> 00:57:38,240 Speaker 1: But overall, broadly speaking, it's not politically feasible, because the 935 00:57:38,360 --> 00:57:43,720 Speaker 1: drug companies would intensely oppose this kind of conditional approval. 936 00:57:44,600 --> 00:57:47,760 Speaker 1: And when they oppose that kind of thing, 937 00:57:47,880 --> 00:57:52,720 Speaker 1: what form does that opposition take? So every five years 938 00:57:52,920 --> 00:57:57,080 Speaker 1: there is legislation related to authorizing the budget for the 939 00:57:57,200 --> 00:58:00,280 Speaker 1: FDA.
In fact, there's actually currently a 940 00:58:00,280 --> 00:58:03,600 Speaker 1: reauthorization happening this year, and we expect to see 941 00:58:03,600 --> 00:58:07,360 Speaker 1: it passed at the end of this summer, actually. And 942 00:58:07,440 --> 00:58:11,520 Speaker 1: so usually it's through, you know, lobbying legislators as to 943 00:58:12,600 --> 00:58:17,120 Speaker 1: the features that might go into this reauthorization package. So, 944 00:58:17,400 --> 00:58:19,560 Speaker 1: for example, if it were to be introduced in one 945 00:58:19,560 --> 00:58:23,840 Speaker 1: of these five-year reauthorization bills, PhRMA as well as 946 00:58:23,880 --> 00:58:29,840 Speaker 1: individual pharmaceutical lobbyists would oppose the inclusion of such conditionality, 947 00:58:30,840 --> 00:58:34,160 Speaker 1: and the lobbyists are there to threaten to remove financial 948 00:58:34,240 --> 00:58:38,720 Speaker 1: support from Congress? That's right, through, you know, campaign contributions. 949 00:58:39,880 --> 00:58:43,440 Speaker 1: Mm-hmm. Yeah. That sort of leads us to a 950 00:58:43,520 --> 00:58:46,600 Speaker 1: discussion around conflict of interest in general. So I 951 00:58:46,720 --> 00:58:49,000 Speaker 1: want to go there. But before we 952 00:58:49,200 --> 00:58:52,720 Speaker 1: go there more deeply, what do you think of the 953 00:58:52,800 --> 00:58:55,200 Speaker 1: idea that I put on the table, this idea 954 00:58:55,240 --> 00:58:58,120 Speaker 1: of, like, leveling the playing field from a salary standpoint? 955 00:58:58,160 --> 00:59:00,480 Speaker 1: Does that make any sense?
I like it. And I 956 00:59:00,560 --> 00:59:05,040 Speaker 1: think you've tackled head-on one of the issues with 957 00:59:05,240 --> 00:59:08,720 Speaker 1: the approval process at the FDA, which is they do 958 00:59:08,920 --> 00:59:13,080 Speaker 1: lose a lot of very good people, because the pharmaceutical 959 00:59:13,160 --> 00:59:17,560 Speaker 1: companies are able to entice government workers who have a 960 00:59:17,640 --> 00:59:21,480 Speaker 1: lot of experience and knowledge away. I do see some 961 00:59:21,680 --> 00:59:24,120 Speaker 1: constraints, while we're on the topic of, you know, pluses 962 00:59:24,160 --> 00:59:29,000 Speaker 1: and minuses. One might be that these are civil servants, 963 00:59:29,040 --> 00:59:32,400 Speaker 1: and so what you describe is just a generic problem 964 00:59:32,640 --> 00:59:36,440 Speaker 1: among the civil service. So there are some rules related 965 00:59:36,440 --> 00:59:39,840 Speaker 1: to parity, relating to, you know, GS scales and so on, 966 00:59:40,280 --> 00:59:43,040 Speaker 1: that you would have to consider. I see, so the 967 00:59:43,120 --> 00:59:48,360 Speaker 1: way government workers are compensated has to be essentially universal 968 00:59:48,440 --> 00:59:54,040 Speaker 1: or standardized in some way. A second issue 969 00:59:54,200 --> 00:59:57,160 Speaker 1: is just where that money would come from, because the 970 00:59:57,240 --> 01:00:01,240 Speaker 1: central source of conflicts, actually, even with funding the 971 01:00:01,400 --> 01:00:06,880 Speaker 1: FDA, is user fees: basically requiring pharmaceutical companies 972 01:00:07,000 --> 01:00:09,440 Speaker 1: to pony up, you know, hundreds of thousands of dollars 973 01:00:09,840 --> 01:00:13,800 Speaker 1: to finance the review process.
Now it's not earmarked, so 974 01:00:13,920 --> 01:00:17,320 Speaker 1: it's not that if Pfizer submits a, you know, drug for review, 975 01:00:17,720 --> 01:00:21,800 Speaker 1: people will necessarily favor the approval of 976 01:00:21,880 --> 01:00:24,920 Speaker 1: that drug. But financially, the FDA and its 977 01:00:24,960 --> 01:00:29,120 Speaker 1: operations are funded in large part by pharmaceutical companies through 978 01:00:29,200 --> 01:00:32,600 Speaker 1: these user fees. I would consider that a tax, in a 979 01:00:32,680 --> 01:00:36,439 Speaker 1: way, of getting your product to market, and I think 980 01:00:36,520 --> 01:00:38,840 Speaker 1: that makes sense. I mean, I do realize it 981 01:00:38,920 --> 01:00:44,120 Speaker 1: gives them a voice in the world of money and politics, 982 01:00:44,240 --> 01:00:49,040 Speaker 1: and so raising those fees probably would be unpopular with them. 983 01:00:49,440 --> 01:00:54,000 Speaker 1: I want to continue to talk about different conflicts of interest. 984 01:00:54,200 --> 01:00:56,920 Speaker 1: You wrote a chapter in a book called Conflicts of 985 01:00:57,080 --> 01:01:01,960 Speaker 1: Interest in FDA Advisory Committees that was eye-opening for me. 986 01:01:02,880 --> 01:01:07,080 Speaker 1: Can you first just explain what an advisory committee is 987 01:01:07,200 --> 01:01:11,800 Speaker 1: and how that actually works? Sure. When a drug comes 988 01:01:11,920 --> 01:01:14,680 Speaker 1: up for approval, the FDA, you know, has 989 01:01:14,760 --> 01:01:18,880 Speaker 1: the final say, but oftentimes it doesn't have the internal 990 01:01:19,040 --> 01:01:23,360 Speaker 1: expertise, or perhaps even the person-hours, to commit to 991 01:01:23,440 --> 01:01:26,680 Speaker 1: doing a full review of a particular drug, or, you know, 992 01:01:26,760 --> 01:01:31,000 Speaker 1: the evidence is complicated and it needs external advice.
So 993 01:01:31,560 --> 01:01:36,400 Speaker 1: frequently it convenes these advisory committees to review the application 994 01:01:36,560 --> 01:01:40,040 Speaker 1: for a particular drug. The people on these advisory committees 995 01:01:40,080 --> 01:01:44,000 Speaker 1: are not formally employees, or full-time employees, of the 996 01:01:44,200 --> 01:01:49,120 Speaker 1: FDA. They are external experts. They're sitting at universities, 997 01:01:49,800 --> 01:01:55,960 Speaker 1: research institutes, think tanks, and they are physicians, researchers, some statisticians. 998 01:01:56,480 --> 01:01:58,680 Speaker 1: But one of the important things about this is, if 999 01:01:58,720 --> 01:02:02,000 Speaker 1: you work for the government, there are very clear ethics rules 1000 01:02:02,120 --> 01:02:06,480 Speaker 1: regarding your financial ties to industry. But if you're 1001 01:02:06,720 --> 01:02:09,000 Speaker 1: working at a university and then you get called on 1002 01:02:09,120 --> 01:02:12,360 Speaker 1: to be on these advisory committees, you know, these people 1003 01:02:12,400 --> 01:02:16,920 Speaker 1: sitting at universities have relationships with drug companies. They are 1004 01:02:16,960 --> 01:02:19,560 Speaker 1: consultants for them, they have their research funded by them. 1005 01:02:20,160 --> 01:02:23,320 Speaker 1: And so one of the things I looked at was 1006 01:02:23,880 --> 01:02:27,920 Speaker 1: whether the financial ties that these external experts, who are 1007 01:02:28,000 --> 01:02:31,000 Speaker 1: called upon to advise whether a drug should be approved 1008 01:02:31,120 --> 01:02:34,280 Speaker 1: or not, had to drug companies 1009 01:02:34,400 --> 01:02:39,360 Speaker 1: were associated with whether they voted for 1010 01:02:39,560 --> 01:02:42,840 Speaker 1: approval of the drug or not, and how open they 1011 01:02:42,880 --> 01:02:45,760 Speaker 1: were to approval of the drug.
Right. And it sounded 1012 01:02:45,800 --> 01:02:48,640 Speaker 1: to me like you had a specific hypothesis going into 1013 01:02:48,680 --> 01:02:51,680 Speaker 1: the work, and actually even you were surprised by 1014 01:02:51,720 --> 01:02:55,480 Speaker 1: the results. Is that right? Yeah. So, I mean, conventional wisdom, 1015 01:02:55,720 --> 01:02:59,440 Speaker 1: certainly in the ethics literature on conflicts of interest, is, 1016 01:02:59,560 --> 01:03:02,520 Speaker 1: if you have, you know, one tie to industry, you know, 1017 01:03:02,680 --> 01:03:05,520 Speaker 1: that's not good, but if you have multiple ties, that's 1018 01:03:05,680 --> 01:03:10,160 Speaker 1: even more not good. That's like really, really bad, yes, exactly. 1019 01:03:10,840 --> 01:03:13,960 Speaker 1: And so we found two things. One was, it 1020 01:03:14,120 --> 01:03:19,440 Speaker 1: turned out that when we compared how these experts voted, 1021 01:03:20,120 --> 01:03:23,720 Speaker 1: and we compared people who had one financial tie to 1022 01:03:23,840 --> 01:03:27,680 Speaker 1: those who had no financial ties, we actually found that 1023 01:03:28,480 --> 01:03:32,040 Speaker 1: people with a single financial tie to a company were 1024 01:03:32,080 --> 01:03:35,160 Speaker 1: more likely to vote in favor of the product sponsored 1025 01:03:35,360 --> 01:03:38,160 Speaker 1: by that company than people who had no financial ties. 1026 01:03:38,480 --> 01:03:40,640 Speaker 1: So there did seem to be some bias, but 1027 01:03:40,760 --> 01:03:44,160 Speaker 1: it was only if you had a single financial tie 1028 01:03:44,240 --> 01:03:47,400 Speaker 1: to a company.
In contrast, people who had a lot 1029 01:03:47,440 --> 01:03:50,160 Speaker 1: of financial ties, so people who had ties to Merck, 1030 01:03:50,360 --> 01:03:53,760 Speaker 1: Pfizer, lots of companies, did not appear to vote any 1031 01:03:53,840 --> 01:03:57,400 Speaker 1: differently on average than people who had no financial ties. 1032 01:03:58,200 --> 01:04:00,760 Speaker 1: People with a lot of ties did not appear to 1033 01:04:00,800 --> 01:04:03,560 Speaker 1: be biased in how they voted, and so we talk 1034 01:04:03,600 --> 01:04:05,400 Speaker 1: a little bit about why that might be the case. 1035 01:04:06,080 --> 01:04:09,800 Speaker 1: One, you know, hypothesis that makes sense to me is that, 1036 01:04:10,240 --> 01:04:11,600 Speaker 1: you know, a lot of times when people have a 1037 01:04:11,680 --> 01:04:14,919 Speaker 1: lot of financial ties, it's because they're really, really good 1038 01:04:15,240 --> 01:04:17,480 Speaker 1: at what they do, and so a lot of companies 1039 01:04:17,520 --> 01:04:20,440 Speaker 1: want a piece of their brain. It's not that 1040 01:04:20,560 --> 01:04:23,320 Speaker 1: they're hiring these people so they can be hired 1041 01:04:23,400 --> 01:04:25,640 Speaker 1: guns to say what the company wants them to say. 1042 01:04:25,920 --> 01:04:28,400 Speaker 1: They're hiring these people because they're just really good at 1043 01:04:28,440 --> 01:04:31,200 Speaker 1: advising them. Yes, some of it is just they literally 1044 01:04:31,280 --> 01:04:35,560 Speaker 1: want great advice from smart people. But we did find 1045 01:04:35,680 --> 01:04:37,640 Speaker 1: some bias. I mean, I think the other thing that 1046 01:04:37,760 --> 01:04:40,040 Speaker 1: came out in the paper was that, you know, the 1047 01:04:40,160 --> 01:04:44,200 Speaker 1: type of conflict matters. So that hypothesis that I presented earlier, 1048 01:04:44,320 --> 01:04:47,240 Speaker 1: one tie is bad, many ties worse.
It's sort of 1049 01:04:47,360 --> 01:04:50,640 Speaker 1: the simplistic rule of thumb we had, one that doesn't 1050 01:04:50,680 --> 01:04:55,280 Speaker 1: acknowledge the fact that different kinds of ties matter for influence. 1051 01:04:55,720 --> 01:04:57,800 Speaker 1: And so the other thing we found was that it 1052 01:04:57,880 --> 01:05:01,960 Speaker 1: did not seem, for example, that experts who had ties 1053 01:05:02,560 --> 01:05:05,120 Speaker 1: through research funding, so their research was funded by the 1054 01:05:05,200 --> 01:05:09,120 Speaker 1: drug company, were biased. But the kinds of financial ties 1055 01:05:09,200 --> 01:05:12,680 Speaker 1: that really mattered were either that you had an 1056 01:05:12,720 --> 01:05:16,200 Speaker 1: ownership stake in the company, you had some kind of 1057 01:05:16,280 --> 01:05:19,640 Speaker 1: stock in the company, which makes sense, and also a 1058 01:05:19,760 --> 01:05:22,200 Speaker 1: very strong effect came from whether you were on an 1059 01:05:22,240 --> 01:05:25,480 Speaker 1: advisory board for the company. Lots of times you may 1060 01:05:25,520 --> 01:05:27,840 Speaker 1: be on the board and you have a fiduciary responsibility 1061 01:05:28,400 --> 01:05:33,720 Speaker 1: to the company to act in its interest. Right, exactly. Yeah, 1062 01:05:33,800 --> 01:05:36,280 Speaker 1: that's a clear conflict. That seems like it 1063 01:05:36,360 --> 01:05:38,440 Speaker 1: should be illegal. I mean, a lot of this should 1064 01:05:38,440 --> 01:05:40,920 Speaker 1: be illegal, to be honest. You would think it would, 1065 01:05:41,160 --> 01:05:43,800 Speaker 1: and if you read the rules, people who do have 1066 01:05:43,920 --> 01:05:47,920 Speaker 1: that kind of financial interest should not be participating in 1067 01:05:47,960 --> 01:05:51,400 Speaker 1: the advisory committees.
But there is also a process that 1068 01:05:51,480 --> 01:05:55,200 Speaker 1: allows the FDA to make exceptions, and many times they 1069 01:05:55,240 --> 01:05:58,240 Speaker 1: make exceptions. And so you have people with these kinds 1070 01:05:58,280 --> 01:06:01,360 Speaker 1: of financial ties on these advisory committees, while things that 1071 01:06:01,440 --> 01:06:04,959 Speaker 1: don't matter, consulting, research, you know, other kinds of ties, 1072 01:06:05,000 --> 01:06:09,000 Speaker 1: don't matter. So I guess the paper was really advocating 1073 01:06:09,040 --> 01:06:13,200 Speaker 1: for more subtle policy related to conflicts of interest. Yeah, 1074 01:06:13,320 --> 01:06:16,480 Speaker 1: more disclosure, right? I mean, it seems like that should 1075 01:06:16,520 --> 01:06:20,240 Speaker 1: be fair. If the FDA is tasked with, as it 1076 01:06:20,320 --> 01:06:24,280 Speaker 1: says on the website, above all else safeguarding the health 1077 01:06:24,480 --> 01:06:27,040 Speaker 1: and well-being of the American people, if that truly 1078 01:06:27,360 --> 01:06:33,000 Speaker 1: is above all else, then, you know, you should have 1079 01:06:33,160 --> 01:06:36,880 Speaker 1: to disclose all of your financial ties. And it should 1080 01:06:36,880 --> 01:06:40,280 Speaker 1: be acknowledged that there are financial ties that are okay. 1081 01:06:40,560 --> 01:06:42,520 Speaker 1: In other words, if you're just being paid as an 1082 01:06:42,520 --> 01:06:45,600 Speaker 1: advisor for a company, that's you being paid for 1083 01:06:45,680 --> 01:06:48,800 Speaker 1: your professional expertise, and that's fine. But if you own a 1084 01:06:48,920 --> 01:06:52,880 Speaker 1: piece, or you're on the board of one 1085 01:06:52,880 --> 01:06:55,400 Speaker 1: of the companies that's in question, that would disqualify you. 1086 01:06:55,800 --> 01:06:59,480 Speaker 1: And I don't understand why we can't enact rules like that.
1087 01:06:59,760 --> 01:07:03,400 Speaker 1: What would prevent us from closing those loopholes? That's 1088 01:07:03,440 --> 01:07:07,000 Speaker 1: a great question. The two arguments that I've seen presented 1089 01:07:07,240 --> 01:07:12,520 Speaker 1: are that we would no longer be able to find 1090 01:07:12,800 --> 01:07:16,640 Speaker 1: people qualified enough to be on our advisory committees if 1091 01:07:16,680 --> 01:07:22,800 Speaker 1: we just outright banned the participation of people with financial ties, 1092 01:07:23,280 --> 01:07:26,200 Speaker 1: because all the really smart people are already 1093 01:07:26,480 --> 01:07:30,360 Speaker 1: on the take, essentially. Or, you know, as my 1094 01:07:30,400 --> 01:07:33,120 Speaker 1: study indicated, a lot of people want 1095 01:07:33,160 --> 01:07:36,840 Speaker 1: a piece of their brains, so it would be difficult. 1096 01:07:37,120 --> 01:07:40,480 Speaker 1: And we can kind of see this, because in the 1097 01:07:40,800 --> 01:07:44,520 Speaker 1: two thousands, the FDA in fact capped the 1098 01:07:44,600 --> 01:07:47,400 Speaker 1: exceptions that could be issued for people who had these 1099 01:07:47,520 --> 01:07:51,720 Speaker 1: kinds of financial ties. So prior to this, the 1100 01:07:52,240 --> 01:07:55,480 Speaker 1: FDA issued exceptions willy-nilly and just 1101 01:07:55,600 --> 01:07:59,360 Speaker 1: basically said a lot of the apparently disqualifying financial interests 1102 01:07:59,720 --> 01:08:03,760 Speaker 1: didn't matter. In the two thousands, with one of the reauthorizations, 1103 01:08:04,040 --> 01:08:07,160 Speaker 1: they capped the percentage of people who could have these exceptions, and what 1104 01:08:07,320 --> 01:08:10,440 Speaker 1: you saw was of course a decline in people who 1105 01:08:10,520 --> 01:08:13,280 Speaker 1: had these exceptions and these financial ties.
But what you 1106 01:08:13,440 --> 01:08:18,879 Speaker 1: also saw was more positions on the advisory committees being vacant, 1107 01:08:19,120 --> 01:08:21,799 Speaker 1: and a longer time to convene these advisory 1108 01:08:21,840 --> 01:08:24,120 Speaker 1: committees, because they would need to spend more time to 1109 01:08:24,200 --> 01:08:29,000 Speaker 1: find people who were not conflicted. Yeah, right. Yeah, I 1110 01:08:29,080 --> 01:08:32,120 Speaker 1: want to delve a little bit more into this idea 1111 01:08:32,439 --> 01:08:35,840 Speaker 1: of trust, because ultimately the FDA's job is to keep 1112 01:08:36,000 --> 01:08:41,200 Speaker 1: us safe. And without guiding your answer at all, what 1113 01:08:41,439 --> 01:08:46,599 Speaker 1: needs to change so that we can trust the companies 1114 01:08:46,640 --> 01:08:48,920 Speaker 1: that are making our food and the companies that are 1115 01:08:49,000 --> 01:08:53,280 Speaker 1: making our drugs? What about our system has to 1116 01:08:53,439 --> 01:08:56,320 Speaker 1: change to create more trust? I have to say I'm 1117 01:08:56,320 --> 01:08:59,320 Speaker 1: a little conflicted about this. I mean, the mission of 1118 01:08:59,360 --> 01:09:03,280 Speaker 1: the FDA is pretty clear, you know: to ensure, among 1119 01:09:03,520 --> 01:09:05,560 Speaker 1: other things, the safety and efficacy of the 1120 01:09:05,640 --> 01:09:08,200 Speaker 1: drugs that are marketed in the US. The mission of the 1121 01:09:08,240 --> 01:09:12,040 Speaker 1: drug companies is not that. It is to make money 1122 01:09:12,120 --> 01:09:15,519 Speaker 1: for their shareholders, and so, you know, I do wonder 1123 01:09:15,760 --> 01:09:21,600 Speaker 1: whether it's realistic to expect organizations whose objective is 1124 01:09:21,680 --> 01:09:25,839 Speaker 1: profit-making to not do what they can do to make profits.
1125 01:09:26,240 --> 01:09:27,960 Speaker 1: But I like the way you framed it, because you 1126 01:09:28,040 --> 01:09:31,560 Speaker 1: framed it as a system, you know, not just the 1127 01:09:31,720 --> 01:09:33,600 Speaker 1: FDA. But I think the onus is on 1128 01:09:34,240 --> 01:09:39,680 Speaker 1: legislators and policymakers to create rules that put guardrails on 1129 01:09:39,840 --> 01:09:44,479 Speaker 1: drug companies but still incentivize them to 1130 01:09:44,720 --> 01:09:47,840 Speaker 1: do the right thing. I think, as you 1131 01:09:47,960 --> 01:09:51,880 Speaker 1: might have framed it, so, things that minimize gaming of how 1132 01:09:51,960 --> 01:09:55,560 Speaker 1: clinical trials are run and analyzed, things that incentivize the 1133 01:09:55,600 --> 01:10:00,200 Speaker 1: collection of data on effectiveness and safety, and things that align the 1134 01:10:00,280 --> 01:10:03,080 Speaker 1: things that we want, which is quality information on safety 1135 01:10:03,120 --> 01:10:06,040 Speaker 1: and effectiveness, with the things that drug companies want, which is 1136 01:10:06,479 --> 01:10:13,120 Speaker 1: access to the market. Right. Okay, Genevieve, is there anything 1137 01:10:13,280 --> 01:10:18,519 Speaker 1: else that you think our listeners should know about 1138 01:10:18,960 --> 01:10:21,960 Speaker 1: the FDA, and things that either could or should change? 1139 01:10:22,479 --> 01:10:25,360 Speaker 1: I do think, you know, the problems we have with 1140 01:10:25,479 --> 01:10:28,479 Speaker 1: the FDA are structural. You know, there is 1141 01:10:28,520 --> 01:10:31,120 Speaker 1: that structural tension there about how 1142 01:10:31,160 --> 01:10:33,759 Speaker 1: can the agency have independence and be based on the science, 1143 01:10:34,320 --> 01:10:37,639 Speaker 1: but, you know, still have a commissioner that is serving 1144 01:10:37,680 --> 01:10:40,320 Speaker 1: at the pleasure of the president.
The other structural tension, 1145 01:10:40,600 --> 01:10:42,880 Speaker 1: and we see this, you know, in our discussion as well, 1146 01:10:43,040 --> 01:10:45,960 Speaker 1: is, you know, you can have an agency that approves 1147 01:10:46,520 --> 01:10:50,519 Speaker 1: drugs almost too fast, so they approve drugs that don't work, 1148 01:10:51,120 --> 01:10:54,240 Speaker 1: or you have safety issues, but you've increased access to 1149 01:10:54,320 --> 01:10:57,640 Speaker 1: the drug. That's as opposed to, well, you know, 1150 01:10:57,720 --> 01:11:00,519 Speaker 1: the alternative, which is to have an agency that approves things 1151 01:11:00,600 --> 01:11:03,880 Speaker 1: too slowly, in which case there are drugs that it's preventing from 1152 01:11:03,920 --> 01:11:06,680 Speaker 1: being out on the market that people need to get 1153 01:11:06,680 --> 01:11:11,799 Speaker 1: access to. Right. We do need to have information systems 1154 01:11:11,840 --> 01:11:14,320 Speaker 1: that adapt, an approval system that is adaptive, in the 1155 01:11:14,400 --> 01:11:16,439 Speaker 1: same way that, you know, all other systems in the 1156 01:11:16,479 --> 01:11:19,000 Speaker 1: world that we live in, you know, adapt to new information. 1157 01:11:19,280 --> 01:11:21,840 Speaker 1: And so hopefully that proposal, and yours as well, sort 1158 01:11:21,880 --> 01:11:24,439 Speaker 1: of adapts to, you know, what we know, and we 1159 01:11:24,479 --> 01:11:28,599 Speaker 1: can make better decisions that way. Okay, so last question. 1160 01:11:29,320 --> 01:11:32,519 Speaker 1: On this podcast, we have a tool that we call 1161 01:11:32,600 --> 01:11:36,960 Speaker 1: the BS index, and the BS index measures the gap 1162 01:11:37,240 --> 01:11:41,360 Speaker 1: between word and deed, and it goes from zero to a hundred. 1163 01:11:41,600 --> 01:11:45,720 Speaker 1: Zero is the best score, so zero BS; a 1164 01:11:45,840 --> 01:11:49,840 Speaker 1: hundred is the worst score, total BS.
And so the 1165 01:11:49,960 --> 01:11:54,280 Speaker 1: FDA today says that it strives above all 1166 01:11:54,520 --> 01:11:57,600 Speaker 1: else to safeguard the health and well-being of the 1167 01:11:57,640 --> 01:12:02,439 Speaker 1: American people. What score would you give the FDA? 1168 01:12:04,320 --> 01:12:09,439 Speaker 1: So I would say that my determination of the BS 1169 01:12:09,560 --> 01:12:14,160 Speaker 1: score is based on two main things that I think cause 1170 01:12:14,200 --> 01:12:16,040 Speaker 1: me to be worried about the FDA. One 1171 01:12:16,160 --> 01:12:18,680 Speaker 1: is the degree of industry influence that leads it to 1172 01:12:19,640 --> 01:12:23,080 Speaker 1: diverge from its mission, as well as the degree to 1173 01:12:23,160 --> 01:12:26,360 Speaker 1: which it's vulnerable to political influence, you know, from the 1174 01:12:26,400 --> 01:12:31,360 Speaker 1: executive branch. Overall, I think the structure of the organization 1175 01:12:31,800 --> 01:12:38,400 Speaker 1: gives us generally reasonably high-quality decisions. I would give 1176 01:12:38,479 --> 01:12:44,160 Speaker 1: it a forty-five. Okay, that was great. Thank you 1177 01:12:44,280 --> 01:12:47,280 Speaker 1: so much for being here today, Jenny. I really appreciate it. 1178 01:12:47,960 --> 01:12:51,519 Speaker 1: Thank you for inviting me and having us think together 1179 01:12:51,800 --> 01:12:54,120 Speaker 1: about this issue. I really, really had a 1180 01:12:54,160 --> 01:12:58,479 Speaker 1: great time. Thank you. All right, folks, it's time for 1181 01:12:58,560 --> 01:13:02,719 Speaker 1: the FDA to get their official BS score. Somehow it 1182 01:13:02,800 --> 01:13:05,160 Speaker 1: feels appropriate that I try to get a little scientific 1183 01:13:05,280 --> 01:13:08,160 Speaker 1: with this one. So if I were to average our 1184 01:13:08,280 --> 01:13:12,400 Speaker 1: guests' scores, the final score would be a forty-eight.
However, 1185 01:13:13,160 --> 01:13:17,280 Speaker 1: that still feels low to me, especially given Richard's account 1186 01:13:17,400 --> 01:13:20,559 Speaker 1: of basically being told to deliver the results his bosses 1187 01:13:20,680 --> 01:13:24,160 Speaker 1: wanted or risk losing his job at the agency. 1188 01:13:24,880 --> 01:13:28,360 Speaker 1: That's just plain scary. So I'm going to give the 1189 01:13:28,479 --> 01:13:31,920 Speaker 1: FDA a fifty-five. I hope that acknowledges some of 1190 01:13:32,000 --> 01:13:36,040 Speaker 1: the agency's big wins over the years, but leaves plenty 1191 01:13:36,160 --> 01:13:40,439 Speaker 1: of room for some much-needed improvement. And FDA 1192 01:13:40,560 --> 01:13:44,120 Speaker 1: Commissioner Robert Califf, if you ever want to come 1193 01:13:44,160 --> 01:13:47,000 Speaker 1: on the show to discuss anything we've touched on today, 1194 01:13:47,439 --> 01:13:51,400 Speaker 1: please know that you have an open invitation. And if 1195 01:13:51,439 --> 01:13:54,280 Speaker 1: you're starting a purpose-led business, or thinking about beginning 1196 01:13:54,320 --> 01:13:57,360 Speaker 1: the journey of transformation to become one, here are three 1197 01:13:57,479 --> 01:14:04,559 Speaker 1: things you can take away from today's episode. One: 1198 01:14:05,800 --> 01:14:09,479 Speaker 1: Becoming purpose-led is a big responsibility. It means that 1199 01:14:09,600 --> 01:14:13,519 Speaker 1: you're dedicating yourself to managing better outcomes for all of 1200 01:14:13,600 --> 01:14:18,560 Speaker 1: your stakeholders. That includes customers, employees, the community you do 1201 01:14:18,760 --> 01:14:23,000 Speaker 1: business in, and the planet. If all companies did that, 1202 01:14:23,400 --> 01:14:26,519 Speaker 1: we wouldn't need the FDA.
The FDA only 1203 01:14:26,680 --> 01:14:31,360 Speaker 1: exists because the motivation to manage for maximum profitability is 1204 01:14:31,560 --> 01:14:35,639 Speaker 1: so powerful in our culture that companies will knowingly sell 1205 01:14:35,720 --> 01:14:40,200 Speaker 1: products and services that damage people and the planet. Purpose-led 1206 01:14:40,320 --> 01:14:43,920 Speaker 1: companies win the trust of all their stakeholders by 1207 01:14:43,960 --> 01:14:47,880 Speaker 1: being transparent, by taking on problems they discover, and by 1208 01:14:48,040 --> 01:14:52,360 Speaker 1: solving them. They tend to do the right thing, even 1209 01:14:52,439 --> 01:14:57,120 Speaker 1: if it costs them money. Two: Once you know your purpose, 1210 01:14:57,400 --> 01:15:01,519 Speaker 1: take action against it. Since, unfortunately, we do need 1211 01:15:01,600 --> 01:15:04,080 Speaker 1: the FDA to protect us from companies who would do 1212 01:15:04,240 --> 01:15:07,719 Speaker 1: us harm, the FDA needs to do a better job 1213 01:15:07,840 --> 01:15:11,920 Speaker 1: of that. The action idea that Genevieve brought today sticks 1214 01:15:12,000 --> 01:15:16,160 Speaker 1: with me: any approval of any product is only conditional 1215 01:15:16,400 --> 01:15:19,519 Speaker 1: and time-based, and must be re-reviewed once it's 1216 01:15:19,560 --> 01:15:21,840 Speaker 1: been in the market and we can all see the 1217 01:15:21,920 --> 01:15:26,400 Speaker 1: effects that it's actually having. That one change would help 1218 01:15:26,520 --> 01:15:29,240 Speaker 1: the FDA better live their purpose. What are 1219 01:15:29,280 --> 01:15:33,360 Speaker 1: the actions that you're taking to better live yours? Three: 1220 01:15:34,280 --> 01:15:38,520 Speaker 1: Simple is better.
The FDA story is enormously 1221 01:15:38,680 --> 01:15:42,080 Speaker 1: complex, because they cover such a broad waterfront and have 1222 01:15:42,320 --> 01:15:46,360 Speaker 1: so many stakeholders with so many different agendas. Define a 1223 01:15:46,560 --> 01:15:51,640 Speaker 1: simple reason why your company exists. Choose clarity and specificity 1224 01:15:51,840 --> 01:15:56,080 Speaker 1: over fluff. Solve a real problem, do the right thing 1225 01:15:56,160 --> 01:15:59,880 Speaker 1: by your stakeholders, and keep doing that until you win. 1226 01:16:05,640 --> 01:16:08,919 Speaker 1: And if this episode made it through your approval process, 1227 01:16:09,320 --> 01:16:12,320 Speaker 1: subscribe to the Calling Bullshit podcast on the iHeartRadio 1228 01:16:12,400 --> 01:16:15,920 Speaker 1: app, Apple Podcasts, or wherever you listen to people 1229 01:16:16,040 --> 01:16:19,240 Speaker 1: speaking to your ears. And friends, I'd like to ask 1230 01:16:19,320 --> 01:16:22,679 Speaker 1: for your help. If you enjoy the Calling Bullshit podcast, 1231 01:16:22,840 --> 01:16:25,200 Speaker 1: please take a second to rate and review us on 1232 01:16:25,280 --> 01:16:29,920 Speaker 1: Apple Podcasts or on your preferred platform. Thanks to our 1233 01:16:30,000 --> 01:16:34,679 Speaker 1: guests Lauren Etter, Richard Williams, Gail Van Norman, and Genevieve Kanter. 1234 01:16:35,320 --> 01:16:37,639 Speaker 1: Learn more about them and get links to their work 1235 01:16:37,840 --> 01:16:41,360 Speaker 1: in our show notes. And many thanks to our production 1236 01:16:41,439 --> 01:16:46,800 Speaker 1: team: Hannah Bial, Amanda Ginsburg, DS Moss, Hailey Pascalites, 1237 01:16:47,040 --> 01:16:52,000 Speaker 1: Parker Silzer, and Basil Soaper. Calling Bullshit was created by 1238 01:16:52,040 --> 01:16:54,680 Speaker 1: co:collective, and it's hosted by me, Ty Montague. 1239 01:16:55,000 --> 01:17:01,719 Speaker 1: Thanks for listening.