Every day you're generating data about your health. You might not even be aware of it. Maybe your phone counts how many steps you take. Maybe your watch measures your pulse or your heart rhythm, or you use an app to track your exercise or diet. And that doesn't even count your medical data, the records that doctors and insurance companies and pharmacies keep about all of us. All that data goes somewhere, and it's valuable to someone.

Welcome to Prognosis, Bloomberg's podcast about the intersection of health and technology and the unexpected places it's taking us. I'm your host, Michelle Fay Cortez.

The amount of health data is increasing fast, from medical records, to health apps and devices, to our shopping habits and online browsing. Every day we leave digital footprints revealing intimate aspects of our lives. That comes with benefits and risks, but no one has sorted it all out yet, and the laws that protect people haven't caught up with the advances in technology. Having all that data promises to help researchers come up with new treatments, and it can improve doctors' care. But the risk is that personal information you'd rather keep to yourself could be exposed. Here's Bloomberg's health reporter John Tozzi.

Good afternoon, thank you for calling Anthem member services. My name is Kathy. How can I help?

Hi, Kathy, my name is John Tozzi. I'm a reporter with Bloomberg News and I'm recording this for a story about medical data and privacy. Um, I'm an Anthem member and I'd like to request a list of who Anthem has shared my personal information with.

So you're a reporter with Bloomberg, correct? Okay, and you are inquiring...

I recently learned that I have the right to ask my health insurance company what they're doing with my data. It's one of the rights given to me under a law called HIPAA. HIPAA stands for the Health Insurance Portability and Accountability Act.
It was passed in nineteen ninety-six, and it's the main law in the United States that governs what medical providers and insurance companies can do with our healthcare data. HIPAA determines how medical data can be shared and what happens if it's shared improperly. It also gives people rights over their data, like the right to get a copy of your medical record or to find out how your data has been shared with other parties. But HIPAA doesn't cover everything.

There is the idea, and it is extremely widespread, that health information is inherently going to be protected by some law somewhere, but it's not true, not at all.

This is Pam Dixon. I am the executive director of the World Privacy Forum. We're a public interest research group. She has been a privacy advocate for twenty years. She told me that people often assume there's some kind of automatic protection for health data. That's not the case.

People universally believe that their health data, no matter where it is, has some form of legal protection and is somehow magically confidential.

HIPAA applies to the records that your doctor, other medical providers, and your insurance plan hold. But more and more data about our health isn't just in medical records.

HIPAA-covered data is a smaller and smaller percentage of all of the health data that's out there now. And it is so, so important for folks to understand this, because much of the health data that we're working with today is not covered under HIPAA protections.

Here's one famous example. Journalist Charles Duhigg reported that Target used detailed profiles of customers to predict when women became pregnant, and then the company sent them promotions for baby clothes or diapers. The result was creepy in the best case, and in the worst case, could have revealed information they may not have wanted public.
Increasingly, health data is being collected by technology companies, data brokers, advertisers, and other entities that are not subject to HIPAA, and it's being used, and may be misused, in ways that a lot of people don't understand.

Think about the apps on your phone. Maybe you have something to track your steps or to log what foods you eat or when you exercise. Unless those apps come from your medical provider or health plan, they're not covered by HIPAA, and that means that the companies collecting your data are far less restricted in how they use it. And how they use it may not always be transparent. A study published in the journal JAMA Network Open in April looked at thirty-six top apps to help people with depression and quitting smoking. Many of them were sending data to Google or Facebook for marketing, advertising, or analytics, but less than half of those apps disclosed that. The authors wrote that most apps offered users no way to anticipate that data would be shared in this way. As a result, users are denied an informed choice about whether such sharing is acceptable to them.

This is the kind of risk that has some people really worried. Even though some privacy advocates think HIPAA's protections should be stronger, they're a good start. It's the world of data beyond HIPAA's reach that we need to pay a lot more attention to.

Because of the lack of, uh, sort of a uniform standard across the country with regard to data that isn't protected by HIPAA, um, there are concerns about privacy, particularly of health data.

This is Iliana Peters. I'm currently a shareholder at Polsinelli, which is a national law firm. Iliana worked for the federal government for about twelve years. She wrote and enforced HIPAA regulations before she went to work for a private law firm. Like Pam, she's concerned about the growing volume of health data that HIPAA doesn't cover.

The information that your employer holds about you related to your health would not be protected by HIPAA, um.
The information that you share with social media about your health, or the groups that you participate in on social media about health issues, is not protected. There are applications that are direct to consumer. That means they are marketed directly to consumers and have everything to do with, you know, weight loss, to disease management, um, to disease prevention. Because they're marketed directly to a consumer and don't ever interact with a healthcare provider on their behalf or with a health plan, they would not be covered by HIPAA, um. So there's a, there's a huge amount of healthcare data, um, out there that isn't actually covered by a standard set of legal requirements.

Here are some of the ways you might be revealing data without knowing it. You use a credit card to buy a pregnancy test at a retail drugstore. You order new pants online, revealing your waist size. You search Google for symptoms of anxiety. You subscribe to a magazine about diabetes. You use an app to track your morning runs. You take a direct-to-consumer DNA test. You take an Uber to your therapist's office at the same time each week. Just because information about your health could be gleaned from these activities doesn't mean it will be. The problem is, we often don't have a very good idea of where this data ends up after it's collected.

Some of it could end up in the hands of data brokers. Data brokers are a multibillion-dollar industry made up of thousands of companies that you've probably never heard of. They compile information about people and sell it to marketers. They collect information from public records and even from data that you might not realize you're creating, like your retail purchases, what groups you belong to, online magazines and services you subscribe to, and information you fill out in surveys or online registrations. They take all of this information and make lists of people for marketers to target.
In testimony before the Senate Commerce Committee in twenty thirteen, Pam, the privacy advocate, described how the data broker industry tracks people by the diseases they have and the medicines they take.

There are lists of millions of people that are categorized by the diseases that they have, ranging from cancer to bed-wetting to Alzheimer's, terrible diseases, some of them benign, some of them relating to mental illness. There are lists of millions of people and what prescription drugs they take, and these lists exist entirely outside of HIPAA, outside of any kind of federal health protection.

Pam told Congress about some lists that show the darker sides of this business model. They included lists of rape victims and people with genetic diseases. She found lists for sale of people who had HIV and AIDS, of people with dementia, and of people with alcohol or drug addiction. There were lists of domestic violence victims and police officers' home addresses. The list of rape victims cost less than eight cents per name. Pam said that some of these lists were taken down within an hour or two of her testimony, but most of them have reappeared at some point, and six years after her testimony, she says not much has changed.

The data broker dossiers are often described as marketing lists, but Pam said that doesn't necessarily mean the buyers are marketers, and it also doesn't mean that the lists are used as they're intended. For example, employers or insurance companies could also be buying and using this data. There's no law against this. So all of this points to a need for more protection. The laws we have just don't reach far enough. But despite its limits, HIPAA does provide a good framework for where to start.

Here's the good news. When data is covered by HIPAA, the law gives people important protections. Healthcare providers and insurance plans are barred from disclosing individually identifiable data under HIPAA, and it goes further.
As you might remember, the law also grants people rights over their data. It gives people seven different rights, and the rights are really important, because before HIPAA there were huge problems. Pam says it was really difficult to get a copy of your own medical records before HIPAA.

Before HIPAA, good luck getting a consistent copy of your health file. It wasn't a legal requirement anywhere, so you, you can predict what was happening prior to HIPAA. It was a disaster trying to get your health information.

It also gives you the right to know if someone has subpoenaed your medical records, which might happen in a nasty divorce case, for example. And it gives you the right to request an accounting of disclosures, that's the list of who your doctor or health plan has shared your medical records with, the list that I'm trying to get from Anthem.

HIPAA also sets the rules for what those entities can do with your data. They can't just make it public. They can't tell a reporter or your employer or a family member about your diagnosis, your treatment, or any other private information without your permission. HIPAA does allow medical providers and health plans to release data if it's de-identified. That means removing information like your name, address, precise zip code, and other details. This de-identified data can be used for research. It can also be sold. For example, when drug companies want to know which doctors are writing the most prescriptions for their medications, they pay data brokers who collect that information. Then pharmaceutical companies can send their salespeople to the doctors who are the highest-volume prescribers. The data they're buying doesn't have your name on it, but it does represent you, aggregated with other people, and once it's de-identified, it's no longer bound by HIPAA's protections. Some privacy advocates I talked to described this as a violation of privacy. The fact that you can't control de-identified versions of your data is really troubling to some people.
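To make the idea concrete, here is a minimal sketch, in Python, of what a de-identification step like the one described above might look like. It is an illustration only, not the actual HIPAA Safe Harbor rule, and the record fields and values are hypothetical: direct identifiers get dropped outright, while quasi-identifiers like zip code and birth date are generalized rather than deleted.

```python
# Illustrative sketch only: a simplified de-identification step,
# not the actual HIPAA Safe Harbor specification.
# Field names and values are hypothetical.

def de_identify(record: dict) -> dict:
    """Drop direct identifiers and generalize quasi-identifiers."""
    cleaned = dict(record)
    # Remove direct identifiers entirely.
    for field in ("name", "street_address", "phone", "email"):
        cleaned.pop(field, None)
    # Generalize precise values instead of deleting them.
    if "zip_code" in cleaned:
        cleaned["zip_code"] = cleaned["zip_code"][:3] + "**"  # keep only the 3-digit prefix
    if "birth_date" in cleaned:
        cleaned["birth_date"] = cleaned["birth_date"][:4]     # keep only the year
    return cleaned

record = {
    "name": "Jane Doe",
    "street_address": "123 Main St",
    "zip_code": "10065",
    "birth_date": "1980-04-12",
    "diagnosis": "type 2 diabetes",
}
print(de_identify(record))
# {'zip_code': '100**', 'birth_date': '1980', 'diagnosis': 'type 2 diabetes'}
```

The generalized fields that survive, a zip prefix, a birth year, a diagnosis, are exactly the kind of details that can be matched against other data sets, which is where the re-identification risk discussed next comes from.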
It's especially concerning because of the risk that some de-identified data could be re-identified, that it could be matched back to you as an individual. Most experts I talked to said that this risk is real, but small. Still, the odds of being re-identified have increased since HIPAA was first passed in the nineties. Here's Pam.

The world has changed. So back then, I mean, the statistical chance of re-identifying records was enormously low. Now the chance of re-identifying records is a little bit easier, because computing power has advanced so much and there are so many more data sets that allow for more identifiability.

But there are also benefits to making de-identified data available. Medical researchers rely on it to learn how to improve care, public health officials use it to track epidemics and trends in population health, and as a journalist, I often cite research or findings based on this kind of data, from how common certain medical procedures are to how often a new drug is prescribed.

I work in privacy and I definitely have an opinion on privacy. I'm, I'm for privacy. And something that was very hard for me to learn, and it took years, um, was the value of releasing data.

Pam said she's come to realize there are trade-offs between keeping data totally private and using some de-identified pieces of it.

If you want to cure diseases, you're going to have to study the disease, and you can't do that without information about the disease. Information about that disease resides in people's experience with that disease as patients.

We might also benefit directly from having more of our healthcare data digitized. To learn about these benefits, I paid a visit to the Commonwealth Fund. Welcome. Thank you. I was there to see a man named David Blumenthal. I'm president of the Commonwealth Fund, which is a national healthcare philanthropy based in New York City, and our goal is to create a high-performing healthcare system in the United States.
David's office is in a landmarked, hundred-and-eleven-year-old mansion on Manhattan's Upper East Side, overlooking Central Park. It used to belong to the Harkness family, which endowed the Commonwealth Fund a century ago with money they made as investors in John D. Rockefeller's Standard Oil Company. David Blumenthal is a big name in healthcare. He worked as a primary care doctor at Massachusetts General Hospital. He advised Senator Ted Kennedy on healthcare and later worked for President Barack Obama as the country's top health IT official. He helped implement a law called the HITECH Act, which updated some HIPAA rules. It also gave medical providers billions of dollars in federal subsidies to digitize paper records. The HITECH Act was intended to modernize America's paper-based healthcare system. As recently as ten years ago, a majority of doctors' offices in the United States still used paper records.

David's a big believer in how the accumulation of digital healthcare data can help people. As it grows, it begins to represent the healthcare experience of millions, or even billions, of people, and that is incredibly valuable, he says. Apps that draw on patients' data could help them take better care of themselves. They could prompt people to get flu shots or alert diabetics when their blood sugar gets out of whack.

David says he sees the benefits of greater access to medical data as a physician and as a patient. Though he works in New York, he lives in Boston, and he still sees doctors at Mass General, where he used to work, and its affiliated hospitals in the Partners healthcare system. He finds it comforting that he can walk into any of the dozens of clinics or hospitals in the system and they'll still have his records.
I have seen and used that, that connectedness, with my own care, and it's enormously reassuring, um, that you don't have to, you know, your medicines will be known, the results of all your tests will be known, and all that.

That could solve some big problems in the US healthcare system. There's a lot of evidence that patients are harmed all the time because their care is fragmented and not coordinated. A specialist who doesn't know all the medications you're on might prescribe a new drug that has a bad interaction with one you're already taking. One study of more than half a million patients with chronic illnesses like diabetes or heart disease found that people who had more fragmented care had higher costs, lower-quality care, and more preventable hospital visits.

This is a real problem that a lot of people in healthcare would like to solve. Policymakers are trying to make the whole country's healthcare system work better together. They're trying to encourage different electronic medical record systems to talk to each other. They're also making it easier for patients on government health insurance like Medicare to get access to their health data. The goal is a healthcare system that seamlessly relays important information that could save your life.

David gave me a classic example. You live in Austin, but you get into a car accident in Chicago. Once you get to the emergency room, maybe you're dazed or unconscious, or you forget to tell the physician about an allergy. But if digital records were more widely accessible, that might not be an issue.

The emergency room physician finds your Apple phone and everything is on the Apple phone, or they can access your record in the cloud because there's an agreement to share that information, and so that increases the reliability of your care, reduces the chance of an error, reduces the chance of a, of a bad outcome. That's the benefit. The risk is more remote, and the sicker people are, the less concerned they are about privacy.
But there are also downsides. Just as with app-collected data, more traditional medical data sharing has its drawbacks. The risk is that nothing is ever truly private.

As soon as your information is available in electronic form, either on a server or in the cloud, it is potentially vulnerable.

David has experienced this firsthand. As a federal employee, his data was breached in a hack of the government's employee database.

I've given up on the idea of privacy. It's just not feasible anymore. It hasn't, that I know of, happened to my health data, but it could, um, and I expect it might.

Once data is digitized and stored, there's a risk it might end up somewhere you don't want it to. HIPAA requires medical providers and health plans to tell you when your data has been breached, and under the HITECH Act, if a breach affects more than five hundred people, the companies have to report it to the federal government, which publishes a list. Since two thousand nine, when the reporting requirement went into effect, HIPAA-covered entities have reported more than two thousand five hundred breaches that affected almost two hundred million individuals' health records. Health data breaches happen so frequently now that they rarely make the news. They're routine. On average, there's a breach of HIPAA-protected health data every thirty-one hours, and that's only the data breaches that companies have detected and that we know about.

We know about them because the law requires entities covered by HIPAA to tell us. But under federal law, entities not covered by HIPAA generally don't have to tell us when a data breach happens, though state laws may require them to report breaches. They also aren't bound by any of the other requirements of HIPAA. They're mostly bound by the promises they make to you in their terms of service, those long passages of legalese that you click through after you download an app or sign up for a new service. And that's where a lot of the privacy concerns about health data are growing.
There's not only the risk that your data might get breached in an illegal hacking operation or stolen by a crooked employee. There's also the risk that it might get shared or sold in a way that's not necessarily illegal but isn't completely transparent either.

Facebook and Amazon can do anything they want with your data, or any other, any company that's not a covered entity can do anything they want, unless they have assured you in that fine print that they won't. But since none of us read that fine print, we'll never get around to suing.

So under HIPAA, we have certain rights: the right to get a copy of our data, the right to know how it's being shared and when it is shared improperly. And it requires healthcare providers to keep our identifying data close, to not disclose it without our permission. We don't have those rights over the data we give to some app we download, or a new fitness device, or a social media service. We don't have those rights over our credit card purchasing data or our online searches. Partly because we don't have those rights, sometimes our names and contact details wind up for sale on data brokers' lists labeling us as diabetics or dementia sufferers or victims of domestic violence.

Right now, the law doesn't do a very good job of making companies be really clear about what they're doing with our data and making sure customers are okay with it. So what should we do?

I think it's a really good question, and it's a tough question.

Here's Iliana Peters, the attorney and former HIPAA official.

Trying to decide what's best for all industries with regard to the privacy and security of data is extremely difficult. I think, certainly, there are some things we can all agree on, and maybe that's where we need to start. Certainly, I think individual rights is one of those things, you know.
I think everybody should have rights to their own data and should be able to be at least participatory in how their data may be, um, used or disclosed, why it should be deleted, how that should happen, um, you know, when they can get copies of it, how that should happen.

One possible model for people looking to improve privacy policy in the United States is a new law that recently took effect in the European Union. It's called the General Data Protection Regulation, and it strengthens privacy protections for consumers. It covers all sorts of personal data, not just healthcare. The law makes companies get more explicit consent from people about the data they want to collect. It also gives people a right to get a copy of their data, and it's supposed to give them more control over what happens to it. The United States doesn't have anything like it yet, and there's no clear path to passing a new umbrella privacy law in the US anytime soon. That means that even companies trying to do the right thing don't have good standards to follow. Pam Dixon, the privacy advocate, said we could start by creating a set of standards that companies adhere to voluntarily. That would give consumers more trust in how their data is being used.

So ideally, what I'd like to see, at a minimum, is some kind of structure that allows for, um, privacy standards to be built. Is there a privacy standard we could write for health data outside of HIPAA? I think there is, and I think we could find a lot of agreement amongst the stakeholders. As I said, I think there's a lot of people who want to do the right thing. It's just there's not a standard yet.

In the meantime, what can we do as individuals to have more control over our data? First, you can exercise the rights you already have under HIPAA. Pam recommends everyone get a copy of their medical records from their providers. If someone tries to steal your identity later on, it will be important to have your original files.
If you have kids, get copies for your kids too. You can also pay attention to what you're agreeing to when you start using a new app or service. Here's Iliana.

I read everything before I click I accept, but I realize that I may not be the typical user.

Pam also recommends simply asking companies what data they're collecting and what they're doing with it.

You know, sending an email to, um, an app developer and asking what happens is always a great idea. I do that all the time. If they don't email me back, I delete the app.

I'm a reporter, so maybe I'm biased about this, but I think asking questions is a good way to show the people we're trusting with our data that we're paying attention, that we care about what happens to it, and that we want some control.

I spent about twenty minutes on the phone with my insurance company. Most of the time I was on hold. Hello? Yes, thank you so much for patiently waiting. I just wanted to make sure. She was really friendly, and eventually she gave me the address of the privacy office where I could send an email to request an accounting of disclosures, one of my rights under HIPAA.

I wrote to them in April. At the end of May, they sent me a letter that described how my health information was released. Anthem said they're required by law to send my claims records to a database run by the state health department. The letter also said that my name, date of birth, and contact information were exposed in a cyberattack in twenty fifteen. Anthem was hacked in a breach that compromised data on seventy-nine million people. It was the largest recorded health data theft in US history. Anthem paid a sixteen-million-dollar settlement last year over potential HIPAA violations related to the breach. The company did not admit liability as part of the settlement. And just in May, two Chinese nationals were indicted in the crime.
The Justice Department called them part of an extremely sophisticated hacking group operating in China that targeted US businesses. We got in touch with Anthem about this. A spokeswoman there said the company is committed to safeguarding customer data and that there's no evidence the information stolen in the cyberattack resulted in fraud against customers. So I know my data is out there, along with millions of other people's. I don't feel great about it, but at least I know. I'm more worried about what I don't know.

And that's it for this week's Prognosis. Thanks for listening. Do you have a story about healthcare in the US or around the world? We want to hear from you. Find me on Twitter at the Cortez or email m Cortez at bloomberg dot net. If you were a fan of this episode, please take a moment to rate and review us. It really helps new listeners find the show. And don't forget to subscribe.

This episode was produced by Lindsay Cratterwell. Our story editor was Rick Shine. Special thanks to Drew Armstrong. Francesca Levy is head of Bloomberg Podcasts. We'll be back in June with our next episode. See you then.