Collecting and selling your data. Data is the new natural resource of our time. That's the most sought-after information on the black market now. As more and more patient information is shared on the internet, the lists are being sold. We don't know who is buying them. It's not just Facebook, Google, Amazon. It's also insurance companies, retail companies.

Believe it or not, when you go to the doctor, the data from that doctor's visit is often sold to companies that have nothing to do with treating any illness. Electronic records companies, labs, pharmacies, insurers: they're all selling intimate data about you. It's not only big data, but big business.

Welcome to Prognosis, Bloomberg's podcast about the intersection of health and technology and the unexpected places it's taking us. I'm your host, Michelle Fay Cortez. This week, we're looking at a new kind of healthcare data broker. These companies don't just want to sell your data. They're also willing to pay you for it. But that's not really the point. Stay with us and we'll tell you what these companies are really after. Here's Bloomberg's health reporter Kristen V. Brown with the story.

"Okay, so I'm going to log into my Nebula account and see what opportunities there are for me to make some money off my DNA."

That's the sound of me hunched over my laptop on a rainy Oakland afternoon, signing up to give my personal healthcare data away. Or rather, I'm signing up to sell it.

"Okay, so I'm going to take some surveys. Okay, so they want to know: approximately how often do you drink alcohol? Hmm. Daily or almost daily? Once or twice a week? Probably three or four times a week. Uh, how many glasses of red wine do you drink? Two. White wine?"

I'm answering surveys about my health on the website of a company named Nebula Genomics. Nebula is one among a new breed of health data brokers.
Like more traditional health data brokers, the company wants to profit off your data, in this case by selling it to researchers. But it wants you to profit off your data too.

Nebula launched last fall with a lot of buzz. It was spun out of the lab of George Church, the Harvard geneticist as famous for his work as he is for throwing wild ideas out there, like trying to resurrect the woolly mammoth. Church and many others in the scientific community believe that the more people share their genetic data, the sooner we will have treatments and cures for devastating diseases. This is oversimplifying it a bit, but the basic idea is that researchers could mine the genomes of people who share the same conditions and look for clues to treating those conditions, identifying the common bits of their genetic code that could be linked to disease.

But if you want people to share their data, the company reasons, you have to give them something for it. This is a strategy long deployed among tech companies like Facebook and Google. You give Facebook all that data, and Facebook gives you access to Facebook. Nebula instead gives people access to free whole genome sequencing. Eventually, it also plans to offer other kinds of perks, like gift cards and money for sharing your data.

I caught up with Kamal Obbad, the twenty-four-year-old CEO of Nebula, in San Francisco. "Our goal is, you know, can we build this, essentially a community of stakeholders that are willingly and transparently sharing their genomic data. You know, the main interest or main goal of Nebula is to build data sets that enable scientists to do interesting things, right, whether it's rational drug design, whether it's developing new use cases for precision medicine or pharmacogenomics. All this relies on large-scale data sets that we don't really have access to today in the genomics world."
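To make that genome-mining idea concrete, here is a toy sketch in Python. The genotypes and variant names are entirely made up, and this is just the shape of the case/control comparison described above, not Nebula's actual pipeline.

```python
# Toy case/control comparison with made-up data: a sketch of the idea,
# not anyone's actual pipeline. 1 = person carries the variant, 0 = not.
cases = [  # people who share the condition
    {"rs1": 1, "rs2": 0, "rs3": 1},
    {"rs1": 1, "rs2": 1, "rs3": 1},
    {"rs1": 0, "rs2": 0, "rs3": 1},
]
controls = [  # people who don't
    {"rs1": 1, "rs2": 0, "rs3": 0},
    {"rs1": 0, "rs2": 1, "rs3": 0},
    {"rs1": 0, "rs2": 0, "rs3": 1},
]

def carrier_rate(people, site):
    """Fraction of a group carrying the variant at `site`."""
    return sum(person[site] for person in people) / len(people)

# Variants much more common among cases are the "common bits of genetic
# code" that become candidate links to the disease.
for site in ("rs1", "rs2", "rs3"):
    gap = carrier_rate(cases, site) - carrier_rate(controls, site)
    print(f"{site}: carrier-rate gap (cases minus controls) = {gap:+.2f}")
```

Real studies run this kind of comparison across millions of variants with far more careful statistics, which is exactly why they need the large shared data sets Kamal is describing.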
Companies like 23andMe have figured out that people will actually pay them to give away their health data. Consumers shell out as much as a hundred ninety-nine bucks for a 23andMe DNA test, and 23andMe can turn around and sell access to that information to pharmaceutical companies. Last year, 23andMe entered a three-hundred-million-dollar deal with GlaxoSmithKline to do just that.

But Kamal says that's the wrong approach. "We think a pretty big flaw in the existing model is that it's very transactional, right. You kind of swipe your credit card, you get one of those spit kits, you spit in it, you send it back, you get your report, and you say, you know, great, that's it. What we want to incentivize people to do is come back over time, learn more about themselves, and share more information. That way, we can build a longitudinal view of somebody's health."

Nebula wants to track your health over time, to keep users coming back so that researchers can get a more complete picture of it: a better data set.

My own longitudinal health journey started with surveys. Lots of surveys. I took a survey about cancer. Have I ever been diagnosed with cancer? No. And what about my diet? How many tablespoons of cooked vegetables do I eat per day? Tablespoons? And my exercise habits? How many days do I walk for at least ten minutes in a typical week? I'm like, every day. I walk from the BART ten minutes.

I gave Nebula information about my drinking habits and my medical history. I also uploaded my 23andMe data, which gives the company access to really intimate information about me, like whether I'm at risk for Alzheimer's or diabetes.

At the end of all this, I had earned six hundred and fifty credits of the one thousand credits you need to get a free low-grade whole genome sequencing. Instead of going through all this, by the way, you can also just buy the sequencing from Nebula for a hundred bucks.
Recently, the company also launched a subscription service, which gives you access to things like new research about your genome and priority to participate in research studies. Another pitch the company makes is that if a researcher finds your information interesting, they might pay for a clinical-grade sequencing and share that data with you. Kamal told me that eventually there will also be opportunities to improve your own health by participating in research studies that, for example, give participants wearables to track things like heart rate.

But as I was filling out all those surveys, I got the distinct feeling that I probably wasn't all that interesting to researchers. I exercise, I eat pretty well most of the time, and my family has no history of inherited disease other than really bad eyesight. I'm pretty average, health-wise. Usually, people's data is only valuable in aggregate, when combined with data from millions of other people.

Health data brokers are nothing new. They've actually been around since the nineteen fifties, but computers and then the internet turned health data brokerage into a multibillion-dollar business. Here's how it works. Companies collect de-identified data from millions and millions of people, often paying pennies per record: your blood test results, hospital records, prescription information. It all gets stripped of your name and sold by data broker middlemen. A pharmaceutical company can buy access to your records to better sell you drugs, even as it might be difficult for you to get a copy of your own health care records from your doctor. And the practice, by the way, is totally legal under HIPAA, the law that's been on the books since 1996 to protect patient privacy.

I talked to Adam Tanner, who wrote a book about this. He was on a cruise ship off the coast of Vietnam when we talked. "You go to the doctor's office, you close the door. You expect only the doctor will know what I'm telling them about my health condition.
But the doctor is often recording onto a computer the details of the patient's condition, and that's good, to keep records on what patients are about. Many of those systems, however, those computer systems that connect doctors with hospitals and pharmacies and so on, many of them sell anonymized data about patients. So it doesn't have your name in it, but it says: here is a woman this age, living in this part of town. And it joins records about you then with other previous records about you."

Adam told me that this business really took off when records became digital. "The digitization of medicine is a good thing in general, because it keeps detailed records about you, but it has allowed this side business to establish itself in the shadows, one that most patients do not see, do not have a say in, and indeed many health professionals don't know about."

This data mostly gets used for marketing, and some of it can be pretty intimate or embarrassing even though it's stripped of your name. These medical data dossiers usually include gender, age, and partial zip codes. Studies have shown that it's not always that hard to identify people based on it.
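To see why those studies find re-identification feasible, here is a minimal Python sketch. The dossier, the public roster, and every name and field in it are invented for illustration; the point is only the joining trick.

```python
# Minimal sketch with invented data: a record stripped of names can still
# single someone out if its remaining fields match exactly one person in
# some public roster (a voter file, a social profile, and so on).
dossier = {"gender": "F", "age": 34, "zip3": "946"}  # "de-identified" record

public_roster = [  # hypothetical public records
    {"name": "A. Jones", "gender": "F", "age": 34, "zip3": "946"},
    {"name": "B. Smith", "gender": "M", "age": 34, "zip3": "946"},
    {"name": "C. Lee",   "gender": "F", "age": 51, "zip3": "946"},
]

matches = [person for person in public_roster
           if all(person[field] == value for field, value in dossier.items())]

if len(matches) == 1:
    # Only one candidate left: the "anonymous" dossier now has a name.
    print("Re-identified:", matches[0]["name"])
```

With just gender, age, and a partial zip code, one unique match is often all it takes.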
"One should be concerned because health information is often our most intimate information. You could be discriminated against at work. You could be discriminated against socially. Once this kind of information is out there in the ether, you can't put it back, forever. And just knowing this basic information could be damaging to you, and it could be something relatively trivial."

Adam makes a good point here. In his book, he talks about how the actor Charlie Sheen wound up paying millions in bribes to keep his own HIV diagnosis private. For those of us that aren't famous, there is risk too. Life insurers, for example, if they access this data, would legally be allowed to discriminate against you based on it. But Adam told me he could also imagine some less obvious ways this data could be incriminating.

"I remember attending, just a few years ago, a lecture at the university, and the woman was showing some videos on the internet to demonstrate a point in her lecture. And off to the side of a YouTube video she was showing, there was an advertisement: Are you depressed? We have the answer to your mental health issues. Now, it could be a coincidence that those ads were served, but I can't erase from my mind the image that this is a person that may have had that issue."

At a time when Facebook and Amazon and Google have woken people up to the value of their data, this new crop of companies that want to pay you for your data are exploiting frustration with this model. Here's Kamal again: "The way the process exists today, no one is really winning except for these intermediary data brokers. So I think this idea of let's empower patients to aggregate, curate, and share their health data is something that's becoming more common and more mainstream."

This is language you hear a lot in this world. The websites for these companies are filled with trust-inspiring words like transparency and privacy. They promise control and ownership over your data. They also suggest that your data will be put to better use. The data bought and sold by traditional healthcare data brokers is often riddled with errors, and without some serious sleuthing, the people using it have no way of following up with the people the data came from if they want to ask follow-up questions. It also hasn't really delivered the scientific or medical benefits that it could. Like I mentioned earlier, it mainly gets used for commercial purposes.

But I wondered if giving your data away can ever really be an empowering move. Another company I gave my data to, Luna DNA, actually got permission from the Securities and Exchange Commission to give users shares of the company in exchange for data.
Luna's pitch made me think of Facebook again, and how surprised Facebook users were when they realized exactly how their data was being used, how they were paying for using the site. Luna CEO Bob Kain told me that giving people an ownership stake helps them trust that their data isn't being misused. "Our data is very much ours. It's as unique as anything can get to defining us, and so it's one of those rights that nobody can take away from us."

Luna has set up a complicated corporate structure in order to make this happen. The database itself is actually a subsidiary of Luna, and that is what people get shares of. Since soft-launching earlier this year, the company has been busy building partnerships with groups like rare disease foundations, hoping patients with those diseases will contribute data that eventually leads to treatments. And when the database turns a profit by, say, selling information to researchers, everyone gets a cut.

"Yes, so those shares are yours. They're non-transferable, because we don't think you can transfer the right to control your data, and they really represent your data in this system. And you're consenting for your data to be used at a population level to help researchers answer sort of higher-level questions about links between your genome and your health, or social determinants of health. How they're used is, when we sign up with a commercial company, for instance a pharma, and they pay us for access to the database, the proceeds will be shared with the community based on your ownership."

I signed up for a Luna DNA account. I answered a few surveys about my health and shared my 23andMe data. For that, I got fifty-four shares. According to Luna's filing with the SEC, each of those shares is currently worth about seven cents. Based on those numbers, your whole genome would be worth about twenty-one dollars.
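For perspective, here is the arithmetic on those rewards in a few lines of Python. It uses only the figures quoted in this episode; the per-genome share count is inferred from them rather than reported anywhere.

```python
# Back-of-the-envelope math from the figures quoted in this episode.
SHARE_PRICE = 0.07                # Luna's SEC filing: ~7 cents per share

luna_payout = 54 * SHARE_PRICE    # surveys + a 23andMe upload earned 54 shares
genome_shares = 21 / SHARE_PRICE  # a ~$21 whole genome implies ~300 shares
                                  # (inferred from the two figures, not reported)

# Nebula: 650 of the 1,000 credits needed for a sequencing that
# otherwise sells for about $100.
nebula_credit_value = 650 / 1000 * 100

print(f"Luna shares earned:        ${luna_payout:.2f}")
print(f"Implied shares per genome: {genome_shares:.0f}")
print(f"Nebula credits earned:     ~${nebula_credit_value:.0f} toward a $100 kit")
```

Even taking the companies' numbers at face value, the rewards come to a few dollars here and a partial sequencing kit there.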
There are constantly new companies like Luna and Nebula popping up, and I shared my data with a bunch of them. I really wanted to get a sense of the entire landscape, to see what you can get when you give your own data away.

Doc.ai gives people Amazon gift cards for sharing data and participating in trials. I didn't get anything for uploading my genome, and one of the trials I enrolled in a waitlist for would only earn me thirty of the one thousand points I would need to get a ten-dollar Amazon gift card. Another company, Embleema, charges people to securely store their records on the blockchain, but you can also earn cryptocurrency for sharing your data. Yet another, Hu-manity.co, pitches itself as a go-between, brokering data for its users and helping them get a cut. On the company's app and YouTube channel, there were all these testimonials from people proclaiming that ownership of data is a human right. "I want to own my data." "I want my data as my property, because right now it is quite unclear." But Hu-manity, like the other companies, is also pretty early stage. There wasn't much I could do besides pledge my enthusiasm for the idea.

My little data brokering experiment was starting to feel less like a mission of self-empowerment and more like a waste of time.

I talked with Jorge Contreras, a law professor at the University of Utah who thinks a lot about these things. He was also skeptical. "I mean, the cynical view is that these companies are cropping up so that they can monetize the data, right. I mean, not so that patients can profit from the use of their health data, but so that these companies can become intermediaries and take a slice of every data transaction that comes along."

All of the companies I talked to told me that people really shouldn't be in it for the rewards anyway. It's about helping the progress of science and medicine, and they want to do it more transparently, and in doing so also collect higher-quality data.
But it's not that collecting health data doesn't have legitimate justifications, Jorge said. "If there's a doctor's office and twenty people show up with the same strange flu symptoms, we want that doctor's office to report that to the CDC, and we want them to figure out what's going on and then to develop a vaccine. There are a million contexts where it's important for health care providers to be able to provide data."

His issue was more that this new breed of data brokers could actually wind up making things more complicated by starting to treat data like legal property. If your data is legal property, all of a sudden there are all these new issues, like asking permission every time someone wants to use it, and figuring out things like who controls your data when you die. "If you were in a situation where the Centers for Disease Control, or hospitals, or pharmaceutical companies, vaccine companies, needed to figure out how to pay somebody every time they wanted to use some data, you know, the system would (a) become much less efficient and (b) become much more expensive. Both of those are not good for public health," Jorge said.

Ironically, in this new model, people might wind up giving away even more information and having fewer protections for it. "They're better protected under the current system with HIPAA than they are with these data brokers, who, you know, are pretty uncontrolled, unregulated, and you basically just have to trust them, although they're really just profit-oriented startups at this point. So I would honestly rather trust my hospital than one of these data brokers to use my data properly."

I talked with a few other experts in the space, though, that disagreed with Jorge.
Adam, the guy who wrote the health data book, and Eric Topol, a geneticist who has written a lot about patient access to data, both told me they were actually optimistic, so long as these companies are transparent and give people choices in how their data is used. The point of collecting all this data, after all, is to help advance science and medicine, to find cures for devastating diseases, and to understand more about how the human body works. It's hard to argue with that.

But in reporting this story, I couldn't help but think about another story: the now-famous story of Henrietta Lacks. Henrietta Lacks was a Black woman who died of cancer in the South in the nineteen fifties after getting pretty lacking medical treatment. But doctors harvested her cells, which turned out to have some special characteristics: they didn't die. The HeLa cells proved invaluable to medical research, and with their help many companies got rich, while Henrietta's family at times barely got by.

What if the genome you were paid twenty-one dollars for winds up leading to a billion-dollar cure? Most of us have pretty average data, but some of us don't. It was a thirty-two-year-old aerobics instructor in the Dallas suburbs that led researchers to a mutation in the gene PCSK9 that seems to lower levels of bad cholesterol. It was a finding that led multiple companies to pursue therapies that could one day rake in billions. If our medical data does lead to a cure, should we get a cut?

At the end of all this, I had given my health data away to many different companies, and in return I'd gotten halfway to a free DNA sequencing and fifty-four shares worth seven cents apiece. These companies all make the argument that this was empowering, that I was taking control of my own information. I want to help advance medical research, but there was something disingenuous about the suggestion that sharing my information would be beneficial to me.
Instead, it felt like I had just been complicit in harvesting my own information for other people to profit off. I definitely did not feel empowered.

And that's it for this week's Prognosis. Thanks for listening. Do you have a story about health care in the US or around the world? We want to hear from you. Find me on Twitter at the Cortes. If you were a fan of this episode, please take a moment to rate and review us. It helps new listeners find the show.

This episode was produced by Liz Smith. Our story editors were Drew Armstrong and Rick Schine. Francesco Levi is head of Bloomberg Podcasts. We'll be back on June sixth with our next episode. See you then.