Nature versus nurture. There have always been questions about which one contributes more to who we all become as adults. But imagine a future where the way you're raised is based on your essential nature, where your parents believe your genetic makeup is already pointing you to be a musician, or a doctor, or a math professor, and they're just paving the way. All of this is based on the results of a DNA test that you got just hours after your birth. In China, that future is happening now.

Welcome to Prognosis, Bloomberg's podcast about the intersection of health and technology and the unexpected places it's taking us. I'm your host, Michelle Fay Cortez. Throughout this season, you've heard about all the ways the proliferation of healthcare data, especially from genetic testing, is creating opportunities and challenges. That includes privacy concerns about who gets the information and how it's used. But that all-out embrace of DNA testing is perhaps even more striking in China. This year alone, nearly four million Chinese are expected to take part in genetic testing to learn about their families and their health risks.
But they're also pushing the boundaries of what these test results mean and what they can tell us about ourselves and our futures, from the very beginning of our lives. There are questions about what is scientifically possible or provable. This leading edge of DNA determinism is happening in a country where the very concept of data privacy, and what consumers want and expect, is vastly different from Western countries. For this episode, Bloomberg's Jan Hob reports with April Ma and Danielle Away across three cities in mainland China and Hong Kong. Here's Jan with the story.

This jogging track in a sprawling Beijing park has become a regular hangout for Lu Feilong. The park overlooks a lake and is nestled among blocks of residential high-rises. Well, I usually try to get out here to this park a few times a week to go running. The thirty-six-year-old investor in Chinese web apps has made his workout here a priority after getting a genetics test online earlier this year. You know, I have a family history of diabetes. My father and my aunt both have it and need insulin to survive.
I do have high blood sugar, so I wanted to see if I was also at risk, through a genetics test. The test indicated he's four times more prone to diabetes than the average person, so he changed his diet and began following the exercise regimen that came with the test results. And by spitting into that test tube, Lu joined China's DNA testing boom. China's consumer genetics market reached sales of about one point four billion dollars and is expected to grow nearly twenty percent a year, according to ResearchAndMarkets.com. China is one of the fastest growing markets globally for DNA tests, and there are dozens of companies to choose from. There's even a company with a name that echoes 23andMe: 23Mofang. They range from big outfits like BGI Genomics, reportedly the world's largest genetics research center, to lots of small startups like OneGene and GeneBox. That last company offers a very basic genome test for less than a dollar fifty.
Yep, that's right: one dollar and fifty cents, shipping included. In just a few years, nearly sixty million Chinese will have given a sample of their DNA by taking a consumer genetic test. That's according to Yiou, a Beijing research firm. The country is only beginning to grapple with a number of big questions, like who has access to all that DNA data, how is it being used, and what happens when an authoritarian government potentially has access to the genetic blueprint of millions of its citizens. And the thorny issues over health data protection are surfacing as China becomes something of a Wild West for genetics, thanks to rapid and mostly unchecked growth. Ethical debates were raised just this year after Chinese scientists put human brain genes into monkeys and induced mental illness in other gene-edited primates. But the Chinese genetics experiment that caused the loudest international outcry was scientist He Jiankui's gene editing of the embryos of twin girls last year. He said he edited their DNA to give them immunity to HIV infection.
The scientific community, including Chinese scientists, condemned the experiment and called it unethical. Now, this kind of gene surgery is banned in so many countries. The medical and scientific community are very upset about this. They are outraged. China said his work violated government rules. Still, the controversy exposed the lack of oversight in the country's genetics field. There's a sense that the government doesn't want to impose too many regulations because it wants the industry to leapfrog advances in the US and Europe. There's almost no regulation of the consumer market for DNA testing either. That makes it possible for Chinese companies to give out test results that go beyond what American companies would be allowed.

This genetics boom is also happening against the backdrop of China stepping up mass surveillance of its citizens. Video cameras are posted on most city blocks, millions of them, as China amasses big databases of information on its own people. There are questions about how the growing consumer genetics industry might feed into that. And if you ask Chinese consumers about protecting their genetic code, what makes you you, most will shrug.
This seems especially true among the many new parents who get their offspring sequenced in hopes of discovering innate skills and future potential supposedly written in their DNA.

Zhou Shaoying is playing with her two-year-old son in their Shanghai home. Days after little Baiye was born, his mom decided to have his DNA analyzed. Testing of babies in the womb and as newborns is one of the fastest growing sectors of the genomics business here. I did it for two reasons. One is to know about his talents in the future, so that I can set a direction for him. And then I wanted to know about his health risks, whether he has any genetic diseases, so that I can take preventive measures. And with one point four billion people and intense competition for schools and jobs, parents are trying to give their little prince or princess every advantage possible. The company salesman pitched that the test can show whether, genetically, her days-old son is gifted in arts, music, or math. So what did the genetic crystal ball tell her?
The test results show that my son has talent in the arts, especially in music. It says he's strong in creativity and weak in sports. I think it's very accurate. He's two years old now, and I have noticed that he can recognize a song after hearing it for the first time. If I ask him to hum the song, he can also do it. I'm surprised.

Before the sales rep swabbed little Baiye's mouth with a cotton tip, his mother remembers signing paperwork. She thinks the paper might have been a disclosure form. It's a piece of paper with writing on both sides. It contained information about what your DNA is used for. I didn't look at it carefully, so I don't know if there is any mention in the document that it can only be used for research and not for other purposes. I didn't really pay attention.
Zhou, a former bank employee, sounds casual about protecting her son's genetic blueprint, and her nonchalance is, in a way, a reflection that Zhou doesn't think she has any real say over how the data may be passed around or used, or whether the government might one day access it. As an individual citizen, it's beyond our ability to control it. We have little power. Zhou and many other Chinese don't think about privacy and the safeguarding of private data in the same way as Westerners, and they're likely to view the government's reach into their lives as benign. Why? Because we are in China. If the government obtains the data and uses it for something, we don't seem to have the right to oppose it. But I believe if our government is doing it, it must be doing it for a good cause and will protect our personal data. Even some Chinese law experts share the view that government access to individual data is not a big deal. I think that when the state collects its citizens' genetic information, it has a different purpose.
That's Li Shaonan. She's an associate professor who teaches classes in genetics law at Peking University, one of China's top schools. Commercial companies that are providing genetic testing services are going after the bottom line, to make a profit, but the state is collecting the data for scientific research and for the cure of diseases. So personally, I trust the state more, because they don't have any commercial intentions.

Such an open, unskeptical embrace of an authoritarian government's agenda may leave Westerners scratching their heads. But let's step back for a second. The different cultural approaches to privacy come down to the word itself. The Chinese word for privacy is yinsi, and the characters that make up the word carry the connotations of hidden secrets. Tiffany Li, who is a resident fellow at Yale Law School's Information Society Project, points out that cultural norms around privacy on the mainland are often more about protecting a person from shame rather than the protection of individual liberty. The Chinese people are more open, or less sensitive, about the privacy issue.
If they are able to trade privacy, say, for convenience, for safety, for efficiency, in a lot of cases, they're willing to do that. That's Robin Li, the founder of search engine Baidu. His comments at a forum in Beijing last year were widely criticized online by his countrymen, a sign that attitudes may be shifting. Still, how Chinese approach data protection is often shaped by having grown up under a communist regime and within a Confucian culture that teaches that the government will always enact policies for the common good. Contrast that with the EU's new data protection law that came into effect last year and forced multinationals to scramble to comply. It specifically outlines that the protection of personal data is a human right.

China is also taking measures to better govern how genetic data is handled. New rules that went into effect this month lay out protections for consumers. They call for companies to inform customers about how their DNA data will be protected and give them the right to opt out at any time.
The good thing is that we see that China is trying to implement this international standard, which is the so-called prior informed consent for the use, preservation, or collection of genetic data. That's Jyh-An Lee. He's an associate law professor at the Chinese University of Hong Kong, and he studies cyber law and privacy issues on the mainland. The government is trying to send a message to the market that it is going to play tough on the violation and misuse of all this genetic information. I do expect there will be some benchmark cases, probably in the first two to three years. Last year, the government handed down fines to the multinational pharma company AstraZeneca and five domestic firms for sharing DNA samples or genetic data with other organizations in China and outside the country. The new rules prohibit the sale of genetic data except for scientific research, and put restrictions on foreign companies. But there's one provision of the regulation that should raise concerns. The other very prominent provision that I see from this law is that the government can have access to this genetic data.
So for the purpose of public health, national security, and public interests, the government can access this preserved genetic data. And it's actually not very clear what is the purpose of public health, national security, and also public interests. National security can actually be defined very broadly; that might include everything. So what kind of genetic information could be valuable to the government? The government might have much more information about your family relations, or your ancestry, or what you did before. For example, if anyone has an illegitimate child, that's easy to track. So if anyone, or any entity, has access to that kind of database, or unlimited, unbalanced access to that kind of data, that would be very, very dangerous. Lee believes the government has huge databases, or can get access to all sorts of information on its citizens: financial records (did you pay your bills on time?), digital payments (how many luxury watches did you buy?), speech and online activity on the internet. What if all that information could be combined on any one individual? Well, there is a plan to merge some of that.
Some provinces have begun to implement what will become a nationwide social credit system that melds personal financial data with behavior. Under this new system, every citizen is ranked. If you owe money or don't pay your taxes, you run red lights, or don't pick up after your dog, then points are deducted. Good behavior and deeds, like donating blood and money, improve your score. Those with low credit scores may lose access to benefits or services, while those with good credit scores are given priority and access. Already, Chinese have been prevented from buying plane and rail tickets because of their social credit scores. The system is still in its early days, and there's still debate whether the project is aimed at increasing surveillance or if it's China's unique way to incentivize citizens to uphold laws. Many Chinese are actually in favor of the system. They say it promotes good behavior and engenders trust. With that unfolding on the ground, Lee is concerned about what will happen as DNA is thrown into the mix.
There is a possibility that all these different kinds of personal data, including the biological one and the digital one, will be connected together. But we have no idea how that combination will be used against you, because the technology is still developing. For most people, I don't think they will feel very comfortable if they have that kind of information accessed by any parties other than themselves. Not to mention that a lot of the information actually is not known even to you yourself. We were hard-pressed to find experts in China raising concerns about the government's potential access to genetic data. That may be partly because the consumer genetics industry is still relatively small and developing in the country. Well, I've never thought about how genetic testing can be used to maintain a stable society. That's Li Shaonan of Peking University again. She's trying to figure out the criticisms and reservations that Westerners have about the Chinese government or police getting their hands on individual DNA information. How would that be possible?
Would they find out through gene tests which individuals have a genetic mutation that shows they're more likely to raise a rebellion? Is that what people in the West think, that the Chinese government is going to predict which people are going to cause chaos through genetic testing and then put them under higher surveillance? Actually, some critics say some version of that scenario is already playing out. The United Nations and the United States say more than one million Uyghurs are being held in forced re-education camps. They are a Muslim ethnic group that's under surveillance. China says it's fighting separatism there, in Xinjiang, a region of western China. The authorities are requiring people from the age of twelve to sixty-five to all be submitting DNA samples to be put in a searchable DNA database. That's Maya Wang, a senior researcher for Human Rights Watch focused on China. The organization says it has documented that Chinese police have collected forty million DNA entries from ordinary Chinese not connected to crimes.
The nonprofit says the nation's Ministry of Public Security started building a searchable national DNA database in the early two thousands as part of a police project known as the Golden Shield. A lot of these AI systems then feed into the next layer of surveillance, which is the use of big data programs by the police to track and monitor people's relationships and where they go, in Xinjiang, for example, where this surveillance is most intrusive and visible. That information is also used to control people's movement, and also to analyze and put certain people who are politically untrustworthy into political indoctrination camps. So in that context, the collection of DNA is problematic, because it is part of a bigger program of mass surveillance and gathering big data on people with the explicit purpose of social control. With DNA, police can identify who's related to whom. Once an individual is flagged as suspicious or a threat, she says, entire families could also be marked. She says the biological information being collected here could be used to isolate a group of people and discriminate against them.
292 00:19:34,440 --> 00:19:37,879 Speaker 1: China's Foreign Affairs Ministry didn't comment on our questions about 293 00:19:37,960 --> 00:19:41,719 Speaker 1: DNA being gathered in Shinjang. At a recent press briefing, 294 00:19:41,800 --> 00:19:45,280 Speaker 1: the Foreign Ministry spokesman said the camps are vocation training 295 00:19:45,280 --> 00:19:49,800 Speaker 1: centers that counter terrorism, and with so many consumer genetic 296 00:19:49,840 --> 00:19:54,440 Speaker 1: companies collecting DNA data. There's concern that Chinese government could 297 00:19:54,520 --> 00:19:58,800 Speaker 1: compel companies to turn data over with impunity. The rules 298 00:19:58,880 --> 00:20:04,120 Speaker 1: allow for the Chinese government to basically access data without 299 00:20:04,880 --> 00:20:09,800 Speaker 1: any procedures or legal oversight or any kind of you know, 300 00:20:10,240 --> 00:20:12,920 Speaker 1: restrictions on what they can access and what they can 301 00:20:12,960 --> 00:20:18,600 Speaker 1: take um and that has implications well for one billion people, 302 00:20:18,640 --> 00:20:27,600 Speaker 1: but also some of these companies also work abroad. Here 303 00:20:27,680 --> 00:20:30,960 Speaker 1: at a busy crosswalk in the manufacturing hub of Shenzen, 304 00:20:31,400 --> 00:20:36,040 Speaker 1: the surveillance of ordinary citizens is on public display. Across 305 00:20:36,119 --> 00:20:39,240 Speaker 1: the street is a gigantic TV monitor that displays the 306 00:20:39,280 --> 00:20:43,000 Speaker 1: photos of people who have gotten caught jaywalking. If the 307 00:20:43,080 --> 00:20:46,119 Speaker 1: software can identify a jaywalker, they're assessed to find that 308 00:20:46,240 --> 00:20:52,760 Speaker 1: sent us a text message to their phone. 
The city is also home to the headquarters of genetics firm WeGene, a popular consumer genetics testing company that started in twenty fourteen and has tested the DNA samples of about three hundred thousand users. Company CEO Chen Gang welcomes the country's new rules on managing genetic data and says they won't disrupt business. The genetic data WeGene collects from customers is only being used inside WeGene. We have not and will not share such data with other organizations. Chen says the government hasn't come knocking on its doors about the DNA data it's been collecting. And what if the government ever asked for the data? Whether it's a tech company or a genetic testing company, we may have to provide data in situations where it's required, but so far there isn't an order on this matter. If there's a regulation requiring companies to do it, there's no way a company can refuse.

One very significant project between the government and a consumer genetics company is unfolding here in Shenzhen. This hub of more than thirteen million people is also home to BGI Genomics, one of the world's largest genome sequencing companies.
The firm is also 328 00:22:10,200 --> 00:22:14,879 Speaker 1: the biggest provider of prenatal genetic testing in China. The 329 00:22:14,920 --> 00:22:18,480 Speaker 1: publicly listed company is also charged with running China's National 330 00:22:18,560 --> 00:22:22,320 Speaker 1: Gene Bank, a huge database and repository of DNA and 331 00:22:22,320 --> 00:22:26,320 Speaker 1: biological samples from millions of Chinese across the country, and 332 00:22:26,400 --> 00:22:28,919 Speaker 1: it aspires to be the biggest such center in the world. 333 00:22:29,320 --> 00:22:32,240 Speaker 1: China spent nearly one billion dollars to fund the first 334 00:22:32,280 --> 00:22:35,760 Speaker 1: phase and started the Gene Bank for health and disease research. 335 00:22:37,200 --> 00:22:40,360 Speaker 1: Under the new rules, there's a high chance that no 336 00:22:40,400 --> 00:22:43,760 Speaker 1: one will ever know when and how the government or 337 00:22:43,840 --> 00:22:47,800 Speaker 1: law enforcement might invoke public health, national security, or public 338 00:22:47,880 --> 00:22:51,920 Speaker 1: social interest reasons to access the genetic data stored there. 339 00:22:52,600 --> 00:22:55,520 Speaker 1: The rules don't outline a process for checks on the 340 00:22:55,560 --> 00:22:59,840 Speaker 1: government's access. BGI said customers' data belongs to 341 00:22:59,880 --> 00:23:03,560 Speaker 1: those clients and it respects national laws. In other 342 00:23:03,680 --> 00:23:08,280 Speaker 1: Chinese cities, some private genetics companies are working with law enforcement. 343 00:23:08,800 --> 00:23:12,280 Speaker 1: A handful of companies advertise as a selling point that 344 00:23:12,359 --> 00:23:16,399 Speaker 1: they share children's DNA data with police. Chinese police launched 345 00:23:16,400 --> 00:23:19,800 Speaker 1: a database in two thousand nine to collect DNA information 346 00:23:19,920 --> 00:23:23,879 Speaker 1: of children. 
The program is aimed at preventing child kidnapping 347 00:23:24,000 --> 00:23:34,359 Speaker 1: and trafficking. Zhou is a mom of two kids and 348 00:23:34,440 --> 00:23:38,160 Speaker 1: says she wouldn't hesitate to add her children's genetic blueprint 349 00:23:38,280 --> 00:23:43,919 Speaker 1: to a police database. What if your kid does go missing, 350 00:23:44,200 --> 00:23:47,600 Speaker 1: and there are still traffickers out there? That's pretty scary. 351 00:23:48,119 --> 00:23:51,600 Speaker 1: We are limited as parents in how we can protect our kids, and 352 00:23:51,680 --> 00:23:55,120 Speaker 1: if there are other institutions and the government that are 353 00:23:55,200 --> 00:23:59,040 Speaker 1: doing this together for our children, I'm okay with it. 354 00:23:59,480 --> 00:24:07,520 Speaker 1: The 355 00:24:07,560 --> 00:24:11,840 Speaker 1: government's latest regulation on the management of genetic material addresses 356 00:24:11,960 --> 00:24:16,840 Speaker 1: other Chinese fears. The provision bans foreign companies from collecting 357 00:24:16,920 --> 00:24:20,520 Speaker 1: and storing the DNA of Chinese in the country, or 358 00:24:20,600 --> 00:24:25,080 Speaker 1: sending Chinese data overseas. After China's FDA fined 359 00:24:25,119 --> 00:24:29,600 Speaker 1: companies last year for mishandling genetic data, articles circulated 360 00:24:29,640 --> 00:24:33,000 Speaker 1: online that the DNA that was illegally exported could be 361 00:24:33,160 --> 00:24:41,320 Speaker 1: used by foreign entities to harm Chinese. Here again is 362 00:24:41,400 --> 00:24:44,280 Speaker 1: Lou Fey Long. He's the web app investor we first 363 00:24:44,359 --> 00:24:46,840 Speaker 1: met on the jogging track at the beginning of our story. 364 00:24:47,480 --> 00:24:51,680 Speaker 1: He's college educated and highly informed. 
He sees the pitfalls 365 00:24:51,840 --> 00:24:58,400 Speaker 1: of Chinese DNA in the wrong hands. The larger problem is 366 00:24:59,160 --> 00:25:01,960 Speaker 1: if an organization gets hold of these genes of 367 00:25:02,040 --> 00:25:05,760 Speaker 1: our ethnicity, or of Chinese people, they might be able 368 00:25:05,800 --> 00:25:08,920 Speaker 1: to extract from it where our people are different from 369 00:25:08,920 --> 00:25:23,280 Speaker 1: others and then target those vulnerabilities. In Beijing, art 370 00:25:23,280 --> 00:25:26,640 Speaker 1: curator Jung Tang Wawin is working in his studio. 371 00:25:27,160 --> 00:25:29,879 Speaker 1: He and his family members have all given samples for 372 00:25:29,920 --> 00:25:33,080 Speaker 1: genetic tests because they wanted to dig into their health risks. 373 00:25:33,480 --> 00:25:36,800 Speaker 1: He's among the minority of consumers who's actually given some 374 00:25:36,840 --> 00:25:40,879 Speaker 1: thought to genetic data protection. He imagines a future in 375 00:25:40,920 --> 00:25:44,520 Speaker 1: which the government could use DNA to dictate professions and 376 00:25:44,680 --> 00:25:50,840 Speaker 1: life choices. I'm worried that if the government has 377 00:25:50,880 --> 00:25:55,919 Speaker 1: control over genetic information, this data will be incorporated into 378 00:25:56,000 --> 00:26:00,200 Speaker 1: each person's ID. Different groups will be separated by their 379 00:26:00,280 --> 00:26:05,040 Speaker 1: genetic information. His fears may not be as paranoid as 380 00:26:05,080 --> 00:26:09,160 Speaker 1: they sound. Chinese genetic testing companies now routinely give out 381 00:26:09,200 --> 00:26:12,280 Speaker 1: results that tell customers if they're at risk for depression 382 00:26:12,359 --> 00:26:15,800 Speaker 1: or mental disorders. 
Parents test their newborns to see if 383 00:26:15,800 --> 00:26:19,360 Speaker 1: they're likely to suffer from poor retention and memory. Yet, 384 00:26:19,359 --> 00:26:23,359 Speaker 1: despite those fears, Jung is determined to cull valuable information 385 00:26:23,480 --> 00:26:26,960 Speaker 1: from his own DNA now, before private companies or the 386 00:26:27,040 --> 00:26:30,520 Speaker 1: government does. He says everyone should have health data on 387 00:26:30,560 --> 00:26:33,760 Speaker 1: themselves and be able to enjoy the lighter side of 388 00:26:33,800 --> 00:26:36,720 Speaker 1: it too, like knowing if your lineage can be traced 389 00:26:36,720 --> 00:26:41,840 Speaker 1: back to an ancient emperor. Sure, I know it 390 00:26:41,960 --> 00:26:45,080 Speaker 1: all comes with a price. There's no such thing as 391 00:26:45,160 --> 00:26:49,320 Speaker 1: absolute privacy. It just depends on the value of the information. 392 00:26:49,840 --> 00:26:52,760 Speaker 1: So I choose to overlook it because I'm after the 393 00:26:52,880 --> 00:26:57,119 Speaker 1: health information and the entertainment from the testing. I have 394 00:26:57,280 --> 00:27:00,800 Speaker 1: no control over the privacy of the information, and it's 395 00:27:00,800 --> 00:27:04,800 Speaker 1: more proactive to get the results now than wait for the 396 00:27:04,920 --> 00:27:13,479 Speaker 1: day it will be accessed anyway. This millennial's approach 397 00:27:13,520 --> 00:27:17,280 Speaker 1: to data privacy is both fatalistic and proactive at the 398 00:27:17,320 --> 00:27:20,040 Speaker 1: same time. If he were in the US, he might 399 00:27:20,040 --> 00:27:22,639 Speaker 1: be among those selling their DNA data in exchange for 400 00:27:22,680 --> 00:27:26,560 Speaker 1: gift cards, cryptocurrency, or shares to any number of companies 401 00:27:26,600 --> 00:27:32,000 Speaker 1: looking to buy it. 
Whatever China's consumer genetics market evolves into, 402 00:27:32,480 --> 00:27:35,400 Speaker 1: it will be shaped by the fears, needs, and cultural 403 00:27:35,480 --> 00:27:39,040 Speaker 1: values of more than a billion Chinese consumers, and there's 404 00:27:39,040 --> 00:27:41,480 Speaker 1: a good chance it will look nothing like what we 405 00:27:41,520 --> 00:27:53,679 Speaker 1: could ever imagine. Stay tuned. And that's it for the 406 00:27:53,720 --> 00:27:57,080 Speaker 1: second season of Prognosis. We hope you enjoyed hearing about 407 00:27:57,080 --> 00:28:00,200 Speaker 1: all the promise and privacy issues that are swirling around 408 00:28:00,240 --> 00:28:04,080 Speaker 1: healthcare data. We still want to hear from you, especially 409 00:28:04,080 --> 00:28:06,000 Speaker 1: if you have a story about healthcare in the US 410 00:28:06,280 --> 00:28:09,040 Speaker 1: or around the world. Find me on Twitter at aa 411 00:28:09,200 --> 00:28:12,880 Speaker 1: Cortes or email me m Cortes at Bloomberg dot net. 412 00:28:13,800 --> 00:28:15,880 Speaker 1: If you were a fan of this episode, please take 413 00:28:15,880 --> 00:28:18,560 Speaker 1: a moment to rate and review us. It really helps 414 00:28:18,560 --> 00:28:22,000 Speaker 1: new listeners find the show, and don't forget to subscribe. 415 00:28:23,440 --> 00:28:27,040 Speaker 1: This episode was produced by Lindsay Cratterwell. Our story editor 416 00:28:27,080 --> 00:28:30,639 Speaker 1: was Rick Shine. Special thanks to Mung Ling Huang and 417 00:28:30,720 --> 00:28:35,479 Speaker 1: Drew Armstrong. Francesca Levy is head of Bloomberg Podcasts. Thanks, 418 00:28:35,640 --> 00:28:36,640 Speaker 1: and we'll see you next time.