Speaker 1: Now, DNA testing company 23andMe is being taken to court by a huge number of US states, because what's going on is that they want to sell off customers' personal genetic information, but without customers' knowledge or consent. The company holds DNA info for more than fifteen million people, and many of them will be in New Zealand as well. Rick Shearer is a privacy lawyer. Hey, Rick.

Speaker 2: Hi, how are you?

Speaker 1: Well, thank you. If you've sent your DNA in to this lot, should you be worried?

Speaker 2: I've said that right from the start. It's highly tempting, obviously. We all want to know where we came from and who we're related to, but one never knows where these things are going to end up.

Speaker 1: Where are they going to end up? So apparently a pharmaceutical company said last month that it would pay two hundred and fifty-six million US dollars for it. What do they do with it?

Speaker 2: Yeah, well, they'll use it to develop pharmaceuticals, no doubt, and for any other purposes that they might want to use it for. They say, of course, that they will abide by the privacy policies and so on that the company already had. But those, like many privacy policies, are a little bit opaque. You don't quite know where it's going to end up or what they're going to do with it.

Speaker 1: So what's the worst that they can do with it, Rick? I mean, because increasingly, I'm sure you're experiencing this, we are living in a world where our information is just widely available to companies, Google, et cetera, right? So what are these guys going to do with it? What should we be worried about?

Speaker 2: Well, we don't know what they're going to do with it; that's the problem.
And terms of service and privacy policies can be changed almost willy-nilly these days. We're used to getting the email in our inbox saying, oh, by the way, our privacy policy has changed; if you continue to use our service, then you've accepted it. The difficulty here, I think, is that that's all very well where you know you're giving up some sort of personal information, some data, even a credit card. You can change your credit card when you like; you can't change your DNA. So once it's out there, once it's gone, it's gone for good. And of course, with the increases in technology, the use of DNA for all sorts of things, including identifying us just for run-of-the-mill types of things, is going to become more prevalent, so the ability for people to impersonate us using DNA is likely to increase as well.

Speaker 1: Hey, very good point, Rick. Thanks very much, appreciate it. Rick Shearer, privacy lawyer. For more from Heather du Plessis-Allan Drive, listen live to Newstalk ZB from four pm weekdays, or follow the podcast on iHeartRadio.