June Grasso: This is Bloomberg Law with June Grasso from Bloomberg Radio. South Korea is fighting the pandemic through extensive surveillance of its citizens: credit card transactions, cell phone data, and surveillance camera footage. Should the US allow this massive intrusion into our privacy to stem the pandemic? Joining me is Harold Krent, professor at Chicago-Kent College of Law. So, Harold, you have this great public health concern versus privacy concerns. How do you balance that in this country?

Harold Krent: It's impossible to balance them well, and we saw that in the wake of 9/11. I view this as a kind of pendulum: we have more privacy in times of peace, we have less privacy in times of war, and certainly we have a war against the pandemic right now. It remains to be seen what will happen in this country, but there has been some success with, I think I could say, two different types of surveillance mechanisms in other countries such as Israel, South Korea, and Singapore. And of course China is a leader in all of this as well. Just to set the stage briefly, the technologies can be used, first, to detect people who are at risk of catching the virus. If you take someone who has caught the virus and you look at their credit card data and their cell phone location data, one can easily create a web of contacts and alert people that they may have been exposed. Indeed, you could take it a step further, as some countries have, and impose a mandatory quarantine on those people who may have been exposed to the virus. So that's the first set of tools that have been used.
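To make that web-of-contacts idea concrete, the following is a minimal illustrative sketch of how location pings might be cross-referenced against a confirmed case. The data layout, the 10-meter proximity cutoff, and the 15-minute window are assumptions chosen for demonstration, not the rules of any actual tracing system.

```python
# Illustrative sketch only: cross-referencing location pings to build a
# "web of contacts" around a confirmed case. The thresholds (10 m, 15 min)
# are invented for demonstration, not any real agency's parameters.
from dataclasses import dataclass
from math import radians, sin, cos, asin, sqrt

@dataclass
class Ping:
    user_id: str
    timestamp: int   # Unix seconds
    lat: float
    lon: float

def meters_apart(a: Ping, b: Ping) -> float:
    """Haversine distance between two pings, in meters."""
    r = 6_371_000  # Earth radius in meters
    dlat, dlon = radians(b.lat - a.lat), radians(b.lon - a.lon)
    h = sin(dlat / 2) ** 2 + cos(radians(a.lat)) * cos(radians(b.lat)) * sin(dlon / 2) ** 2
    return 2 * r * asin(sqrt(h))

def web_of_contacts(case_id: str, pings: list[Ping],
                    max_m: float = 10.0, max_s: int = 900) -> set[str]:
    """Return user ids whose pings were near the confirmed case in space and time."""
    case_pings = [p for p in pings if p.user_id == case_id]
    contacts = set()
    for p in pings:
        if p.user_id == case_id:
            continue
        if any(abs(p.timestamp - c.timestamp) <= max_s
               and meters_apart(p, c) <= max_m
               for c in case_pings):
            contacts.add(p.user_id)
    return contacts
```

The ids this returns are the people who would be alerted, or, in the stricter regimes described above, placed under mandatory quarantine.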
The second set of tools is used to enforce a quarantine. Right now we have shelter-in-place orders in large parts of this country, but no one really knows whether those rules are being abided by. You can create a mandatory app on a cell phone that detects when people leave their quarantine, you can use location data to ensure that they're there, and you can back that up with surveillance cameras, facial recognition through AI, and so forth. So technology does have a lot of promise, both to detect people who may have been exposed to the virus and also to enforce quarantines, if we have to. But the question, as you posed in the beginning, is: should we? Does this sort of foretell a kind of defeat of privacy in the long run?

June Grasso: Those measures that you mentioned, for example, making sure that people are staying in if they're quarantined. In China, I know that they're requiring people to have software on their phones that classifies each person with a color depending on their exposure to the virus. But this not only sounds like Big Brother, it starts to sound, should I say, un-American, to start tracking movements to that extent.

Harold Krent: You've mentioned China, which is sort of the height of AI and predictive analytics. They look at a basic set of factors, such as who you've been in contact with, what your health is, etcetera, to forecast what kind of danger you're in of contracting the virus in the future. Based on those predictive analytics, you are assigned a color, and the color allows you access to some government buildings or not, access to restaurants or not. And that really is very frightening, because our liberties are then being conditioned upon an algorithm, and an algorithm that may actually be faulty more times than not.
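The color-code scheme described here can be pictured as a simple risk-scoring rule. The factors, weights, and cutoffs below are invented for illustration; the real systems are opaque, which is precisely why an algorithm that conditions access can be so troubling when it is faulty.

```python
# Illustrative sketch of a color-code risk classifier of the kind described
# for China. Factors, weights, and cutoffs are assumptions for demonstration;
# the actual systems are not public.
def assign_color(contacts_with_cases: int, recent_symptoms: bool,
                 visited_hotspot: bool) -> str:
    score = 0
    score += 3 * contacts_with_cases      # each known contact raises risk
    score += 4 if recent_symptoms else 0  # self- or clinic-reported symptoms
    score += 2 if visited_hotspot else 0  # location history vs. outbreak map
    if score >= 6:
        return "red"     # barred from public spaces, quarantine ordered
    if score >= 3:
        return "yellow"  # restricted access, e.g., no restaurants
    return "green"       # unrestricted movement

# A false positive here is not just a bad prediction: it directly revokes
# someone's access to buildings and restaurants, with no hearing.
assert assign_color(0, False, False) == "green"
assert assign_color(2, False, False) == "red"
```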
June Grasso: Can the public do anything to stop high-tech intrusions into their civil liberties?

Harold Krent: Well, I think the first point is that we do have our constitutional protections, and we do have statutory protections such as HIPAA for health information in this country. Any kind of measure that, for instance, would intrude upon or pass on our private health information, such as the fact that we had the virus, would now be subject to HIPAA. So Congress would have to debate this and decide to suspend HIPAA for the period of this pandemic, and so we'll have at least some ventilation, to use that word, of all the issues. And the Fourth Amendment does protect, according to the Supreme Court, at least the location data of our phones, so right now we have some protection there. But of course, if Congress passed a statute saying that in a limited emergency we should have the ability to track cell phone data so we can get a better sense of the spread of the virus and then stop it, that might be a sufficient governmental interest to override the privacy interests at stake. If we do that, the one thing that I would urge, if I were a legislator, is to put into the bill protections for the long term. I have in mind, I think, three, though other people may have others that they would suggest, like we had with the Patriot Act about twenty years ago. First, any bill that collects that information should be valid only for a short period of time, and Congress would have to reenact it within a matter of a year or two. That's one protection. Second, I think the data we would collect, because it is such sensitive data, should be destroyed after a certain amount of time. The fear is that once you have this data, you can massage it for so many purposes. You can think about undocumented individuals in this country who would not want to be tracked, and many other groups that might be politically adverse to the current government. So I think the data might be used carefully to help our fight against the pandemic, but if Congress does have this debate, I hope that they bake into law the provisions, that they enact real protections, to make sure this doesn't carry forward into a long-term erosion of our civil liberties.
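As a concrete picture of that retention safeguard, a collection program could stamp every record on ingest and destroy it after a fixed window. This sketch is purely illustrative; the 30-day window is a hypothetical value that an actual statute would have to set, alongside the sunset date for the program itself.

```python
# Illustrative sketch of the retention safeguard: surveillance records are
# timestamped on ingest and purged after a fixed window. The 30-day window
# is a hypothetical value; a real statute would set the period and sunset.
import time

RETENTION_SECONDS = 30 * 24 * 3600  # assumed 30-day retention window

class ExpiringStore:
    def __init__(self):
        self._records = []  # list of (ingested_at, record) pairs

    def add(self, record):
        self._records.append((time.time(), record))

    def purge_expired(self):
        """Destroy everything older than the retention window."""
        cutoff = time.time() - RETENTION_SECONDS
        self._records = [(t, r) for t, r in self._records if t >= cutoff]

    def records(self):
        self.purge_expired()  # never serve expired data
        return [r for _, r in self._records]
```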
June Grasso: When you look at one instance, New York has passed a law giving the governor unlimited authority to rule by executive order during a crisis. Now, any litigation about these laws passed by states or Congress would possibly take longer to get through the courts than the pandemic itself.

Harold Krent: That may be true, and we don't know. Obviously the governor's ability to rule by executive order doesn't eliminate our civil liberties, and we do have courts, but as a practical matter it's difficult to get this litigation resolved within a quick amount of time, particularly given that the courts are somewhat paralyzed. So we do have a risky situation here, and you can see that by looking at what's happened in Israel, Singapore, and Hong Kong, where there are vigorous civil liberties debates going on right now because of the surveillance activities taken by those governments. Even if it's in good faith, they can make mistakes. That can lead to case-by-case errors, as well as massive troves of information that could be used in the future for nefarious purposes.

June Grasso: And if you start with this, do you then risk opening the door to even more kinds of surveillance in the future, post-pandemic? Sort of like what happened before: we become used to government surveillance, then we become less worried about it.

Harold Krent: And I think that's been true too, because the private sector has so much information about us. If you think about what Google, Amazon, and Facebook know about us, we've become sort of inured to the loss of privacy. So some people may say, look, there is no privacy anymore; we might as well let the government use it, and maybe they'll do something better with it than Amazon, Google, and Facebook are doing right now.

June Grasso: Are there any indications that the Trump administration is considering any of these surveillance measures?
Harold Krent: My understanding is that the CDC is now embarking upon a study of what kind of surveillance activities can be used in the States. Obviously it's a little late here; so much of the virus has spread already that surveillance can't be used as effectively as it has been in other countries. But there are still some parts of this country where that kind of surveillance can be used quite effectively, and I know that they are looking at what other countries have done. But there are no concrete proposals today; I'd expect one within probably a couple of days.

June Grasso: There's also the problem of outing coronavirus patients, instances of outing so that people know your health data. In one instance, at the very beginning of the outbreak in New York, New York City Mayor Bill de Blasio posted details about the second person in New York to test positive for the virus, and the man was quickly identified as patient zero. Apparently there's a lot of that kind of outing going on in different countries. In Singapore they're posting information online about each coronavirus patient, and I think in South Korea as well. So that's another problem.

Harold Krent: Yeah, the whole idea of confidentiality in health information has long been an important part of our culture. We saw this with AIDS, with people who contracted AIDS about twenty years ago, and how awful it was if those names were posted in public. Indeed, some people argued that if the government did that, it would result in a due process violation. We're seeing that now: we saw that with de Blasio in New York, and we've seen it in other countries. I do hope that, at a minimum, even if this surveillance takes place, it takes place with a recognition that we can ensure some kind of privacy for those who are afflicted, and for their families as well.

June Grasso: Thanks so much, Harold. That's Harold Krent, professor at Chicago-Kent College of Law. Thanks for listening to the Bloomberg Law Podcast.
You can subscribe and listen to the show on Apple Podcasts, SoundCloud, and on Bloomberg dot com slash podcast. I'm June Grasso. This is Bloomberg.