June Grasso: This is Bloomberg Law, with June Grasso, from Bloomberg Radio.

[Audio clip from the movie Eagle Eye]

June Grasso: In the movie Eagle Eye, two people's lives are thrown into turmoil by calls from an unknown woman who knows their every move, using information and communications technology. Big Brother watching is a theme in many sci-fi and action movies, but a recent hack of a massive trove of security cameras may show how close to reality that's becoming. A group of hackers who breached security-camera startup Verkada were able to view the live feeds and archived videos of 150,000 cameras inside hospitals, police departments, prisons, schools, jails, factories, gyms, and corporate offices. One of the hackers said they found a username and password for an administrator account publicly exposed on the internet. Joining me is Reena Bajowala, a partner at Ice Miller.
June Grasso: What kind of privacy do you have when you enter a public facility? Let's say you go to a hospital or a police station. Do you have any privacy rights when you're out in public?

Reena Bajowala: The answer is, it depends really on the place you are entering. On one end of the spectrum, for example, if you are in a correctional facility, your right to privacy is very limited. The way that privacy laws work in the United States is sector-specific, so certain types of businesses have laws relating to the collection and disclosure of private information, but as a whole the country doesn't have a national privacy law. So if you walk into a health care facility and you are seeking treatment, you are covered by HIPAA, but walking into a retail facility, you don't have that type of protection. And businesses in this country do have the right to surveil for security purposes and other legitimate business purposes.

June Grasso: I was shocked at some of the places where the video cameras were, and some of the things that were exposed, for example, hospital staffers pinning down a patient to a bed.
June Grasso: Don't you have any privacy rights inside a hospital?

Reena Bajowala: HIPAA is one of the most protective privacy statutes that we have in the United States. And I've seen that the company that was hacked, Verkada, had a case study on one of the hospitals that they work with, saying that it's a HIPAA-compliant security system. So it's not that there are no privacy rights. The question will certainly be attacked through litigation, and I imagine regulators like the Department of Health and Human Services Office for Civil Rights will be taking a close look at this situation and at Verkada and its relationships. But hospital systems and other health facilities use vendors for a wide variety of purposes, and perhaps it's an internal risk recording to protect the health facility against legal liability. However, the understanding with the video surveillance company was that these are supposed to be secure videos. So in a situation like that, that information being exposed has created the liability.

June Grasso: Let's say that the prison sues or the school sues. What would the suit be based on?
June Grasso: I mean, does it depend on where they are and what the laws are, or does it depend on the contract?

Reena Bajowala: It's going to depend on the contract between the parties. There are going to be provisions in the contract between that correctional facility or that health care facility and Verkada that indicate the scope of liability. There are likely detailed provisions relating to those privacy and security concerns that we're talking about, so it would likely be a breach-of-contract lawsuit.

June Grasso: Well, one of the issues will be how secure the company kept the cameras and the video footage, because apparently employees had "super admin" privileges and had unrestricted access to customers' surveillance footage.

Reena Bajowala: Absolutely. A critical question here is going to be access control, and limiting access to the need-to-know basis of individuals. There's a principle called the principle of least privilege, where you manage your access controls with the default assumption that people will not have privileges unless they are required to perform their jobs. So that is going to be a critical question going forward.
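The principle of least privilege described here can be sketched in a few lines of code: every access check defaults to deny, and a user may perform an action only if an explicitly granted role carries that permission. This is an illustrative sketch only; the role names and permissions below are hypothetical and are not Verkada's actual access model.

```python
# Minimal sketch of least-privilege access control: default deny, and a user
# is allowed an action only if one of their explicitly granted roles needs it.
# Roles and permissions are hypothetical examples.

ROLE_PERMISSIONS = {
    "site_technician": {"view_live_feed"},          # needs live view to position cameras
    "compliance_auditor": {"view_archived_video"},  # needs archives for audits
    "account_admin": {"manage_users"},              # manages accounts, not footage
}

def is_allowed(user_roles, permission):
    """Default-deny check: True only if some granted role carries the permission."""
    return any(permission in ROLE_PERMISSIONS.get(role, set()) for role in user_roles)

# An admin with no explicit footage grant cannot open customer video:
print(is_allowed(["account_admin"], "view_archived_video"))     # False
print(is_allowed(["compliance_auditor"], "view_archived_video"))  # True
```

The point of the default-deny shape is that a new role, or a typo in a role name, fails closed rather than open.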
June Grasso: Now, what about the people who have been filmed and exposed? What kind of a lawsuit can they bring?

Reena Bajowala: So this is a unique situation, because in addition to the video there is, from what I understand, some sort of people-analytics service provided. I don't know what the role of that is, because that would also connect the video to a person and their history, their vehicle history, and all types of other publicly available information. But in terms of individuals, the main focus for a potential lawsuit would likely be under tort law. There are privacy torts that include things like unlawfully intruding into someone's private affairs or disclosing someone's private information. So you can see a scenario where, if somebody was in, for example, a mental health facility, and the video that was exposed showed that individual in a sensitive situation, that individual might be able to make a privacy tort claim in a lawsuit. And in terms of who they might sue, they might sue all parties involved.

June Grasso: I assume the company makes representations about the security of its cameras to potential customers.
June Grasso: How much does it matter if the company fell short in that regard? For example, suppose it was actually quite easy for the hackers to get these admin privileges.

Reena Bajowala: If they fell short of their assurances of security, that would create liability under the contract for that company. One place to look is at what Verkada's advertising statements are. I took a look at their website; they have a section on security that goes through in detail what their security measures are. Companies also have privacy policies and terms of use, where they indicate how they will protect your information. All of these statements will be of critical importance, certainly for regulators like the FTC: if they made statements and they did not live up to those statements, that creates exposure and liability, both from a regulatory perspective and in lawsuits.

June Grasso: What's the role of the FTC here?

Reena Bajowala: The FTC in particular looks at privacy policies, and at where companies have not satisfied the promises that are in a privacy policy.
The FTC has jurisdiction to investigate and bring enforcement actions.

June Grasso: What other regulatory agencies might get involved?

Reena Bajowala: The Department of Health and Human Services Office for Civil Rights has jurisdiction over health care entities, so they would be enforcing HIPAA, and HITECH, the related law, in connection with the health care realm. And then you have attorneys general. This is a California company, so we can certainly expect some investigative action from the California Attorney General.

June Grasso: This conversation, and the piecemeal regulatory approach to this hack, brings up the question: do we need a federal privacy law?

Reena Bajowala: I think everyone agrees that we need a federal privacy law, and that's people on all ends of the spectrum of privacy advocacy or protectionism. There is a way to comprehensively manage data that does not focus on sector-specific or state-by-state laws, and instances like this can help motivate legislators to move forward on a federal data privacy bill.

June Grasso: A lot of people have these wireless home cameras.
What do you have to do to make sure that this doesn't happen? Do you have any protection from someone accessing what's in the cloud?

Reena Bajowala: I think we're relying on the companies to have the security, so I would take a look and vet the companies. But unfortunately there's no guarantee; anything that is connected can be hacked. And that's really where we're going in terms of technology: everything is getting connected and interconnected, and we have the Internet of Things, so this is a real concern. To push back against that, some states are legislating to ensure that reasonable security features are in Internet of Things devices; California has SB-327, which is its IoT law. But I think the things you can do as a consumer are to make sure you understand what the access rights are to your account. So if they give you a default password, change that password. That is maybe even the simplest thing.
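The "change the default password" advice above is the kind of check a consumer (or a home-automation script) could automate. Here is a small illustrative sketch that flags connected devices still using a vendor default credential; the device list and the set of default passwords are made up for illustration.

```python
# Illustrative sketch: flag home IoT devices whose current password is a
# well-known vendor default. Device names and defaults here are hypothetical.

KNOWN_DEFAULT_PASSWORDS = {"admin", "password", "12345", "default"}

def devices_with_default_password(devices):
    """Return the names of devices still set to a known default password."""
    return [d["name"] for d in devices if d["password"] in KNOWN_DEFAULT_PASSWORDS]

home_devices = [
    {"name": "front-door-cam", "password": "admin"},       # never changed: flagged
    {"name": "baby-monitor", "password": "c0rrect-horse"}, # changed: fine
]

print(devices_with_default_password(home_devices))  # ['front-door-cam']
```

California's IoT law pushes in the same direction by requiring either a password unique to each device or a forced change on first setup.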
Reena Bajowala: You know, you can inquire about what the security measures are, but you can't expect every consumer to be an expert in security. So the safest route is to not go connected, and to not have things stored in the cloud. If you can have a simpler camera that records a certain amount of footage and then writes over it, that's a safer place to be from a security perspective. But convenience and technological innovation have trended toward winning out in those battles.

June Grasso: Thanks, Reena. That's Reena Bajowala of Ice Miller. The Biden Justice Department is using a controversial statute, enacted in 1968 at a time of civil rights demonstrations and anti-war protests, to charge more than sixty people in the Capitol riots. The law had rarely been used since the Nixon administration, until the Trump administration began using it to charge people in the Black Lives Matter protests. Joining me is former federal prosecutor Michael Zeldin. Tell us the history of the federal Anti-Riot Act.
Michael Zeldin: The history of the statute, which is sort of dubious in many respects: it was an effort, I think, to use the powers of law enforcement to arrest and disrupt those who were engaged in acts of civil disobedience. I think the point of this thing was to make sure that the police were able to ensure that those who blocked the bridges in Selma, or involved themselves in protests that led to sit-ins and closures of commercial establishments, could be charged with an offense beyond the normal civil disobedience-type statutes that already existed. So I think it has a very dubious history.

June Grasso: Does that history matter in how the statute is used by law enforcement?

Michael Zeldin: Well, it doesn't matter in a technical legal sense. That is, even if the statute's history is sort of for a bad purpose, that is, if they meant this as a statute to impose obedience to the state on those who were engaged in civil disobedience.
That doesn't make the statute legally infirm, meaning that if someone chose to use the statute in the future for some purpose other than what the legislators had as their original meaning, it doesn't mean the statute can't be used. So, to put it in other words, it may be bad in its intention, but it may be legal in its effect.

June Grasso: Now, this statute hadn't been used since the Nixon era until it was used in the Trump administration.

Michael Zeldin: Yeah, more or less. You may find an odd case here or there, but as I said, these cases involved cases of civil disorder. The statute has a definition that says we shall use this when interstate commerce is interfered with in the course of a civil disorder, which means any public disturbance involving acts of violence by assemblages of three or more people which causes an immediate danger of, or results in, damage to or injury of property or persons. So these are intended to address disturbances.
And yes, of course, the Nixon administration used them against anti-war protesters, and the Trump administration used them against what they called the Antifa civil disobedience gatherings in Portland and Seattle and Minneapolis and other such cities. Bill Barr apparently told his U.S. attorneys to pursue aggressive federal prosecution against protesters in the Black Lives Matter movement, and they used this particular statute to do that. When Jeff Sessions became Attorney General of the United States, he reversed the policy of the Obama administration, which was to use judgment in charging criminal offenses. Sessions wanted to charge the most serious charge available. Barr, when he became Attorney General, also said that he believed that the most serious charge available should be used, and in particular he singled out the protesters in Oregon and Minneapolis and in Seattle as essentially inciting an insurrection. I think that was some of the language that he used during his press statements.
And so here we find prosecutors bringing this charge, 18 United States Code Section 231, which is this charging of people with interfering with the police or firefighters who are trying to deal with a civil disturbance, and they cite that it had a negative impact on interstate commerce and that they used weapons in order to do this. And so this anti-assemblage statute, a statute that really had some sense of its origins, as I said, in targeting civil disobedience protesters, is what they've decided to use as what they think is the most appropriate charge in cases like this. I think that it's problematic from a charging-philosophy standpoint. I don't know why this would be the appropriate charge to bring when there are all sorts of more straightforward statutes, like assault on a police officer, destruction of property, destruction of federal property. There are a lot of straightforward state and federal statutes that govern the behavior that they think should be criminalized, and why they're using this one is not easily understandable to me.
June Grasso: So more than sixty of the rioters charged in the Capitol riot have been charged with this civil disobedience charge. Does it seem appropriate there?

Michael Zeldin: Well, my view of the law is that if it's appropriate for one category, it should be appropriate for another category, and if it's inappropriate for one category, it should be inappropriate for another category. So my position is that if I were a federal prosecutor, I would not bring a charge that has this, you know, sort of terrible history behind it. I wouldn't bring it in the Portland cases, and I wouldn't bring it in the Capitol riot cases. I think that there are other statutes that get at what you want to charge people with without invoking a statute that has such a horrible history to it.

June Grasso: So what other statutes could they use in the Capitol riots?

Michael Zeldin: Illegal entry, staying beyond your entitlement to stay, destruction of property, assault on a police officer. There may be a murder charge that comes out of this as well.
So the destruction-of-property and assault-style statutes are the most easily available for prosecutors to use. You know, I understand that what took place at the Capitol was unique, and maybe it requires a unique charge to be brought, and it's not that this is equivalent to what took place in Portland. What took place in Portland, I think, is very different from what took place at the Capitol. I just don't like this statute. And we talked previously about being wary of an impending war on domestic terror, and what that would mean in terms of the passage of new statutes or the application of old statutes to new conduct. This is what we talked about in an earlier conversation, and this is exactly what we're seeing. This is conduct that, I think, in many respects has a free-speech overlay to it. It may not be exactly on point, but it is surely there. And I just don't like statutes that criminalize anything that has a free-speech, right-to-assemble, right-to-protest component to it.
June Grasso: Is one of the reasons that prosecutors are using this that it's a felony, with up to five years in prison possible?

Michael Zeldin: Well, that's a great question, because assault and those types of statutes carry hefty prison sentences as well. I think their thought is, you know, psychoanalyzing people, I don't know, but I think their thought is that this is what these protesters did: they interfered with the application of police and fire authority in the context of a civil disorder. So the statute is right on point, and so they're going to charge it. I don't assert bad motives to these people, because I don't know them. I think that they do think that this statute is what is most applicable to the behavior. I just wouldn't do it myself, because, as I've said twice now, I don't like its history.
I don't like how it was intended, and I think that it may well be vague from a First Amendment enforcement standpoint, because you have to be part of a civil disorder when this is taking place, and it's really not all that clear to me how one would define that, or how one could have defined that, as it relates to one's First Amendment right to address grievances, protest, assemble, speak.

June Grasso: You know, there are several cases in Oregon where there are similar allegations to some of the cases in the Capitol, and the defense there is challenging the use of this statute. What is their legal challenge?

Michael Zeldin: As I best understand it, it is again that the application of this statute to this conduct is inappropriate: that what this person did is not a violation of Section 231, that he may be chargeable with some type of assaultive conduct with respect to spraying the bear spray, but it wasn't an effort to disrupt the police in the context of a civil disorder implicating interstate and foreign commerce. I think that they're taking a technical defense: this statute doesn't apply, and in addition, if it were to apply, it is unconstitutionally vague.

June Grasso: And what's the prosecution's response?

Michael Zeldin: The prosecution is saying that they are using this statute in a very narrow context, that it is the application of the statute to specific conduct as it relates to federal government business being conducted. There was an assault on the federal courthouse, and so they're saying that in this case, as narrowly applied in the circumstances, the statute is legitimate, it's not overly broad, and it's appropriately targeted to the type of violent conduct that the individual charged with it engaged in. And they have photos of him with pepper spray and with a knife and things of that sort. But the defendant is saying: this is an overly broad statute, it's unconstitutional, it shouldn't apply to me, and it should be struck down as an illegal statute. And the government is saying: no, as it was applied to you in this case, with respect to the conduct that you engaged in at that point in time.
It's perfectly appropriate, lawfully applied, and framed 325 00:23:08,200 --> 00:23:12,840 Speaker 1: in a constitutionally adequate way. So are federal prosecutors in 326 00:23:13,000 --> 00:23:18,359 Speaker 1: D.C. concerned about what happens, what the ruling is, 327 00:23:19,119 --> 00:23:23,800 Speaker 1: in Oregon as far as the statute is concerned? Yeah. 328 00:23:23,840 --> 00:23:28,280 Speaker 1: Of course, if this case were to go forward and 329 00:23:29,080 --> 00:23:32,000 Speaker 1: the court were to rule, for example, that the statute 330 00:23:32,240 --> 00:23:37,560 Speaker 1: was unconstitutionally vague, it was overbroad, or it was inappropriately 331 00:23:37,640 --> 00:23:40,760 Speaker 1: applied in some way, that would have a ripple effect 332 00:23:40,840 --> 00:23:46,800 Speaker 1: on the charges that were brought against the January six insurrectionists. So, 333 00:23:48,280 --> 00:23:52,639 Speaker 1: because they're used in both places, any case that goes 334 00:23:52,680 --> 00:23:58,240 Speaker 1: first can become precedent for how it's interpreted down the line. 335 00:23:58,800 --> 00:24:05,400 Speaker 1: So yes, and I'm not sure that this case will 336 00:24:05,440 --> 00:24:08,240 Speaker 1: continue to go forward, that the U.S. Attorney will 337 00:24:08,480 --> 00:24:12,240 Speaker 1: continue to prosecute under this statute, because of that 338 00:24:12,359 --> 00:24:15,240 Speaker 1: ripple effect. They may decide, as they're doing 339 00:24:15,280 --> 00:24:18,400 Speaker 1: with many other cases in Oregon, to dismiss them 340 00:24:19,600 --> 00:24:24,320 Speaker 1: and allow the state to bring the more appropriate charges 341 00:24:24,359 --> 00:24:26,920 Speaker 1: that we've been talking about: the type of assault charges, 342 00:24:27,000 --> 00:24:32,159 Speaker 1: destruction of property charges, failure to obey a police 343 00:24:32,160 --> 00:24:37,840 Speaker 1: officer type of charge.
Those federal cases are being dismissed 344 00:24:37,840 --> 00:24:42,360 Speaker 1: in favor of the more direct state cases. And we've 345 00:24:42,359 --> 00:24:44,800 Speaker 1: seen this before. We've seen this, for example, in the 346 00:24:44,920 --> 00:24:48,600 Speaker 1: Rodney King case: there were assault charges brought under the 347 00:24:48,640 --> 00:24:53,240 Speaker 1: state laws of California. When those cases didn't proceed well, 348 00:24:53,800 --> 00:24:57,560 Speaker 1: they then decided that they would bring a federal civil 349 00:24:57,640 --> 00:25:04,000 Speaker 1: rights violation charge against the officers. So normally you 350 00:25:04,240 --> 00:25:08,920 Speaker 1: start with the statutes that apply most directly, in my 351 00:25:09,160 --> 00:25:11,680 Speaker 1: opinion as a prosecutor, and in this case 352 00:25:11,720 --> 00:25:15,400 Speaker 1: those would be the state-filed charges, and then you'd 353 00:25:15,400 --> 00:25:18,920 Speaker 1: see how it evolved. If, for example, you were able 354 00:25:18,960 --> 00:25:22,159 Speaker 1: to convict the individuals that you want to convict of 355 00:25:22,760 --> 00:25:28,359 Speaker 1: assault on a police officer, destruction of government property, five years plus, 356 00:25:29,320 --> 00:25:32,719 Speaker 1: why do we then need to bring a federal charge 357 00:25:32,840 --> 00:25:39,800 Speaker 1: using this statute, with its terrible past and the possibly problematic aspects to 358 00:25:39,920 --> 00:25:43,199 Speaker 1: it, when you don't have to? So I like to 359 00:25:43,240 --> 00:25:47,600 Speaker 1: proceed from that which is most easily proved, most directly chargeable, 360 00:25:47,920 --> 00:25:51,639 Speaker 1: and leave these more obscure statutes to another day. Despite 361 00:25:51,680 --> 00:25:54,800 Speaker 1: your preference, and you're right, the U.S. Attorney 362 00:25:54,800 --> 00:25:57,840 Speaker 1: in Portland has dropped more than thirty of the ninety cases.
363 00:25:58,160 --> 00:26:01,280 Speaker 1: If they decide to go forward, is this an uphill 364 00:26:01,320 --> 00:26:03,960 Speaker 1: battle for the defense to get the charges dismissed on 365 00:26:04,000 --> 00:26:07,080 Speaker 1: this basis? I think so, yes. Because remember what we 366 00:26:07,119 --> 00:26:11,120 Speaker 1: said at the very outset of our conversation: while 367 00:26:11,200 --> 00:26:14,800 Speaker 1: the statute has a sort of bad history, and while 368 00:26:15,000 --> 00:26:18,720 Speaker 1: the proponents of it, I think, had bad political motives 369 00:26:18,720 --> 00:26:23,600 Speaker 1: in enacting a statute like this, those motives don't in 370 00:26:23,680 --> 00:26:28,760 Speaker 1: and of themselves make the statute legally infirm. And so 371 00:26:29,040 --> 00:26:34,359 Speaker 1: if the statute is legally correct, the defense citing the 372 00:26:34,640 --> 00:26:38,320 Speaker 1: history of it doesn't win. They may win in a 373 00:26:38,359 --> 00:26:41,720 Speaker 1: court of public opinion, but they don't win in a 374 00:26:41,760 --> 00:26:47,919 Speaker 1: courtroom that is determining whether the statute is constitutional or unconstitutional in its usage. 375 00:26:48,520 --> 00:26:50,920 Speaker 1: Thanks for being on the Bloomberg Law Show, Michael. That's 376 00:26:50,960 --> 00:26:54,560 Speaker 1: former federal prosecutor Michael Zeldin, and that's it for this 377 00:26:54,560 --> 00:26:57,480 Speaker 1: edition of the Bloomberg Law Show. Remember, you can always 378 00:26:57,480 --> 00:27:00,119 Speaker 1: get the latest legal news on our Bloomberg Law podcast. 379 00:27:00,320 --> 00:27:03,320 Speaker 1: You can find them on Apple Podcasts, Spotify, and at 380 00:27:03,480 --> 00:27:08,800 Speaker 1: www dot Bloomberg dot com, slash podcast, slash law. I'm 381 00:27:08,880 --> 00:27:11,880 Speaker 1: June Grosso. Thanks so much for listening.
Please tune into 382 00:27:11,880 --> 00:27:14,159 Speaker 1: the Bloomberg Law Show every weeknight at ten p.m. 383 00:27:14,240 --> 00:27:16,440 Speaker 1: Eastern, right here on Bloomberg Radio.