Host: When the Securities and Exchange Commission created the system known as EDGAR in the nineteen nineties to make key corporate filings publicly available, it was hailed as a victory for transparency that would help level the playing field for investors. Many years later, the system is very heavily used: according to the SEC, it receives more than one point seven million filings per year, and more than fifty million pages of documents are accessed on EDGAR every day. But the SEC recently revealed that it learned last year about a hack into the system that may have allowed hackers to obtain, and profit from, confidential corporate information before that information became public. Here to talk with us about this hack into the EDGAR system are Peter Henning, a professor at Wayne State University Law School, and Robert Hockett, a professor at Cornell University Law School. Peter, most of the EDGAR system is publicly available information, that's kind of the point, but there is a part of it that has some confidential information that apparently is the subject of this hack.
Explain exactly what it is that got hacked into here.

Peter Henning: Well, the security breach came through a portal that the SEC has so that newer companies, companies that recently went public, could essentially take it for a test drive and post materials on EDGAR. The requirement is that whenever a company makes a disclosure, say quarterly or annual earnings, it has to do that immediately and make it available to all investors at the same time. So it was a way for them to test it. But there are companies that will make filings, for example IPOs, and now you can make what's essentially a dark filing: you can put information in there that isn't available to the public. That might have been available to the hackers, and it would give them maybe some inside information about what was going to happen at those companies, and perhaps at others if they rummaged around through the system. You just don't know what you're going to find.

Host: Bob, the attack occurred last year. The SEC just disclosed it on Wednesday. Is that against its own advice to companies to announce cyberattacks promptly?
Robert Hockett: Well, it's hard to tell, to tell you the truth. The problem is that the SEC is faced with a dilemma. On the one hand, if it reveals information that turns out not to be really that important in the longer term, but stokes a panic, or stokes excess concern, in the short term, then it might well think better of having disclosed something too quickly. So it's not sure whether to tell anybody right away, because it doesn't want to cause more panic than might be warranted. On the other hand, if it does indeed turn out to be a significant problem, then of course the SEC looks to have egg on its face when it turns out that it knew the information even sooner. In this particular case, I think what's particularly important, or maybe worth noting, is that it didn't reveal the information until it determined that somebody might actually have used some stolen information to engage in some form of insider trading. And that's something the SEC apparently only just learned.
Host: Well, Peter, if the idea of this part of the system, the confidential part, is to encourage new companies to put things up, to get things in early and test out the system, what if this kind of hack ends up deterring that from happening? What are the likely consequences in terms of companies' ability to get their information out the right way?

Peter Henning: Well, I'm not sure it will be a deterrent. It certainly will make companies hesitant, and indeed even the SEC said, for those using this portal, be careful about the information you put here. But just like any warning label, I'm not sure how many people might have actually read it. Really, the message here is the broader one, and of course it's coming just a couple of weeks after the disclosure of the Equifax leak: really, no computer system is completely secure. We are living in an era, and this may go on
as far as the eye can see. We're living in an era in which there are going to be cyberattacks, and confidential information can get exposed. So maybe it's physician, heal thyself: the SEC has to take stronger measures here to protect what may be crucial information about companies. Otherwise, companies are going to be more careful about what they file, and may try to fuzz things a little bit to ratchet down how much they end up disclosing in their public filings.

Host: Bob, EDGAR is tracked carefully by traders who use superfast computers. How much information does EDGAR have that can actually move the market?

Robert Hockett: Well, it has a great deal of such information.
And in a way, that's sort of part of the point, right? The original impetus behind EDGAR was essentially to diminish, nearly to the vanishing point, any kind of time advantage that one trader might have relative to another when it comes to trading on information that is disclosed, once it is disclosed and has some sort of price-relevant significance for the shares of the firm that are traded. So the irony here, of course, is that EDGAR was established in order to level that playing field, but if some people are able to hack it and others are not, you might end up with the paradoxical situation wherein EDGAR ends up facilitating certain kinds of insider trading, by essentially tipping off some people much more, much sooner, than it tips off others, simply by dint of those first people's capacity to hack it.
So that makes Peter's point all the more important: in order for EDGAR even to fulfill its function, it really has to be more or less hack-proof, or at least it has to be proofed against hacking of the kind that can facilitate insider trading.

Host: We've been talking about the hack of the SEC and its implications with Peter Henning, a professor at Wayne State University Law School, and Robert Hockett, a professor at Cornell University Law School. Peter, this isn't the first time that the SEC's EDGAR system has been compromised. Now they are going to put in what's been called the Consolidated Audit Trail. Would you explain that, and whether there are going to be concerns about it in light of this new hack?

Peter Henning: The Consolidated Audit Trail has actually been a dream of the SEC's for about the last thirty to forty years. It would give them a real-time look at who is trading across all of the markets, so that they could see if there's any kind of market disruption, or if the order flow is somehow affected by an event, or perhaps even a technological glitch.
So this is what they've wanted. What that is, though, is incredibly valuable information. If I know that, say, Fidelity or Vanguard is selling out of a position or accumulating a position, I can trade ahead of that, or trade along with it before the stock price is affected, and I can make a great deal of money. So what this hack is saying is that as the SEC accumulates more and more valuable information, it's going to become even more of a target, and so it's really going to have to protect that information. And of course the firms are worried that their information could be stolen and used either against them or by someone to profit, and that's going to cost other investors money.

Host: I expect, given this hack, and we don't know that much about it yet, but given this hack, a lot of banks and other investors would be very concerned about what might happen when the Consolidated Audit Trail finally gets up and running. Can we expect that this is just going to delay that project, by measures we can't even figure out yet?
Robert Hockett: Yeah, I really don't know whether we should expect this to delay that project or not. It might do that. It might instead hasten the project of beefing up internet security, or cybersecurity, or the like. Or it might do both. A couple of other things are worth noting in this connection, it seems to me. First of all, there is the Equifax matter that Peter mentioned before. There's also another matter that we've sort of forgotten about, but it was pretty big news about a year ago, and that was when the New York Fed was fooled by hackers into making a very large money transfer on behalf, or supposedly on behalf, of the Bangladesh central bank. That was done through hacking as well. And indeed, the New York Fed discovered the problem only by accident, only through a fortuity owing to a strange name that was used by one of the parties who was hacking it.
And people have, since then, of course, been a little bit concerned about the security of the SWIFT money transfer system that central banks and other banks use as well. So in a way, the problem is quite pervasive throughout the financial system, and I'm hoping, therefore, that the takeaway from this will be that we really have to get quite serious about cybersecurity across the entirety of the financial system, and not let it delay beneficial actions that various regulators were planning to take unless absolutely necessary, but instead let it speed us up when it comes to really addressing all of the cyber vulnerabilities that appear to be pervasive out there.

Host: Peter, SEC Chairman Jay Clayton is scheduled to testify before the Senate Banking Committee next week. What kind of questions do you expect him to be getting, and will there be a grilling of sorts?
Peter Henning: There'll be a little bit of a grilling, although in a sense he gets a bit of a free pass, because the hack took place under his predecessor, Mary Jo White. Perhaps the delay in disclosing it might be an issue brought up, but really, I think he wants to use this as a way to highlight the need to enhance cybersecurity. And as Bob said, and Bob's absolutely right, we can't just treat these as isolated incidents. This is something that is going to be pervasive through the financial system, and so if you view one security patch as somehow a cure, it's at best a placebo. So I think Clayton is going to go on the offensive here, and perhaps even use this as a way to ask Congress for more money for the SEC. Don't forget, this is a political agency.

Host: Bob, we talk about the importance of cybersecurity, but are there ways to actually stop this from happening? Because it seems like everyone and every agency can be hacked.

Robert Hockett: Yeah.
So if I were a computer security expert, I could answer you more definitively, but I would probably also be a millionaire or a billionaire by now. In theory, we can do this, right? But there are so many prerequisites that have to be met. One of them that maybe is worth highlighting at the moment is that, because so much of the transacting that goes on in the financial system now takes place across borders, through multiple electronic systems, you need some kind of harmonization among multiple jurisdictions when it comes to what forms of electronic communications are going to be used, what protocols or security protocols are to be used, what specific technologies are going to be used, and so forth. And it's thus far proved difficult to get consensus even on that. You might have read, even a couple of days ago, that some of our partners in Europe and Asia are suspicious of the protocols that we're currently favoring, because they think we might be favoring them precisely because we're able to hack them.
Host: I'm going to have to stop you there. I thought all professors were millionaires. Thanks to two of them: Peter Henning, a professor at Wayne State University Law School, and Robert Hockett, a professor at Cornell University Law School.