Jonathan: Get in touch with technology with TechStuff from howstuffworks.com. Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer at HowStuffWorks and I love all things tech, usually. But today we're going to talk about a type of technology that can have either a positive or a negative effect depending upon how it's implemented. And joining me today is someone that, uh, I'm really excited could join us on the show. I'm talking about Bob Sullivan, a founding member of MSNBC, the guy behind bobsullivan.net, a.k.a. the Red Tape Chronicles. He's also a co-host on a show called The Breach, which is all about data breaches. Season one, which is phenomenal, is all about the massive data breach that happened with Yahoo, with three billion accounts exposed as a result of that data breach. And now they're gearing up: season two is starting, and you're here to talk to us about voting machines and elections, and how technology intersects with that, and how that can be somewhat terrifying in some ways. Welcome to the show, Bob.

Bob: Sure, thanks for having me.
Bob: Really important topic right now, so I'm glad we're talking about it.

Jonathan: Yeah, me too. And in the interest of full disclosure, I live in the state of Georgia here in the United States, and Georgia is a state that adopted touchscreen electronic voting machines very early, back in 2002, I think, is when we started, and it's statewide here in Georgia. And we have a lot of things that, I hesitate to use the word distinguish, but that really kind of highlight some of the issues I have with direct-recording devices like this. The biggest one being that in Georgia there's no paper trail associated with any of the voting record. So when a voter goes in and casts his or her ballot, there's no paper record where they can check that the results showing up on screen are reflected on paper, and therefore there's no real way to do a full audit afterward. And I think, best-case scenario, and Bob, I want to hear your input as well, but best-case scenario for me is that even assuming there's no hanky-panky going on, there's no interference in the election process...
It still hurts the appearance of the democratic process, because it inserts the concept of doubt: without being able to audit those results, you can't say, yes, in fact, the votes that were cast are the votes that were recorded. What's your point of view on that particular topic?

Bob: Well, that's the most important point of all. You've started at the punch line, which is fine. Vote hacking is a complicated subject. In Georgia you're very intimate with it at the moment, but all around the country, it's not just the machines; there's a whole life cycle of how your vote might be hacked or swayed or made to disappear. There are lots of different ways that votes can be manipulated. But the most fundamental thing about voting and about democracy is that when the election is over and the results are posted, the losers believe in the legitimacy of the election. And when they don't, that creates a tremendous governance problem. So faith in the institution of voting itself is fundamental to America. And faith isn't an on-off switch.
And we're seeing this right now in lots of different ways, right? But whether or not people believe in the integrity of the voting process itself will say a lot about how we're governed. And in Georgia, you mentioned paper. The irony: here we are on a tech podcast talking about the latest and greatest technology, and we're here in the country with the latest and greatest entrepreneurs and the latest and greatest inventions, and anyone you talk to in the voting-hacking space will tell you the most important thing is paper. So we're going backwards here in some ways, but it all comes down to paper, because there has to be some way to audit a vote afterwards. And if there's not, then you have a black box, and people are walking up to it and touching a screen, and you just trust that it's going to work. And anybody who works in the technology world, well, we were just having this conversation before we started this podcast: tech often breaks. I mean, in the simplest example, how many times have you walked up to an ATM and tried to get forty dollars out of the machine and you accidentally hit the sixty button? Happens all the time. That can be a calibration thing; it can even be an angle thing. So that's just one of, you know, fifty ways that things can go wrong just on the one kind of voting machine that we started talking about. So this is a big, rich topic. But legitimacy is the most important thing.

Jonathan: And it's interesting that you bring up the ATM example, because obviously there was the, well, it became somewhat infamous in tech circles. Diebold, the company that makes a lot of ATMs, was for a long time in the business of making these voting machines for various places, and in 2004 it was a huge story. The CEO of Diebold, Walden O'Dell, had written a letter to various wealthy acquaintances he had, in a fundraising effort for the Republican Party, and said that he was looking forward to delivering the electoral votes to get the president back into office.
And a lot of people said, well, whether he intended it to be "we're going to make sure this happens" versus "I want to, you know, support this candidate I believe in" is kind of beside the point, because again, it brought that doubt into the democratic process. If people point to him and say, but you make the machines that count the votes, and you say you plan on delivering these votes to a specific candidate, that kind of has a very dangerous implication in it. And as you say, you already have this hurdle that you have to overcome when you're designing any kind of system for voting: to make sure that the process is as straightforward and as difficult to mess up on the user's side as possible. The whole keep-it-simple-stupid kind of approach really would be a huge benefit when you're designing any sort of voting system, to give confidence to the voter that whatever choices they want to make are actually reflected in the votes that they're casting. And, as you say, with ATMs it's easy enough to make a mistake.
Those same companies are designing some of these electronic voting systems. And anyone who's worked with engineers for a long time understands that engineers are very good at building systems that make sense to engineers. Sometimes they build systems that are not so transparent to the end user.

Bob: Sure. And the other thing that some engineers do is they like to keep their code to themselves. So another problem with Diebold and other voting machine companies has been, okay, so you have this "who knows what he meant, I'm going to deliver the votes." You can hear that in an innocent way; you can hear that in a suspicious way. But then, in this computer security world, what often happens is, if you want to prove that you're a very secure piece of software, you offer it up to the open-source community and let it be reviewed, and you submit yourself to vulnerability testing.
You even offer vulnerability bounties. And the voting machine industry in general has been very late to this entire process of inviting hackers in and letting the hackers help them secure their machines. For the most part, the entire voting complex has been very defensive about this process, and that's another reason for another layer of suspicion. So there's that black box behind what's going on in these voting machines. Until fairly recently, it was for the most part illegal to tinker with voting machine software because of digital rights management rules. The Library of Congress recently changed the rules, and that's why we've been seeing and hearing so much more about machine hacking: they've temporarily created this research window whereby hackers can actually buy machines, if they can find them on eBay or whatnot, and then try to hack them. And that's why you might recall that in the summer, at the big Las Vegas hacker convention DEF CON, they had something called the Voting Village, and last year was the first time.
This summer was the second time they did it, all because of this Library of Congress ruling. They bought a bunch of machines, they invited a bunch of hackers into the room, and they said, go to town. And of course the results were probably what you're guessing: lots of these machines are hackable. I want to take one step back, though. You mentioned that your machines are from 2002, and that's not an accident; that's on purpose. All over the country are these fifteen-year-old voting machines that are essentially the byproduct of something called the Help America Vote Act, and that was the result of the Bush v. Gore controversy in the year 2000. Remember the hanging chads and people staring at screens all the time? And America at that point said, oh, we have to update our voting technology. Why are we using these nineteen-seventies-era machines with these big levers and these punch cards that, you know, are like from an IBM basement computer in the fifties? So the federal government made a bunch of money available for localities, for states and counties, to buy new voting machines, and so there was a gold rush. Everybody went to buy what was then the latest technology, which looked something like an ATM. But the problem is, and I think this is the point I'd really like to stress more than any other: the federal government can't tell jurisdictions how to collect votes. And there are something like ten thousand different entities all across America and overseas that can collect votes in America. Ten thousand. And all of them decide on their own what kind of machines, what kind of procedures they're going to use. So anybody who's ever moved and gone to vote realizes, wait, this process is totally different from the last process that I used. Now, think about how hard it is for a billion-dollar company to secure its systems. And now I want you to think about ten thousand voting jurisdictions, all of whom need to have some cybersecurity expert connected to their process.
There aren't ten thousand qualified people in the world, period, to do that, let alone that these jurisdictions have the money to pay for it. So it's a very, very tricky problem, and we're only beginning to scratch the surface of how we fix it.

Jonathan: Yes, and in the United States it is incredibly complicated. I mean, in Georgia, for the longest time, the responsibility for getting the infrastructure in place to even cast votes was left up to the county level. It was only later that it became a state-level thing, which was largely to install these electronic voting machines. And, again, not to harangue my home state, but we've also had a recent controversy about the central server for counting up the votes, and whether or not it was the target of Russian hackers a few years ago. And the story unfolded that the main server being tested, over at Kennesaw State University, was part of the overall system in charge of tabulating all these votes, run by the Center for Election Systems here in Georgia.
And there was a lawsuit that was brought to get more access to this technology, to understand what potential vulnerabilities there might be and whether or not any hackers had actually managed to penetrate it. And then the word got out that the server had been wiped, and then, a month later, that the backup servers had been wiped. Which, again, even if nothing bad happened, casts doubt on legitimacy. And it is, some could argue, a method of voter suppression: it ends up demoralizing the voting base. And it is incredibly complicated. This is a psychology thing, not just a technology thing. And it does not help that we know that Russian hackers were targeting various election systems, or at least probing election systems, throughout the United States in, you know, 2014, 2015, through 2016.

Bob: Right, right. I think this is a really important point, and we're racing out a mini version of the Breach podcast to deal with this. And I actually would just like to stress one thing.
Of course, when you think about vote hacking, you think about somebody switching a vote in a machine: I voted for Bush, the machine thinks Gore, and so on. But vote hacking, air quotes "hacking," actually has a much, much wider life cycle than that. And it begins all the way back, you already suggested it, with voter suppression, with you deciding whether you're going to vote or not and who you're going to vote for. So that process can be hacked. And again, we have evidence of this: outside entities can create false news stories that say the Pope has endorsed Donald Trump, for example, or one of a thousand other fake news stories, and utilize this propaganda tool to do one of two things: either confuse the electorate or depress the electorate so people don't bother to vote or don't bother to register. That's only one step. The next step is you actually register. And how does that happen? How does the registration process take place, and is the information lost or somehow manipulated?
When you register, online or on paper, that registration stays in a computer somewhere, and as long as it's in a computer, something bad can happen to it, right? And magically, the day you show up to vote, there has to be a way that the person sitting across the table from you can verify that you're entitled to vote where you are. That used to be done with paper. A lot of these states now use something called e-poll books. They're usually iPads, and on the fly those poll books are downloading voter registration data to make sure that Bob Sullivan really lives in this voting precinct and belongs there. The information gets there sometimes wirelessly, sometimes over an old network, but that's another opportunity where the information could be altered. And, from the file of voter suppression, imagine if they either erase it or just move it, move you to another district. So you show up to vote, and this happens all the time without hacking, you show up to vote and the person says, no, no, no, you're not registered in this district.
You have to vote over there. And maybe "over there" is an hour away, and maybe your lunch hour is gone, and now you're not voting. So a lot of times we think about vote hacking as somebody switching a million votes for some candidate, but it can happen in these smaller ways too. And then, after you vote on the machine, the machine has to tabulate. So maybe it registers the vote correctly, but the tabulation is done wrong. After that, the local machine tabulates and all the precinct votes are added up using some process. And then, and I think this is the trickiest part, all those results have to be transmitted eventually to the Secretary of State's office. Sometimes it's a sneakernet: somebody puts a memory card in their pocket and drives it across the state. More often than not now, it's transmitted wirelessly over a mobile network or whatnot. And those are opportunities for hacking. And then, just to put one more thought in your head, because this is something that has happened: the Russian government has done this in other nations.
Let's say everything goes smoothly, everything is on the up and up, no hacking anywhere. But the moment the results come out, the Secretary of State's website is hacked. So for just an instant, the results are the wrong results, and then they're switched back. The Secretary of State figures out, ah, that's a mistake. But now all the people who lost are going to have screenshots of this hacked page that says they won. And once again, you can create and inflame doubt and uncertainty and illegitimacy in an election, not by doing any of this other sci-fi stuff we're talking about, but simply with one website hack. So protecting the vote requires you to protect every single one of these stages, and it's a mammoth task.

Jonathan: Guys, Mr. Sullivan and I continued our conversation about data security and elections, but before we get into that, let's take a quick break to thank our sponsor.
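The transmission step Bob describes, precinct results moving to the Secretary of State's office, is a classic place to add an integrity check: if each precinct attaches a cryptographic tag to its tally, the receiving office can detect any alteration in transit. A minimal sketch in Python using an HMAC with a shared key; the precinct data and the key here are hypothetical, and real election systems would need asymmetric signatures and careful key management rather than a shared secret:

```python
import hashlib
import hmac
import json

def sign_tally(tally: dict, key: bytes) -> str:
    """Serialize the tally deterministically and compute an HMAC-SHA256 tag."""
    payload = json.dumps(tally, sort_keys=True).encode()
    return hmac.new(key, payload, hashlib.sha256).hexdigest()

def verify_tally(tally: dict, tag: str, key: bytes) -> bool:
    """Recompute the tag at the receiving end; any altered count fails the check."""
    return hmac.compare_digest(sign_tally(tally, key), tag)

# Hypothetical precinct result and a key provisioned out of band.
key = b"key-provisioned-offline"
precinct = {"precinct": "GA-0121", "candidate_a": 412, "candidate_b": 389}
tag = sign_tally(precinct, key)

assert verify_tally(precinct, tag, key)         # untampered tally verifies
tampered = dict(precinct, candidate_a=512)      # 100 votes flipped in transit
assert not verify_tally(tampered, tag, key)     # the alteration is detected
```

The same idea, implemented with digital signatures, is one reason auditors focus as much on how results move between machines as on the machines themselves.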
Jonathan: I'm so glad that I have you on here too, because it also reinforces this idea that I have raised in previous episodes when it comes to talking about hacking. We all have the image in our heads of the Hollywood version of hacking, where either the super-too-cool-for-you person or the incredibly nerdy person sits down at a computer terminal, types away, gets through the password on the third attempt, and then you see a bunch of meaningless characters on the screen and they say "whoa," or something along those lines. Which is not at all, not at all, close to reality. One of the biggest tools in any hacker's toolbox, and it's one that you've mentioned, is social engineering: this idea of manipulating people to get what you want, which doesn't even necessarily involve having contact with a device. It is an age-old trick of hackers, because it goes all the way back to confidence men. I mean, this is something that has been around for ages. If you know how people think, and you know what people react to, and you leverage that, you can get what you want.
And it may only involve very light technology, in the sense of hacking that one website; that's enough to really put things into turmoil. It doesn't require having physical access to these voting machines. I often see people pointing out that a lot of these machines would require someone to get physical access, and that the likelihood of that is very low. And again, all of that may very well be true, but ultimately, if the perception that it's possible these machines have been tampered with is enough to cause problems, it's kind of a moot point. One of the first articles I ever worked on when I was hired by HowStuffWorks was about electronic voting machines, and there was a section they wanted me to do about the concept of casting votes over the internet, because the idea of accessibility, widespread accessibility, maybe improving the number of people who actually participate in the voting process, things like that are very important, and you can't just dismiss them.
However, I'm curious 335 00:19:37,600 --> 00:19:41,480 Speaker 1: what you think. Is there a way, 336 00:19:41,600 --> 00:19:44,399 Speaker 1: do you think, of implementing such a system that would 337 00:19:44,480 --> 00:19:48,080 Speaker 1: be secure, or not just secure, but appear 338 00:19:48,160 --> 00:19:50,320 Speaker 1: secure enough that people would have faith in it? Or 339 00:19:50,359 --> 00:19:53,000 Speaker 1: do you think that that's a non-starter? I'm gonna 340 00:19:53,040 --> 00:19:57,080 Speaker 1: give you two answers that are a bit contradictory. Um, 341 00:19:57,280 --> 00:20:01,879 Speaker 1: I'm one who believes that the integrity of the 342 00:20:02,240 --> 00:20:06,680 Speaker 1: voting process vis-à-vis hacking is incredibly important. Now 343 00:20:06,720 --> 00:20:09,439 Speaker 1: my cybersecurity friends are going to take out their 344 00:20:09,440 --> 00:20:12,000 Speaker 1: bows and arrows on me right now, but that's probably 345 00:20:12,119 --> 00:20:13,959 Speaker 1: third or fourth on the list of things that are 346 00:20:14,000 --> 00:20:17,600 Speaker 1: wrong with American elections. You know, things like gerrymandering and 347 00:20:17,640 --> 00:20:20,240 Speaker 1: just voter disinterest are even bigger.
In fact, one of 348 00:20:20,280 --> 00:20:22,200 Speaker 1: the conclusions, not to give away the punch line to 349 00:20:22,280 --> 00:20:25,200 Speaker 1: my podcast, but my conclusion at the end of it 350 00:20:25,240 --> 00:20:29,680 Speaker 1: is that the thing that makes America's voting system the most vulnerable, 351 00:20:29,760 --> 00:20:32,720 Speaker 1: like our biggest vulnerability in our voting process, is the 352 00:20:32,800 --> 00:20:35,640 Speaker 1: disinterest of our voters, where, you know, half the people 353 00:20:35,640 --> 00:20:38,119 Speaker 1: don't vote even in presidential campaigns, and that creates this 354 00:20:38,240 --> 00:20:41,640 Speaker 1: massive opening that makes it easy for a hacker who, again, 355 00:20:41,680 --> 00:20:44,080 Speaker 1: wouldn't have to change a million votes, who could 356 00:20:44,119 --> 00:20:47,640 Speaker 1: just change thirty votes in Wisconsin and Pennsylvania and Ohio 357 00:20:48,119 --> 00:20:50,480 Speaker 1: and tip an election. Right. So it's on us 358 00:20:50,520 --> 00:20:53,239 Speaker 1: to participate. And as a result of that, you know, 359 00:20:54,040 --> 00:20:57,560 Speaker 1: I am interested in internet voting somewhere 360 00:20:57,600 --> 00:21:01,640 Speaker 1: along the line, because it would increase participation. Um, 361 00:21:01,680 --> 00:21:06,320 Speaker 1: all of the cybersecurity people who respected me thirty seconds 362 00:21:06,359 --> 00:21:09,239 Speaker 1: ago don't respect me anymore. All of them, in one 363 00:21:09,320 --> 00:21:12,080 Speaker 1: voice, would say to you: you know, what we need is 364 00:21:12,080 --> 00:21:14,760 Speaker 1: paper, and we never need internet voting. They're 365 00:21:14,800 --> 00:21:17,040 Speaker 1: all terrified of internet voting and just really don't think 366 00:21:17,040 --> 00:21:19,240 Speaker 1: that there's a way to make it secure.
And they 367 00:21:19,280 --> 00:21:22,359 Speaker 1: do know, at least from the security standpoint, even 368 00:21:22,359 --> 00:21:25,120 Speaker 1: better than I do. Like, all the reporting says we're 369 00:21:25,160 --> 00:21:29,600 Speaker 1: nowhere near that. There are very limited examples 370 00:21:29,640 --> 00:21:32,720 Speaker 1: even in the US of people voting over the internet. 371 00:21:33,040 --> 00:21:36,480 Speaker 1: Soldiers at sea or overseas sometimes can do it, um, but 372 00:21:36,560 --> 00:21:40,680 Speaker 1: there's a critical difference there, in that they surrender their 373 00:21:40,800 --> 00:21:45,280 Speaker 1: right to anonymity. Um, and just to, uh, throw another 374 00:21:45,320 --> 00:21:47,879 Speaker 1: concept to you and your listeners: one of the reasons 375 00:21:47,880 --> 00:21:50,679 Speaker 1: this is also difficult is because we have to secure 376 00:21:50,720 --> 00:21:53,280 Speaker 1: these votes but at the same time preserve the anonymity 377 00:21:53,320 --> 00:21:55,440 Speaker 1: of the voter. The secret ballot box is an important 378 00:21:55,440 --> 00:21:58,200 Speaker 1: part of our process and an important part of democracy, 379 00:21:58,520 --> 00:22:00,440 Speaker 1: and as a result of the fact that the vote is 380 00:22:00,440 --> 00:22:03,639 Speaker 1: a secret, it's very hard to authenticate the vote and 381 00:22:03,720 --> 00:22:08,400 Speaker 1: not identify the voter. So the exception that's 382 00:22:08,440 --> 00:22:11,400 Speaker 1: made for overseas voting over the internet is that the soldier 383 00:22:11,440 --> 00:22:15,280 Speaker 1: will actually allow for a verification of the vote that's 384 00:22:15,320 --> 00:22:18,439 Speaker 1: outside of the system: someone will call or otherwise contact 385 00:22:18,440 --> 00:22:21,080 Speaker 1: the voter and verify the vote itself.
So there's a 386 00:22:21,119 --> 00:22:25,640 Speaker 1: process for that. But in general, um, the fact that 387 00:22:26,680 --> 00:22:30,159 Speaker 1: there is no real positive way to identify a person 388 00:22:30,400 --> 00:22:32,800 Speaker 1: on the internet (we see this every day with spam 389 00:22:32,800 --> 00:22:35,920 Speaker 1: mail and everything else), the mix of that lack of 390 00:22:36,200 --> 00:22:40,160 Speaker 1: real authentication with voting is just a toxic combination 391 00:22:40,200 --> 00:22:42,760 Speaker 1: that seems like a really hard problem to solve. Well, 392 00:22:42,800 --> 00:22:46,760 Speaker 1: and again, that has become a central point 393 00:22:46,880 --> 00:22:50,679 Speaker 1: in things like voter suppression, where that ends up 394 00:22:50,720 --> 00:22:55,920 Speaker 1: being the defense of those who would use more 395 00:22:55,960 --> 00:23:00,800 Speaker 1: and more restrictive means of verifying a voter's identity 396 00:23:00,840 --> 00:23:04,480 Speaker 1: before allowing them to vote, because of the perceived fear 397 00:23:04,840 --> 00:23:10,119 Speaker 1: of people misrepresenting themselves and casting what would amount to 398 00:23:10,160 --> 00:23:15,120 Speaker 1: a false vote. Uh, that's a very popular narrative 399 00:23:15,240 --> 00:23:20,520 Speaker 1: I've seen that doesn't seem to have that much evidence 400 00:23:20,600 --> 00:23:23,760 Speaker 1: to support the fear, right.
It's one of those 401 00:23:23,760 --> 00:23:25,399 Speaker 1: things where people say, well, you know, we have to 402 00:23:25,440 --> 00:23:27,840 Speaker 1: have these systems in place, otherwise we're going to have 403 00:23:27,880 --> 00:23:33,280 Speaker 1: fraudulent votes. And apart from some small stories that you'll 404 00:23:33,320 --> 00:23:37,800 Speaker 1: hear that end up being exaggerated a great deal, you know, 405 00:23:37,840 --> 00:23:40,359 Speaker 1: like the dead people in Chicago casting votes for the mayor 406 00:23:40,560 --> 00:23:43,600 Speaker 1: kind of thing, you don't really see it. But it 407 00:23:43,680 --> 00:23:47,680 Speaker 1: ends up becoming the central argument for putting systems into 408 00:23:47,680 --> 00:23:54,040 Speaker 1: place that effectively tell people, uh, that their vote isn't welcome. 409 00:23:54,240 --> 00:23:57,720 Speaker 1: And I certainly have a lot of empathy for people 410 00:23:57,760 --> 00:24:02,200 Speaker 1: for whom maybe it's not that they're disinterested in voting necessarily, 411 00:24:02,320 --> 00:24:05,480 Speaker 1: but more that they've been actively discouraged from participating 412 00:24:05,520 --> 00:24:09,040 Speaker 1: in the process. And you couple that with anyone who 413 00:24:09,080 --> 00:24:12,400 Speaker 1: has this sensation that perhaps their vote wouldn't even count 414 00:24:12,440 --> 00:24:14,840 Speaker 1: in the first place, and you can definitely see why there 415 00:24:14,840 --> 00:24:18,159 Speaker 1: are some real problems. Um, I've seen a lot of 416 00:24:18,160 --> 00:24:22,600 Speaker 1: people argue for things like mandated voter registration, or maybe 417 00:24:22,600 --> 00:24:27,000 Speaker 1: going in Australia's direction and having mandated voting as well.
418 00:24:27,560 --> 00:24:31,840 Speaker 1: I've also seen people who don't necessarily have a 419 00:24:31,840 --> 00:24:34,040 Speaker 1: solution in mind, but they say we need to make 420 00:24:34,119 --> 00:24:37,600 Speaker 1: voting in elections as easy as it is to vote 421 00:24:37,640 --> 00:24:40,840 Speaker 1: for American Idol. That's like a common meme on 422 00:24:40,880 --> 00:24:43,560 Speaker 1: the internet as well. And I don't necessarily 423 00:24:43,600 --> 00:24:47,520 Speaker 1: disagree with any of that. I think encouraging participation is 424 00:24:47,560 --> 00:24:50,760 Speaker 1: really key. As you point out, the more 425 00:24:50,760 --> 00:24:53,160 Speaker 1: people who participate, the harder it would 426 00:24:53,200 --> 00:24:57,320 Speaker 1: be to really skew the outcome. For one thing, 427 00:24:57,600 --> 00:25:00,440 Speaker 1: it would be much more noticeable if anyone were 428 00:25:00,480 --> 00:25:02,560 Speaker 1: to try and skew an outcome where you've got 429 00:25:02,840 --> 00:25:07,159 Speaker 1: very large participation, unless things are just, you know, super 430 00:25:07,200 --> 00:25:09,399 Speaker 1: close all the way down the line among the entire 431 00:25:09,720 --> 00:25:15,080 Speaker 1: population of whatever region it is that's holding the election. Uh, 432 00:25:15,119 --> 00:25:21,560 Speaker 1: what do you think is a solution, or perhaps the 433 00:25:21,600 --> 00:25:27,520 Speaker 1: best way to move forward, given all these different problems, 434 00:25:27,560 --> 00:25:31,320 Speaker 1: not just the technological ones but the psychological ones? Sure. Well, 435 00:25:31,520 --> 00:25:36,040 Speaker 1: the solution a lot of folks agree on is actually 436 00:25:36,080 --> 00:25:40,360 Speaker 1: not in more technology. Um.
I love technology, I've been writing 437 00:25:40,359 --> 00:25:42,520 Speaker 1: about technology for twenty years, but I think all of 438 00:25:42,600 --> 00:25:44,960 Speaker 1: us have a really bad habit of thinking, whenever there's 439 00:25:44,960 --> 00:25:47,439 Speaker 1: a problem, that more technology is going to take care 440 00:25:47,480 --> 00:25:49,880 Speaker 1: of it, that it's a silver bullet somehow. Like, a better 441 00:25:49,960 --> 00:25:52,440 Speaker 1: voting machine is not going to take care of this. Um, 442 00:25:52,440 --> 00:25:55,159 Speaker 1: there's a bunch of solutions, but the most important solution 443 00:25:55,320 --> 00:25:58,880 Speaker 1: is the auditing of elections themselves. So, um, there has 444 00:25:58,920 --> 00:26:01,840 Speaker 1: to be some sort of process that's automatic after 445 00:26:01,880 --> 00:26:05,080 Speaker 1: every election where we figure out whether or not the 446 00:26:05,160 --> 00:26:07,359 Speaker 1: votes that were cast are actually the votes that were counted 447 00:26:07,400 --> 00:26:09,720 Speaker 1: and were presented to us later. And now I'm going 448 00:26:09,760 --> 00:26:12,360 Speaker 1: to tell you why that hasn't happened yet, even though 449 00:26:12,400 --> 00:26:15,200 Speaker 1: pretty much everybody agrees on the solution. And I'm going 450 00:26:15,240 --> 00:26:17,639 Speaker 1: to draw on my first five years, when I 451 00:26:17,760 --> 00:26:20,639 Speaker 1: was a reporter at small-town newspapers in New Jersey, 452 00:26:20,720 --> 00:26:23,479 Speaker 1: covering school boards and planning boards and the city council 453 00:26:23,520 --> 00:26:26,760 Speaker 1: and mayor and all that, and elections, of course, local elections.
454 00:26:27,240 --> 00:26:30,040 Speaker 1: And when you're running for mayor of Pompton Plains, you know, 455 00:26:30,080 --> 00:26:33,159 Speaker 1: the vote is five hundred to four every time, and 456 00:26:33,200 --> 00:26:36,359 Speaker 1: the loser always demands a recount. What does the winner 457 00:26:36,400 --> 00:26:40,080 Speaker 1: say when the loser demands a recount? This has happened 458 00:26:40,119 --> 00:26:42,600 Speaker 1: over and over in American history. Winners never want a recount. 459 00:26:42,680 --> 00:26:44,879 Speaker 1: Only bad things can happen if you're a winner and 460 00:26:44,880 --> 00:26:49,040 Speaker 1: there's a recount. So: the people have spoken, that's it. Yeah. 461 00:26:49,160 --> 00:26:51,600 Speaker 1: And because, you know what, there's a chance 462 00:26:51,640 --> 00:26:53,920 Speaker 1: that you might lose the recount, so stop the recount. 463 00:26:54,520 --> 00:26:57,600 Speaker 1: Um, and audits sound a lot like recounts to people. 464 00:26:57,960 --> 00:27:02,880 Speaker 1: So in general, here's what happens. Some election, however legitimate 465 00:27:03,000 --> 00:27:05,000 Speaker 1: or not legitimate it is (again, it's not like 466 00:27:05,000 --> 00:27:06,560 Speaker 1: a yes or no, it's more of a scale of 467 00:27:06,560 --> 00:27:09,399 Speaker 1: one to ten thing), some election occurs, some group of 468 00:27:09,440 --> 00:27:12,159 Speaker 1: people win, and that group of people has no interest 469 00:27:12,200 --> 00:27:15,320 Speaker 1: whatsoever in changing whatever it was before that had them win. 470 00:27:16,000 --> 00:27:18,239 Speaker 1: So if you were to implement a new process... They 471 00:27:18,600 --> 00:27:20,760 Speaker 1: do these sample audits. They're called risk 472 00:27:20,880 --> 00:27:23,280 Speaker 1: limiting audits, and this is generally agreed to be the 473 00:27:23,320 --> 00:27:25,440 Speaker 1: best way to approach this.
You take a sample of 474 00:27:25,480 --> 00:27:28,400 Speaker 1: the ballots that's very carefully picked, and you make sure 475 00:27:28,440 --> 00:27:31,439 Speaker 1: that the results match the published results, and that gives 476 00:27:31,880 --> 00:27:34,600 Speaker 1: you a way, with a confidence level, to see if 477 00:27:34,600 --> 00:27:37,400 Speaker 1: the election was manipulated or not. The winners never want 478 00:27:37,400 --> 00:27:39,760 Speaker 1: to do this because, again, if you're in office now, 479 00:27:39,760 --> 00:27:41,400 Speaker 1: why would you want to do something that might raise 480 00:27:41,400 --> 00:27:44,200 Speaker 1: the possibility of somebody saying, oh no, you didn't win? 481 00:27:44,840 --> 00:27:49,000 Speaker 1: So it's very, very hard to get elected officials 482 00:27:49,040 --> 00:27:51,600 Speaker 1: to spend the money and the focus on this problem. 483 00:27:51,640 --> 00:27:53,520 Speaker 1: And then the other problem is attention span. We care 484 00:27:53,560 --> 00:27:57,520 Speaker 1: about this because it's two weeks before an election. In December 485 00:27:57,520 --> 00:28:00,160 Speaker 1: we'll stop talking about election security until two years from 486 00:28:00,160 --> 00:28:01,960 Speaker 1: now, when we talk about it again. This is a 487 00:28:02,000 --> 00:28:04,320 Speaker 1: cycle that we've been through over and over again. I'm 488 00:28:04,400 --> 00:28:07,159 Speaker 1: hoping, because of what happened in twenty sixteen and all the Russian 489 00:28:07,160 --> 00:28:09,000 Speaker 1: news that's in the air, there might be a little 490 00:28:09,040 --> 00:28:11,160 Speaker 1: more staying power this time around, but it's a very 491 00:28:11,200 --> 00:28:15,840 Speaker 1: hard problem to attack.
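The sampling idea Bob describes can be sketched in a few lines. This is only a toy illustration, not any real audit tool: the function name, the fixed seed, and the simple majority comparison are mine, and an actual risk-limiting audit replaces that comparison with a formal statistical test, escalating the sample until a chosen risk limit is met.

```python
import random

def sample_audit(paper_ballots, reported_winner, sample_size, seed=2018):
    """Toy sketch of the sampling step of a risk-limiting audit.

    Draw a random sample of the paper ballots, tally it, and check
    whether the sample agrees with the published winner. A real RLA
    uses a statistical test and keeps sampling until a chosen risk
    limit is satisfied; this only shows the basic mechanism.
    """
    rng = random.Random(seed)  # fixed seed so the audit is reproducible
    sample = rng.sample(paper_ballots, sample_size)
    tally = {}
    for vote in sample:
        tally[vote] = tally.get(vote, 0) + 1
    sample_winner = max(tally, key=tally.get)
    return sample_winner, sample_winner == reported_winner
```

In a lopsided precinct, say 990 paper ballots for candidate A and 10 for B, even a 50-ballot sample cannot disagree with a reported win for A; close races are exactly where a real audit must examine far more ballots before reaching its confidence level.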
It's certainly, again in my home state, 492 00:28:16,040 --> 00:28:19,679 Speaker 1: front and center, in no small part because 493 00:28:19,720 --> 00:28:24,320 Speaker 1: we have a governor's race this election in which one 494 00:28:24,480 --> 00:28:28,000 Speaker 1: of the two candidates is the current Secretary of State, 495 00:28:28,680 --> 00:28:34,000 Speaker 1: which has raised all sorts of interesting questions about processes 496 00:28:34,000 --> 00:28:40,360 Speaker 1: and technologies and motivations for not changing systems even after 497 00:28:41,440 --> 00:28:45,400 Speaker 1: these various controversies have happened. Uh, it almost always comes 498 00:28:45,400 --> 00:28:49,600 Speaker 1: down to cost, which is, I mean, a legitimate factor. 499 00:28:49,760 --> 00:28:52,640 Speaker 1: You can't just wave your hand at that, 500 00:28:52,760 --> 00:28:56,560 Speaker 1: but it also becomes a very convenient one. Uh, and 501 00:28:56,640 --> 00:29:00,600 Speaker 1: so again it ends up having that effect of potentially 502 00:29:00,600 --> 00:29:04,640 Speaker 1: demoralizing people who otherwise would come out and perhaps support 503 00:29:04,640 --> 00:29:07,600 Speaker 1: an opponent. But it's very much in front of us 504 00:29:07,640 --> 00:29:11,240 Speaker 1: here in Georgia. But obviously there are other areas across 505 00:29:11,280 --> 00:29:15,720 Speaker 1: the United States that have maybe not a similar situation, 506 00:29:15,760 --> 00:29:21,280 Speaker 1: but just as passionate a discussion about these processes as 507 00:29:21,320 --> 00:29:25,520 Speaker 1: in my home state. Um, I really appreciate you coming 508 00:29:25,560 --> 00:29:28,320 Speaker 1: on my show and talking about this. I'm very much 509 00:29:28,360 --> 00:29:32,600 Speaker 1: looking forward to your episode of The Breach. Season one 510 00:29:33,040 --> 00:29:38,200 Speaker 1: was phenomenal, going over in great detail the Yahoo hack.
511 00:29:38,640 --> 00:29:42,080 Speaker 1: If you guys out there haven't listened to The Breach, 512 00:29:42,480 --> 00:29:45,160 Speaker 1: please go and do that, because you're going to learn 513 00:29:45,680 --> 00:29:50,880 Speaker 1: a lot. Uh, the episodes are informative and they go 514 00:29:50,920 --> 00:29:56,520 Speaker 1: into great detail, and it really stresses how incredibly complicated 515 00:29:56,600 --> 00:30:01,480 Speaker 1: and important and potentially scary information security really is. 516 00:30:02,080 --> 00:30:05,160 Speaker 1: And, uh, it's nice to have a journalist on to 517 00:30:05,240 --> 00:30:07,960 Speaker 1: talk about this, someone who investigates this sort of stuff. 518 00:30:08,200 --> 00:30:11,560 Speaker 1: I've had hackers on before. They have a very interesting 519 00:30:12,200 --> 00:30:19,280 Speaker 1: philosophy and attitude that I sometimes cannot resolve. Fair enough. Yeah, 520 00:30:19,480 --> 00:30:20,720 Speaker 1: I feel like I would like to, if I could, 521 00:30:20,720 --> 00:30:23,600 Speaker 1: just throw one quick thing out to you, though. All 522 00:30:23,840 --> 00:30:27,240 Speaker 1: the election security people stressed this to me: however pessimistic 523 00:30:27,240 --> 00:30:29,440 Speaker 1: they are about the state of affairs today, they all 524 00:30:29,440 --> 00:30:31,560 Speaker 1: are very convinced it can be fixed, and that there 525 00:30:31,680 --> 00:30:35,080 Speaker 1: is a solution, that with enough energy and will 526 00:30:35,600 --> 00:30:37,959 Speaker 1: we can overcome and we can straighten out our voting process.
527 00:30:38,040 --> 00:30:41,560 Speaker 1: Voting is a very, very limited, specific task, as opposed 528 00:30:41,600 --> 00:30:44,680 Speaker 1: to, say, something like banking, and it can be secured. 529 00:30:44,720 --> 00:30:47,200 Speaker 1: And so one of the messages I don't want to 530 00:30:47,240 --> 00:30:49,440 Speaker 1: leave people with is "the vote can be hacked, so 531 00:30:49,440 --> 00:30:52,880 Speaker 1: don't bother voting." Quite the opposite. Right, right. 532 00:30:52,920 --> 00:30:55,720 Speaker 1: This is something we can fix, and it's 533 00:30:55,880 --> 00:31:01,680 Speaker 1: very important that we do it, and ultimately, 534 00:31:01,720 --> 00:31:04,680 Speaker 1: all it really takes is the willpower and the support 535 00:31:04,760 --> 00:31:07,560 Speaker 1: for it to go forward. And I certainly think 536 00:31:07,600 --> 00:31:10,320 Speaker 1: that we're getting into a climate where that's going 537 00:31:10,360 --> 00:31:14,240 Speaker 1: to become more and more of an issue, where there's 538 00:31:14,240 --> 00:31:16,440 Speaker 1: just gonna be a demand for that. I mean, 539 00:31:16,600 --> 00:31:21,200 Speaker 1: it cannot stand to continue, um, as long as there 540 00:31:21,320 --> 00:31:24,840 Speaker 1: is this doubt in the process. It's so fundamental 541 00:31:25,320 --> 00:31:27,560 Speaker 1: to who we are as a nation that, 542 00:31:27,640 --> 00:31:32,000 Speaker 1: uh, it has to be addressed on a 543 00:31:32,120 --> 00:31:35,920 Speaker 1: very real level and not just given lip service.
I 544 00:31:36,200 --> 00:31:40,720 Speaker 1: actually am fairly optimistic about this, and, um, I think 545 00:31:40,760 --> 00:31:43,400 Speaker 1: that maybe in the short term it may not be 546 00:31:44,440 --> 00:31:47,400 Speaker 1: the smoothest ride, but I think we are going to 547 00:31:47,440 --> 00:31:51,720 Speaker 1: see a lot more advocacy and movement on this front. 548 00:31:52,080 --> 00:31:55,560 Speaker 1: So I have high hopes, and I do look forward 549 00:31:55,720 --> 00:31:58,200 Speaker 1: to hearing that episode of The Breach once it publishes. 550 00:31:58,240 --> 00:32:01,680 Speaker 1: I believe it's coming out November fifth. Yeah, maybe a 551 00:32:01,680 --> 00:32:03,760 Speaker 1: couple of days sooner than that, but right before the election. 552 00:32:03,800 --> 00:32:06,760 Speaker 1: Just excellent to get you stirred up. All right, well, 553 00:32:06,800 --> 00:32:09,200 Speaker 1: I look forward to hearing that. Bob, thank you so 554 00:32:09,320 --> 00:32:13,320 Speaker 1: much for joining our show. Thanks for having me. I 555 00:32:13,360 --> 00:32:17,320 Speaker 1: have a bit more to say about the technology of voting, 556 00:32:17,400 --> 00:32:19,880 Speaker 1: so stick around, but first let's take a quick break 557 00:32:20,000 --> 00:32:30,640 Speaker 1: to thank our sponsor. Again, I want to thank Bob 558 00:32:30,680 --> 00:32:35,120 Speaker 1: Sullivan for joining us on the show, and for his insight 559 00:32:35,360 --> 00:32:42,080 Speaker 1: into the dangers of hacking and voting. We realize, of 560 00:32:42,120 --> 00:32:45,800 Speaker 1: course, that, you know, just the appearance of that danger 561 00:32:45,920 --> 00:32:49,080 Speaker 1: is enough to influence people, because we are human beings, 562 00:32:49,120 --> 00:32:53,520 Speaker 1: we're affected by our emotions, and if you introduce the 563 00:32:53,600 --> 00:32:59,440 Speaker 1: thought that perhaps the system is broken, that has real consequences.
564 00:32:59,800 --> 00:33:02,520 Speaker 1: But before I go, I wanted to give a quick rundown, 565 00:33:02,600 --> 00:33:07,239 Speaker 1: sort of a, uh, an overview of the advance of 566 00:33:07,320 --> 00:33:10,760 Speaker 1: technology when it comes to voting and how that has 567 00:33:10,880 --> 00:33:15,360 Speaker 1: changed over the years. The mechanical lever machines that have 568 00:33:15,440 --> 00:33:20,240 Speaker 1: been used in voting first debuted back in eighteen eighty nine. 569 00:33:20,280 --> 00:33:23,320 Speaker 1: They were patented by a guy named Jacob H. Myers, 570 00:33:23,760 --> 00:33:27,040 Speaker 1: and the technology actually became known as the Myers Automatic Booth. 571 00:33:27,560 --> 00:33:29,520 Speaker 1: The first time it would actually be used in an 572 00:33:29,520 --> 00:33:33,360 Speaker 1: election was eighteen ninety two in Lockport, New York, and 573 00:33:33,400 --> 00:33:37,760 Speaker 1: then you started to see other lever-based voting machines 574 00:33:38,120 --> 00:33:43,760 Speaker 1: arise in the decades following. Optical scans wouldn't be a 575 00:33:43,800 --> 00:33:48,040 Speaker 1: thing until nineteen sixty two. The first optical scan voting 576 00:33:48,040 --> 00:33:52,040 Speaker 1: ballot was used in Kern City, California. This would be 577 00:33:52,120 --> 00:33:54,280 Speaker 1: the sort of scan that you would see in 578 00:33:54,760 --> 00:33:56,720 Speaker 1: something like an SAT or 579 00:33:57,080 --> 00:33:59,560 Speaker 1: a standardized test, where you have to 580 00:33:59,600 --> 00:34:02,520 Speaker 1: fill in bubbles. The idea being that you're using a 581 00:34:02,600 --> 00:34:07,400 Speaker 1: ballot and a system that relies upon light 582 00:34:07,720 --> 00:34:11,600 Speaker 1: to read that ballot, to detect which votes have 583 00:34:11,719 --> 00:34:15,880 Speaker 1: been cast, and then record them. Uh.
Then the next 584 00:34:16,680 --> 00:34:20,720 Speaker 1: moment in tech comes from my hometown of Atlanta, Georgia. 585 00:34:20,880 --> 00:34:24,320 Speaker 1: We have lots of counties that make up the metro 586 00:34:24,400 --> 00:34:28,520 Speaker 1: Atlanta area, but the two main ones are Fulton and 587 00:34:28,560 --> 00:34:31,279 Speaker 1: DeKalb, for the heart of the city. Those were the 588 00:34:31,320 --> 00:34:34,560 Speaker 1: first two counties in the United States to use punch 589 00:34:34,640 --> 00:34:38,240 Speaker 1: cards with computer tally machines. That was in nineteen sixty four. 590 00:34:38,960 --> 00:34:43,320 Speaker 1: The following year, Joseph P. Harris and William Rouverol patented 591 00:34:43,360 --> 00:34:47,600 Speaker 1: the Votomatic punch card system. That was a system where 592 00:34:47,640 --> 00:34:51,600 Speaker 1: you would actually have two things to reference. You 593 00:34:51,600 --> 00:34:53,600 Speaker 1: had your ballot, but you also had to have a 594 00:34:53,600 --> 00:34:57,680 Speaker 1: little booklet that told you which numbers on the ballot 595 00:34:57,719 --> 00:35:02,279 Speaker 1: would correspond with which candidates in the election. So you 596 00:35:02,360 --> 00:35:06,720 Speaker 1: might say choice one represents one candidate, choice two represents 597 00:35:06,760 --> 00:35:09,120 Speaker 1: another candidate. You look at the ballot; the ballot would 598 00:35:09,160 --> 00:35:11,759 Speaker 1: not have the names of the candidates on there, just 599 00:35:11,840 --> 00:35:14,759 Speaker 1: the numbers, and you would punch the corresponding numbers out. 600 00:35:15,200 --> 00:35:17,759 Speaker 1: This was actually considered to be a superior system in 601 00:35:17,800 --> 00:35:20,879 Speaker 1: that the computer tally machines could handle these sorts 602 00:35:20,880 --> 00:35:25,440 Speaker 1: of ballots much more easily than earlier ones.
However, you 603 00:35:26,120 --> 00:35:30,640 Speaker 1: could imagine that that also introduced a bit of a 604 00:35:30,680 --> 00:35:33,520 Speaker 1: barrier for people who were trying to use the system, 605 00:35:33,640 --> 00:35:36,200 Speaker 1: to make sure that the number that they were punching 606 00:35:36,239 --> 00:35:38,680 Speaker 1: out actually corresponded with the choice they wanted to make. 607 00:35:39,160 --> 00:35:44,600 Speaker 1: In nineteen seventy four, a group of inventors got a patent 608 00:35:44,719 --> 00:35:47,480 Speaker 1: for a system they called Video Voter, which was a 609 00:35:47,520 --> 00:35:51,200 Speaker 1: direct recording electronic voting machine, or DRE. Uh, 610 00:35:51,320 --> 00:35:53,680 Speaker 1: this was likely the first design of a DRE 611 00:35:53,719 --> 00:35:55,960 Speaker 1: that was used in a real election. It was 612 00:35:56,000 --> 00:35:57,840 Speaker 1: first used in nineteen seventy five in a couple of 613 00:35:57,880 --> 00:36:02,160 Speaker 1: Illinois locations. Also in nineteen seventy five, the United States would conduct a 614 00:36:02,160 --> 00:36:04,760 Speaker 1: study and would issue a report that had the title 615 00:36:05,040 --> 00:36:09,160 Speaker 1: Effective Use of Computing Technology in Vote Tallying. And in 616 00:36:09,200 --> 00:36:15,239 Speaker 1: that report they investigated concepts like security, uh, system functionality, 617 00:36:15,280 --> 00:36:19,520 Speaker 1: system design, and also the ability to audit systems, which, 618 00:36:19,719 --> 00:36:22,400 Speaker 1: if you were listening to Mr. Sullivan, you realize is 619 00:36:22,440 --> 00:36:25,439 Speaker 1: a very, very important part of the process in order 620 00:36:25,440 --> 00:36:29,160 Speaker 1: for people to have confidence that the system works.
The 621 00:36:29,320 --> 00:36:32,759 Speaker 1: R. F. Shoup Corporation (I actually don't know 622 00:36:32,760 --> 00:36:35,439 Speaker 1: how to pronounce it; it's S-H-O-U-P) would 623 00:36:35,480 --> 00:36:41,400 Speaker 1: produce the Shouptronic ELECTronic voting machine. 624 00:36:41,560 --> 00:36:45,719 Speaker 1: I say "elect" that way because the "ELECT" in ELECTronic 625 00:36:45,880 --> 00:36:48,520 Speaker 1: is all in capital letters. It was a push-button 626 00:36:48,600 --> 00:36:50,879 Speaker 1: DRE voting machine, so not a touch screen 627 00:36:50,920 --> 00:36:54,000 Speaker 1: but a push-button machine, but it was electronic. It 628 00:36:54,040 --> 00:36:58,160 Speaker 1: was the first to achieve real commercial success. In nineteen eighty eight, another 629 00:36:58,200 --> 00:37:00,960 Speaker 1: report comes out. This one's called Accuracy, Integrity, 630 00:37:00,960 --> 00:37:04,920 Speaker 1: and Security in Computerized Vote-Tallying. It was written by Roy Saltman, 631 00:37:05,280 --> 00:37:08,400 Speaker 1: and it warned that punch cards with pre-scored sections 632 00:37:08,400 --> 00:37:09,880 Speaker 1: (you know, a punch card that sort of has 633 00:37:09,920 --> 00:37:13,120 Speaker 1: little perforated areas for you to punch out) could 634 00:37:13,200 --> 00:37:16,600 Speaker 1: be unreliable, and that would come into sharp focus in 635 00:37:16,640 --> 00:37:19,279 Speaker 1: the wake of the two thousand US presidential election. You 636 00:37:19,360 --> 00:37:23,160 Speaker 1: heard Mr. Sullivan reference that with the hanging chads; that's 637 00:37:23,200 --> 00:37:27,960 Speaker 1: what that refers to. In nineteen ninety, the Federal Election Commission released 638 00:37:28,040 --> 00:37:31,080 Speaker 1: the first set of standards for computer-based voting.
This 639 00:37:31,160 --> 00:37:33,960 Speaker 1: was called Performance and Test Standards for Punch Card, Mark 640 00:37:34,040 --> 00:37:37,840 Speaker 1: Sense, and Direct Recording Electronic Voting Systems. Mark sense, by 641 00:37:37,840 --> 00:37:41,560 Speaker 1: the way, referred to, uh, an optical scanning technology, and 642 00:37:41,600 --> 00:37:44,520 Speaker 1: those standards would receive updates in two thousand two with 643 00:37:44,640 --> 00:37:49,400 Speaker 1: the Help America Vote Act. In nineteen ninety six, the Reform 644 00:37:49,440 --> 00:37:54,719 Speaker 1: Party was able to vote in their presidential primary over 645 00:37:54,760 --> 00:37:58,360 Speaker 1: the internet. This was, you know, a small third party, 646 00:37:58,760 --> 00:38:02,680 Speaker 1: small in comparison to other parties, but it did 647 00:38:02,680 --> 00:38:06,040 Speaker 1: show someone trying to make use of the Internet as 648 00:38:06,280 --> 00:38:11,080 Speaker 1: a tool for elections. But again, just a primary, 649 00:38:11,160 --> 00:38:15,080 Speaker 1: not an actual election election. In two thousand three, 650 00:38:15,360 --> 00:38:17,479 Speaker 1: there was a group of computer scientists who released 651 00:38:17,480 --> 00:38:19,880 Speaker 1: a report after analyzing a specific model of a DRE 652 00:38:20,000 --> 00:38:23,360 Speaker 1: voting machine and finding vulnerabilities that could potentially 653 00:38:23,360 --> 00:38:26,920 Speaker 1: be exploited. So at least as early as two thousand three, 654 00:38:27,320 --> 00:38:33,799 Speaker 1: you had concerned data security experts raising this possibility. Uh, 655 00:38:33,840 --> 00:38:36,520 Speaker 1: and in two thousand four, we had that moment where 656 00:38:36,760 --> 00:38:40,640 Speaker 1: the CEO of Diebold wrote that letter that, again, 657 00:38:40,760 --> 00:38:44,640 Speaker 1: raised more doubt about the system. Uh.
Diebold would 658 00:38:44,640 --> 00:38:46,759 Speaker 1: get out of the voting machine business in two thousand nine. 659 00:38:46,800 --> 00:38:49,800 Speaker 1: They sold off their division to a company called Election 660 00:38:49,920 --> 00:38:53,279 Speaker 1: Systems and Software, Incorporated, which today controls a 661 00:38:53,360 --> 00:38:55,800 Speaker 1: large share of the voting machine market in the 662 00:38:55,880 --> 00:39:00,799 Speaker 1: United States. Kevin Shelley, the Secretary of State of California, 663 00:39:00,840 --> 00:39:04,440 Speaker 1: would ban touchscreen electronic voting machines in four counties in 664 00:39:04,440 --> 00:39:09,080 Speaker 1: California and decertify all existing systems pending security improvements. 665 00:39:09,080 --> 00:39:11,759 Speaker 1: This also happened in two thousand four. That same year, 666 00:39:12,000 --> 00:39:14,960 Speaker 1: Nevada would pass a mandate that would require all electronic 667 00:39:15,000 --> 00:39:18,400 Speaker 1: voting machines used in federal elections to produce a paper 668 00:39:18,440 --> 00:39:23,160 Speaker 1: audit trail, something again that Mr. Sullivan argued for. In two 669 00:39:23,160 --> 00:39:27,520 Speaker 1: thousand four, a company called UniLect got into some trouble.
670 00:39:27,600 --> 00:39:31,560 Speaker 1: They manufactured some voting machines that were used in Carteret 671 00:39:31,640 --> 00:39:35,439 Speaker 1: County in North Carolina, and according to the company, each 672 00:39:35,520 --> 00:39:38,600 Speaker 1: voting machine could store up to ten thousand five hundred votes, 673 00:39:39,000 --> 00:39:41,960 Speaker 1: but in fact they could only store three thousand five votes, 674 00:39:42,600 --> 00:39:47,200 Speaker 1: and worse than that, they appeared to accept votes beyond 675 00:39:47,480 --> 00:39:51,640 Speaker 1: three thousand five, but those weren't actually recorded, so 676 00:39:51,760 --> 00:39:54,360 Speaker 1: the machine did not give any indication that its memory 677 00:39:54,480 --> 00:39:58,160 Speaker 1: was full, and ultimately four thousand four hundred thirty-eight 678 00:39:58,239 --> 00:40:01,120 Speaker 1: votes were lost. And when you think about it, that 679 00:40:01,160 --> 00:40:05,440 Speaker 1: means that more than half of all votes were lost. 680 00:40:06,160 --> 00:40:09,640 Speaker 1: Less than half of the votes in that voting precinct were recorded, 681 00:40:10,080 --> 00:40:13,080 Speaker 1: and that's pretty terrible. In two thousand five, a company called 682 00:40:13,320 --> 00:40:15,880 Speaker 1: Black Box Voting would set up a mock election in 683 00:40:15,960 --> 00:40:19,640 Speaker 1: Florida, and security experts were invited to try and 684 00:40:19,719 --> 00:40:23,240 Speaker 1: hack the voting machine system that they were using, and 685 00:40:23,680 --> 00:40:26,359 Speaker 1: they were able to change the outcome of this mock 686 00:40:26,480 --> 00:40:30,359 Speaker 1: election and leave no trace of their presence behind, and 687 00:40:30,400 --> 00:40:33,439 Speaker 1: it demonstrated how some systems can be vulnerable.
The same 688 00:40:33,440 --> 00:40:36,040 Speaker 1: company would demonstrate the following year how some systems have 689 00:40:36,120 --> 00:40:39,920 Speaker 1: backdoor vulnerabilities which can be exploited, and software can be 690 00:40:39,960 --> 00:40:44,680 Speaker 1: injected to activate months or years later to alter election results. 691 00:40:45,280 --> 00:40:47,439 Speaker 1: So if you were able to get access to these 692 00:40:47,480 --> 00:40:50,920 Speaker 1: machines half a year or a year before an election, 693 00:40:51,400 --> 00:40:54,719 Speaker 1: then you could actually make the alterations, and in the 694 00:40:54,880 --> 00:40:58,040 Speaker 1: time span in between, when security starts to increase as the 695 00:40:58,080 --> 00:41:01,160 Speaker 1: elections are coming up, no one knows that the machine has 696 00:41:01,160 --> 00:41:05,080 Speaker 1: already been compromised. In two thousand six, Dr. Edward Felten 697 00:41:05,200 --> 00:41:07,560 Speaker 1: showed that if you really know what you're doing, you 698 00:41:07,600 --> 00:41:11,120 Speaker 1: could install malware on a Diebold electronic voting machine in 699 00:41:11,200 --> 00:41:14,120 Speaker 1: less than a minute. Now, you had to have physical 700 00:41:14,120 --> 00:41:17,640 Speaker 1: access to the machine, so that does make it more challenging. 701 00:41:17,680 --> 00:41:19,360 Speaker 1: You do have to get that access. But if you 702 00:41:19,440 --> 00:41:22,120 Speaker 1: knew what you were doing and you had the physical access, 703 00:41:22,160 --> 00:41:24,960 Speaker 1: in less than sixty seconds you could install malware that 704 00:41:25,000 --> 00:41:28,359 Speaker 1: would alter votes, stealing votes from one candidate to give 705 00:41:28,360 --> 00:41:30,080 Speaker 1: to another, and it could also act as a 706 00:41:30,160 --> 00:41:33,480 Speaker 1: virus and spread to other machines if they were sharing 707 00:41:33,520 --> 00:41:37,200 Speaker 1: a network.
So pretty nasty stuff. In two thousand seven, 708 00:41:37,239 --> 00:41:40,480 Speaker 1: the Ohio Secretary of State Jennifer L. Brunner would authorize 709 00:41:40,480 --> 00:41:44,960 Speaker 1: a study of these direct recording electronic voting systems in Ohio, 710 00:41:45,239 --> 00:41:48,560 Speaker 1: and the report would conclude that none of the computer 711 00:41:48,640 --> 00:41:51,839 Speaker 1: based systems in that state met security standards and all 712 00:41:51,880 --> 00:41:56,720 Speaker 1: were susceptible to breaches. In two thousand eleven, Argonne National 713 00:41:56,840 --> 00:42:00,200 Speaker 1: Laboratory security experts would demonstrate that they could hack a 714 00:42:00,320 --> 00:42:03,600 Speaker 1: DRE machine via remote control, but that hack 715 00:42:03,640 --> 00:42:05,960 Speaker 1: again would require getting physical access to the machine. You 716 00:42:05,960 --> 00:42:08,400 Speaker 1: had to install a component in the machine. Once the 717 00:42:08,400 --> 00:42:11,719 Speaker 1: component was in there and connected to the system, then 718 00:42:11,760 --> 00:42:14,560 Speaker 1: you could access the system from up to half a 719 00:42:14,600 --> 00:42:19,040 Speaker 1: mile away and change things around using a remote control.
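[Editor's note: the Carteret County failure described earlier comes down to one missing "storage full" check. A minimal sketch of that failure mode, in hypothetical Python rather than the vendor's actual firmware; only the capacity and ballot counts come from the episode, everything else is illustrative.]

```python
# Sketch of the Carteret County failure mode: a machine that keeps
# accepting ballots after its storage is full, without any warning.
# Capacity (3,005) and ballots cast (7,443) are from the episode;
# the classes themselves are hypothetical, not UniLect's real code.

class BuggyMachine:
    """Accepts every ballot, but silently drops those beyond capacity."""
    def __init__(self, capacity=3005):
        self.capacity = capacity
        self.stored = []

    def cast(self, ballot):
        if len(self.stored) < self.capacity:
            self.stored.append(ballot)
        # Bug: no else branch. The voter sees no error, the poll
        # workers see no "memory full" indication, and the ballot
        # is simply never recorded.

class SafeMachine(BuggyMachine):
    """Same storage, but refuses further ballots once memory is full."""
    def cast(self, ballot):
        if len(self.stored) >= self.capacity:
            # Failing loudly lets poll workers switch to paper ballots
            # instead of losing votes without a trace.
            raise MemoryError("storage full, alert poll workers")
        self.stored.append(ballot)

buggy = BuggyMachine()
for i in range(7443):              # every ballot cast that day
    buggy.cast(i)
lost = 7443 - len(buggy.stored)
print(lost)                        # 4438 ballots vanish, more than half
```

The fix is trivial once stated, which is the point the episode makes: the damage came not from the capacity limit itself but from the machine's failure to signal it.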
720 00:42:19,480 --> 00:42:24,000 Speaker 1: In June two thousand seventeen, that's when the first hackathon 721 00:42:24,160 --> 00:42:28,720 Speaker 1: at Defcon really focused on voting machines, and according 722 00:42:28,760 --> 00:42:33,000 Speaker 1: to the story I was reading, a Danish hacker was 723 00:42:33,040 --> 00:42:36,920 Speaker 1: able to compromise one of the machines that were there 724 00:42:36,960 --> 00:42:39,919 Speaker 1: for the hackers to work on while the presenters were 725 00:42:39,920 --> 00:42:44,080 Speaker 1: still introducing the event. So they hadn't even concluded their 726 00:42:44,080 --> 00:42:47,760 Speaker 1: introductory comments when one of the hackers had already managed 727 00:42:47,800 --> 00:42:49,799 Speaker 1: to compromise one of the machines. By the end of 728 00:42:49,800 --> 00:42:54,320 Speaker 1: the day, the various hackers had discovered and exploited eighteen 729 00:42:54,400 --> 00:42:59,280 Speaker 1: new vulnerabilities in various e-voting and e-poll book systems. 730 00:42:59,320 --> 00:43:04,760 Speaker 1: So this is obviously a matter of major concern, and 731 00:43:05,080 --> 00:43:10,680 Speaker 1: the Russian hacking scandal has done nothing but make that 732 00:43:10,760 --> 00:43:15,200 Speaker 1: even more apparent. The Department of Homeland Security 733 00:43:15,200 --> 00:43:18,719 Speaker 1: initially stated that twenty-one different states' voting systems were 734 00:43:18,719 --> 00:43:22,560 Speaker 1: targeted by Russian hackers in two thousand sixteen. Georgia 735 00:43:22,680 --> 00:43:25,000 Speaker 1: was not among them in that report, according to the 736 00:43:25,080 --> 00:43:29,680 Speaker 1: Secretary of State. But then there's the investigation that Robert Mueller 737 00:43:30,080 --> 00:43:34,240 Speaker 1: is conducting into Russian hacking.
It included 738 00:43:34,280 --> 00:43:38,280 Speaker 1: the indictment of twelve Russian military officers who were connected 739 00:43:38,360 --> 00:43:43,640 Speaker 1: to election tampering allegations, and that indictment revealed that the 740 00:43:43,719 --> 00:43:47,520 Speaker 1: charges included probing attacks on county websites in Georgia, thus 741 00:43:48,239 --> 00:43:51,320 Speaker 1: showing that perhaps Georgia did not get away scot-free 742 00:43:51,520 --> 00:43:55,040 Speaker 1: during that election and was possibly targeted. This is 743 00:43:55,080 --> 00:43:59,799 Speaker 1: what led to that investigation revolving around the server in 744 00:44:00,000 --> 00:44:04,160 Speaker 1: Georgia that may have been targeted and tampered with, but 745 00:44:04,239 --> 00:44:09,320 Speaker 1: then was subsequently wiped, and then the backups were also wiped. 746 00:44:09,320 --> 00:44:12,200 Speaker 1: And there's a whole scandal about that. I'm not going 747 00:44:12,239 --> 00:44:13,719 Speaker 1: to go into that because I know you guys have 748 00:44:13,800 --> 00:44:16,719 Speaker 1: heard way too much about Georgia on a show that's 749 00:44:16,719 --> 00:44:20,680 Speaker 1: just about technology in general. I want to leave off 750 00:44:20,760 --> 00:44:24,080 Speaker 1: by echoing something that Mr. Sullivan said, which is that 751 00:44:24,200 --> 00:44:29,879 Speaker 1: this process is incredibly important. And while there are real 752 00:44:30,320 --> 00:44:33,520 Speaker 1: problems here, challenges that we need to overcome, they are 753 00:44:34,080 --> 00:44:40,680 Speaker 1: totally solvable problems. These are not insurmountable. The things 754 00:44:40,719 --> 00:44:43,440 Speaker 1: that are standing in our way are willpower and money.
755 00:44:43,520 --> 00:44:48,760 Speaker 1: So we have to have legislatures that are willing 756 00:44:48,800 --> 00:44:52,920 Speaker 1: to budget the money necessary to put the proper systems 757 00:44:52,920 --> 00:44:55,439 Speaker 1: in place. We need to have the willpower to make 758 00:44:55,480 --> 00:45:00,600 Speaker 1: sure that those are done responsibly and that no one 759 00:45:00,760 --> 00:45:03,640 Speaker 1: is being told not to vote, that we aren't suppressing 760 00:45:03,680 --> 00:45:08,759 Speaker 1: anyone's vote. And that goes for any political side. It 761 00:45:08,800 --> 00:45:13,080 Speaker 1: doesn't matter to me whether you share my personal political 762 00:45:13,080 --> 00:45:16,200 Speaker 1: philosophy or not. What matters to me is that you 763 00:45:16,239 --> 00:45:19,240 Speaker 1: are able to express it by voting, 764 00:45:19,320 --> 00:45:24,560 Speaker 1: and that you are not kept from that activity, 765 00:45:24,600 --> 00:45:27,920 Speaker 1: that you are able to participate fully in the democratic process. 766 00:45:28,120 --> 00:45:31,760 Speaker 1: That to me is the most important part. 767 00:45:31,800 --> 00:45:36,480 Speaker 1: And if it comes out that the political 768 00:45:36,480 --> 00:45:39,719 Speaker 1: philosophy I believe in is on the losing side, if 769 00:45:39,800 --> 00:45:42,719 Speaker 1: the election appears to be completely legitimate, that it's the 770 00:45:42,719 --> 00:45:44,760 Speaker 1: will of the people 771 00:45:45,160 --> 00:45:46,960 Speaker 1: and I just happen to be on the losing side,
772 00:45:47,719 --> 00:45:51,080 Speaker 1: I can reconcile that a lot more easily than I 773 00:45:51,160 --> 00:45:55,640 Speaker 1: can thinking I'll never know if the system reflects what 774 00:45:55,680 --> 00:45:58,560 Speaker 1: the people really wanted, or if the system failed us 775 00:45:58,640 --> 00:46:03,680 Speaker 1: because of either inherent vulnerabilities in that system or the 776 00:46:03,760 --> 00:46:08,719 Speaker 1: perception of those vulnerabilities. It's a complicated thing. 777 00:46:08,920 --> 00:46:11,400 Speaker 1: But again, I want to thank Mr. Sullivan for joining 778 00:46:11,719 --> 00:46:15,279 Speaker 1: the podcast and sharing his expertise and his thoughts on 779 00:46:15,320 --> 00:46:18,719 Speaker 1: the matter. I recommend you check out the Breach podcast. 780 00:46:18,719 --> 00:46:21,040 Speaker 1: I think you will really enjoy it. If you guys 781 00:46:21,040 --> 00:46:23,279 Speaker 1: have suggestions for future episodes of tech Stuff, you should 782 00:46:23,280 --> 00:46:25,640 Speaker 1: send me an email. The address is tech Stuff at how 783 00:46:25,719 --> 00:46:27,880 Speaker 1: stuff works dot com, or you can drop me a 784 00:46:27,920 --> 00:46:29,719 Speaker 1: line on Facebook or Twitter. The handle at both of 785 00:46:29,760 --> 00:46:33,280 Speaker 1: those is tech Stuff hs W. You can go over 786 00:46:33,320 --> 00:46:36,600 Speaker 1: to our merchandise store at t public dot com 787 00:46:36,640 --> 00:46:39,800 Speaker 1: slash tech Stuff. Check that out. Don't forget to follow 788 00:46:39,920 --> 00:46:43,960 Speaker 1: us on Instagram, and I'll talk to you again really 789 00:46:44,040 --> 00:46:52,200 Speaker 1: soon. For more on this and thousands of other topics, 790 00:46:52,280 --> 00:47:01,080 Speaker 1: visit how stuff works dot com.