1 00:00:04,400 --> 00:00:07,760 Speaker 1: Welcome to TechStuff, a production from iHeartRadio. 2 00:00:12,000 --> 00:00:14,280 Speaker 1: Hey there, and welcome to TechStuff. I'm your host, 3 00:00:14,400 --> 00:00:17,400 Speaker 1: Jonathan Strickland. I'm an executive producer with iHeartRadio 4 00:00:17,440 --> 00:00:19,639 Speaker 1: and a love of all things tech. And if you 5 00:00:19,720 --> 00:00:23,119 Speaker 1: listened to last Wednesday's episode, you heard a classic episode 6 00:00:23,160 --> 00:00:27,080 Speaker 1: of TechStuff about electronic voting machines. The episode published 7 00:00:27,080 --> 00:00:31,440 Speaker 1: back in October twenty sixteen, and Mr. Ben Bowlin of 8 00:00:31,480 --> 00:00:33,880 Speaker 1: Stuff They Don't Want You to Know and Ridiculous History 9 00:00:33,960 --> 00:00:37,400 Speaker 1: fame joined me for that discussion. We are going to 10 00:00:37,479 --> 00:00:42,080 Speaker 1: continue that discussion today. This is part two of that episode, 11 00:00:42,680 --> 00:00:47,360 Speaker 1: and we are going to continue to dive down the 12 00:00:47,440 --> 00:00:55,680 Speaker 1: tricky and potentially disastrous pathway to creating voting systems. The 13 00:00:55,760 --> 00:00:58,760 Speaker 1: idea is that if you create a voting system that is 14 00:00:58,880 --> 00:01:04,280 Speaker 1: perceived to be unfair, then you're really undermining the entire 15 00:01:04,600 --> 00:01:10,160 Speaker 1: foundation of democracy. And I know that sounds like I'm exaggerating, 16 00:01:10,200 --> 00:01:12,920 Speaker 1: but I'm not, because if we lose faith in the system, 17 00:01:12,920 --> 00:01:16,880 Speaker 1: if we do not believe the system works, then it 18 00:01:16,920 --> 00:01:19,600 Speaker 1: becomes a self-fulfilling prophecy. Of course, we have to 19 00:01:19,640 --> 00:01:22,039 Speaker 1: make sure that the system does work.
If the system 20 00:01:22,080 --> 00:01:24,200 Speaker 1: is not working, then we absolutely have to fix it. 21 00:01:24,200 --> 00:01:27,280 Speaker 1: So it is a very tricky subject; Ben and I 22 00:01:27,319 --> 00:01:30,880 Speaker 1: will go into much more detail. I hope you 23 00:01:30,880 --> 00:01:32,960 Speaker 1: guys are doing well. I know this is still a 24 00:01:33,040 --> 00:01:37,080 Speaker 1: very trying and challenging time for so many reasons. I 25 00:01:37,120 --> 00:01:39,600 Speaker 1: want you guys to know I believe in you. I 26 00:01:39,640 --> 00:01:43,000 Speaker 1: think that you guys are awesome. Uh, you all have 27 00:01:43,200 --> 00:01:46,680 Speaker 1: proven that over and over. So let's sit back and 28 00:01:46,720 --> 00:01:49,960 Speaker 1: listen to this classic episode about the scary world of 29 00:01:50,080 --> 00:01:53,960 Speaker 1: EVMs, part two. At the close of 30 00:01:53,960 --> 00:01:57,640 Speaker 1: our conversation, we cast our votes, and we decided democratically 31 00:01:58,080 --> 00:02:00,640 Speaker 1: that this would be better as a two-part series 32 00:02:00,640 --> 00:02:05,040 Speaker 1: than a single epic-length episode that would have stretched 33 00:02:05,200 --> 00:02:11,520 Speaker 1: the patience of the most forgiving of listeners. 34 00:02:12,040 --> 00:02:14,280 Speaker 1: Now we have broken it up into two parts, so 35 00:02:14,360 --> 00:02:17,600 Speaker 1: what you are about to hear is the conclusion of 36 00:02:17,639 --> 00:02:19,360 Speaker 1: that two-parter.
Also, I should add that when we 37 00:02:19,400 --> 00:02:22,799 Speaker 1: get to the end and we sign off, we did 38 00:02:22,840 --> 00:02:25,040 Speaker 1: not know at that point that it was going to 39 00:02:25,080 --> 00:02:28,000 Speaker 1: be a two-parter, so we actually have already recorded 40 00:02:28,040 --> 00:02:30,720 Speaker 1: the end of the episode that I'm recording the beginning 41 00:02:30,720 --> 00:02:33,360 Speaker 1: of now. And yes, time travel does confuse me. So 42 00:02:33,440 --> 00:02:37,560 Speaker 1: let's just segue into part two of how electronic 43 00:02:37,639 --> 00:02:41,200 Speaker 1: voting machines are going to ruin everything. Remember we mentioned 44 00:02:41,240 --> 00:02:43,200 Speaker 1: that in two thousand two, that's when they started to 45 00:02:43,240 --> 00:02:46,120 Speaker 1: appear on the scene. That's when Georgia... actually, they started 46 00:02:46,160 --> 00:02:48,320 Speaker 1: to appear before two thousand two, but two thousand two 47 00:02:48,320 --> 00:02:51,119 Speaker 1: is when Georgia became the first state to have just 48 00:02:51,400 --> 00:02:56,440 Speaker 1: the electronic voting machines out there in the polling places. Um, 49 00:02:56,480 --> 00:02:59,800 Speaker 1: many of those machines have not been updated since, and 50 00:03:00,200 --> 00:03:02,160 Speaker 1: almost all of them are running on a version of 51 00:03:02,200 --> 00:03:06,080 Speaker 1: Windows XP, which hasn't been updated since April two thousand fourteen. 52 00:03:07,280 --> 00:03:11,160 Speaker 1: So that means any security vulnerability that has been discovered 53 00:03:11,200 --> 00:03:15,040 Speaker 1: since two thousand and fourteen is still there.
Chances are 54 00:03:15,080 --> 00:03:17,840 Speaker 1: some of the ones before two thousand fourteen are still 55 00:03:17,880 --> 00:03:19,680 Speaker 1: there because I bet a lot of those machines have 56 00:03:19,840 --> 00:03:23,560 Speaker 1: never received a patch, right? They're just these big, 57 00:03:23,720 --> 00:03:27,560 Speaker 1: these machines that are being overseen by the government, and, uh, 58 00:03:27,720 --> 00:03:30,680 Speaker 1: you know, chances are a lot of them are unpatched. 59 00:03:30,800 --> 00:03:33,640 Speaker 1: So there's a lot of potential for people to tamper. 60 00:03:33,760 --> 00:03:36,200 Speaker 1: And not only that, you know, I mentioned the idea 61 00:03:36,320 --> 00:03:41,280 Speaker 1: of either unconsciously or purposefully inserting a bias. That's a 62 00:03:41,320 --> 00:03:45,040 Speaker 1: real concern too, right, because private companies are the ones 63 00:03:45,120 --> 00:03:50,280 Speaker 1: designing these machines. They do so using proprietary software 64 00:03:50,680 --> 00:03:55,280 Speaker 1: that is not visible to the general public, nor is 65 00:03:55,480 --> 00:03:59,320 Speaker 1: it, uh, what's the word, in any 66 00:03:59,320 --> 00:04:03,160 Speaker 1: way compelled to answer to the general public. Like, the 67 00:04:03,200 --> 00:04:08,040 Speaker 1: public, the quote unquote capital-P Public, 68 00:04:08,400 --> 00:04:13,400 Speaker 1: doesn't have a vote regarding which companies run these machines. 69 00:04:14,320 --> 00:04:18,680 Speaker 1: You look at this and you see that everyone says, well, 70 00:04:18,720 --> 00:04:21,760 Speaker 1: we can't reveal our code because if we did, uh, 71 00:04:21,880 --> 00:04:24,479 Speaker 1: then our competitors would see how we do things, and 72 00:04:24,520 --> 00:04:27,839 Speaker 1: then we can't compete in the marketplace anymore.
And so 73 00:04:28,400 --> 00:04:31,159 Speaker 1: meanwhile you've got security experts saying this should all be 74 00:04:31,240 --> 00:04:33,760 Speaker 1: open source. And the reason it should be open source 75 00:04:33,920 --> 00:04:36,719 Speaker 1: is so that the community at large would have the 76 00:04:36,720 --> 00:04:39,479 Speaker 1: opportunity to look at the code and see if there 77 00:04:39,520 --> 00:04:43,599 Speaker 1: are any vulnerabilities and, if so, make sure that those 78 00:04:43,640 --> 00:04:45,760 Speaker 1: get patched before you get to a point where the 79 00:04:45,760 --> 00:04:49,440 Speaker 1: code is being put into play in an actual election, 80 00:04:50,000 --> 00:04:53,599 Speaker 1: so that those vulnerabilities can't be exploited. Because if something 81 00:04:53,680 --> 00:04:56,800 Speaker 1: is proprietary and someone figures out an exploit for that software, 82 00:04:57,440 --> 00:05:00,080 Speaker 1: then it's all behind closed doors. There's no way to 83 00:05:00,120 --> 00:05:03,560 Speaker 1: address it, and you end up with a huge problem. 84 00:05:03,720 --> 00:05:07,200 Speaker 1: But beyond that, you have companies that could insert 85 00:05:07,240 --> 00:05:10,240 Speaker 1: a bias into the programming itself. And this kind of 86 00:05:10,279 --> 00:05:13,560 Speaker 1: stems from an interesting story that happened in the early 87 00:05:13,560 --> 00:05:19,400 Speaker 1: two thousands. So, two thousand three: Walden "Wally" O'Dell, and 88 00:05:19,440 --> 00:05:22,039 Speaker 1: I'm not making up the nickname Wally, that is what 89 00:05:22,200 --> 00:05:27,000 Speaker 1: he would go by. Wally O'Dell was at that time 90 00:05:27,160 --> 00:05:30,599 Speaker 1: the CEO and chairman of Diebold. Now, Diebold's best 91 00:05:30,640 --> 00:05:35,800 Speaker 1: known for making ATMs, but, yeah, well, 92 00:05:35,839 --> 00:05:39,560 Speaker 1: they also for a while made electronic voting machines.
There 93 00:05:39,600 --> 00:05:43,320 Speaker 1: was a subsidiary called Diebold Election Systems, though the company 94 00:05:43,360 --> 00:05:47,640 Speaker 1: has since divested itself of that particular property. But Diebold 95 00:05:47,640 --> 00:05:50,599 Speaker 1: Election Systems made a lot of electronic voting machines. And 96 00:05:50,640 --> 00:05:54,400 Speaker 1: Diebold, by the way, is headquartered in Ohio, a 97 00:05:54,640 --> 00:06:00,000 Speaker 1: very important state in national elections here in the United States. 98 00:06:00,040 --> 00:06:02,520 Speaker 1: A swing state, that's what we call it. Yep, swing 99 00:06:02,600 --> 00:06:05,640 Speaker 1: states, meaning... well, you know, I've probably got 100 00:06:05,640 --> 00:06:07,279 Speaker 1: some listeners who are outside the US. You may have 101 00:06:07,360 --> 00:06:09,080 Speaker 1: heard the term swing state, and you don't know what 102 00:06:09,120 --> 00:06:12,480 Speaker 1: that means. So, the way elections work in the US: 103 00:06:12,839 --> 00:06:16,640 Speaker 1: you know, you've got your major two-party system, and 104 00:06:17,200 --> 00:06:19,400 Speaker 1: ninety five percent of the time those are 105 00:06:19,400 --> 00:06:22,240 Speaker 1: the only parties that really make an impact.
So you've 106 00:06:22,240 --> 00:06:24,920 Speaker 1: got your Republicans and your Democrats, and you've got a 107 00:06:24,960 --> 00:06:28,640 Speaker 1: lot of states that lean toward one or the other, 108 00:06:28,760 --> 00:06:32,080 Speaker 1: and it is almost unheard of for them to support 109 00:06:32,640 --> 00:06:35,880 Speaker 1: a candidate of the opposite party. Sure, like, uh, 110 00:06:36,000 --> 00:06:40,239 Speaker 1: one example of this would be a state like Utah, 111 00:06:40,400 --> 00:06:45,440 Speaker 1: which would legendarily not vote Democratic, and a state like 112 00:06:45,640 --> 00:06:49,240 Speaker 1: Vermont, which would legendarily not vote Republican. Right. So it 113 00:06:49,240 --> 00:06:52,200 Speaker 1: would be an incredibly strange set of circumstances to see 114 00:06:52,200 --> 00:06:55,760 Speaker 1: one of these states support a candidate of the opposite 115 00:06:55,760 --> 00:06:58,880 Speaker 1: party from the one it tends to support. So for the most part, 116 00:06:58,920 --> 00:07:03,200 Speaker 1: we just assume those electors in the electoral college will 117 00:07:03,240 --> 00:07:08,400 Speaker 1: support the traditional candidates that those states would support. But 118 00:07:08,440 --> 00:07:13,080 Speaker 1: there's some states that could go either way. They could 119 00:07:13,160 --> 00:07:16,240 Speaker 1: swing towards the Republican side one election and swing towards 120 00:07:16,240 --> 00:07:22,040 Speaker 1: the Democratic side in another. Florida, Pennsylvania. So the 121 00:07:22,080 --> 00:07:25,520 Speaker 1: electors in these states, uh, it may be that the 122 00:07:25,640 --> 00:07:27,680 Speaker 1: ones that are chosen are 123 00:07:27,920 --> 00:07:32,239 Speaker 1: Republican in one election and Democrat in another election.
124 00:07:33,000 --> 00:07:35,440 Speaker 1: So Ohio is one of those states, which means it's 125 00:07:35,440 --> 00:07:38,440 Speaker 1: incredibly valuable. Right. It means it's a battleground state. You 126 00:07:38,480 --> 00:07:42,880 Speaker 1: want to win that state because you know you've got 127 00:07:42,920 --> 00:07:47,600 Speaker 1: your foundation, that's almost guaranteed. Nothing's ever a guarantee, 128 00:07:47,600 --> 00:07:51,280 Speaker 1: but it's almost a guarantee. And, uh, so you need 129 00:07:51,320 --> 00:07:53,520 Speaker 1: to really concentrate on the states where you have the 130 00:07:53,520 --> 00:07:56,080 Speaker 1: potential to flip it so that it's on your side and 131 00:07:56,120 --> 00:07:59,440 Speaker 1: not your opponent's side. Well, Ohio is one of those states. 132 00:08:00,000 --> 00:08:04,880 Speaker 1: Diebold has its headquarters in Ohio, and Wally, the CEO 133 00:08:05,640 --> 00:08:09,720 Speaker 1: of Diebold at that time, wrote a letter where he 134 00:08:09,880 --> 00:08:15,280 Speaker 1: was, uh, committing himself to, quote, helping Ohio deliver its 135 00:08:15,280 --> 00:08:18,920 Speaker 1: electoral votes to the president, end quote, president being George W. 136 00:08:19,120 --> 00:08:25,760 Speaker 1: Bush at that time. So the CEO of a company 137 00:08:25,800 --> 00:08:29,440 Speaker 1: that makes voting machines saying he's committed to delivering Ohio's 138 00:08:29,440 --> 00:08:32,760 Speaker 1: electoral votes to the president.
Now, the company was quick 139 00:08:32,800 --> 00:08:37,800 Speaker 1: to say, hey, he's not talking about using our products 140 00:08:37,880 --> 00:08:42,280 Speaker 1: to push the election; it was a personal fundraising letter, right, he's 141 00:08:42,280 --> 00:08:45,120 Speaker 1: talking about campaigning. But a lot of people said, hey, 142 00:08:45,360 --> 00:08:48,600 Speaker 1: isn't there a conflict of interest if you've got the 143 00:08:48,640 --> 00:08:52,160 Speaker 1: head of a company making the voting technology that people 144 00:08:52,200 --> 00:08:57,160 Speaker 1: depend upon also openly supporting a specific candidate in the race? 145 00:08:57,320 --> 00:09:02,080 Speaker 1: It's very close to a member of the 146 00:09:02,120 --> 00:09:05,760 Speaker 1: Supreme Court saying, like, I will do anything I can 147 00:09:06,120 --> 00:09:10,560 Speaker 1: to get you elected, George, and then saying, oh, 148 00:09:10,760 --> 00:09:13,679 Speaker 1: fiddle-dee-dee, we don't have time to recount the 149 00:09:13,760 --> 00:09:16,800 Speaker 1: votes from Florida, and, look, there's some things that 150 00:09:16,840 --> 00:09:19,880 Speaker 1: are more important than making absolutely certain that the people's 151 00:09:20,000 --> 00:09:26,240 Speaker 1: vote was reflected in the actual election. Like, reasons. I 152 00:09:26,400 --> 00:09:28,640 Speaker 1: just like saying fiddle-dee-dee. But I don't think 153 00:09:28,640 --> 00:09:31,600 Speaker 1: the Supremes say that. But you're right, the Supremes, 154 00:09:31,640 --> 00:09:35,800 Speaker 1: they sing. I referred to them as the Supremes. I 155 00:09:35,840 --> 00:09:39,959 Speaker 1: like that. John Roberts and the Supremes. So many wonderful visions 156 00:09:40,000 --> 00:09:42,920 Speaker 1: going through my head right now.
Oh man, you should 157 00:09:42,920 --> 00:09:45,320 Speaker 1: have seen the comedy show we did with 158 00:09:45,440 --> 00:09:49,959 Speaker 1: that sketch. Well, but anyway, that aside, 159 00:09:50,000 --> 00:09:54,280 Speaker 1: what you're saying is really important here, because let's 160 00:09:54,280 --> 00:09:56,560 Speaker 1: assume we were to take Wally at his word. 161 00:09:57,720 --> 00:10:02,920 Speaker 1: Even then, a PR firm would say the 162 00:10:02,960 --> 00:10:07,240 Speaker 1: optics look bad. So when you make those 163 00:10:07,280 --> 00:10:11,000 Speaker 1: sorts of statements, you have to be very conscious 164 00:10:11,120 --> 00:10:13,800 Speaker 1: of what is going to happen. This is, uh, this 165 00:10:13,880 --> 00:10:16,920 Speaker 1: is an environment wherein, uh, let's see, what year was this? 166 00:10:16,960 --> 00:10:19,160 Speaker 1: Two thousand three, right. Two thousand three was when he 167 00:10:19,200 --> 00:10:21,800 Speaker 1: wrote the letter; two thousand four was the election. So people 168 00:10:21,840 --> 00:10:26,400 Speaker 1: are already very sore from the two thousand election, 169 00:10:26,840 --> 00:10:31,240 Speaker 1: and we have seen an accelerating distrust in the American 170 00:10:31,280 --> 00:10:37,520 Speaker 1: political system for some time, you know, and this thing 171 00:10:37,640 --> 00:10:41,959 Speaker 1: is just exacerbating it. And I think in 172 00:10:42,120 --> 00:10:43,840 Speaker 1: one of the Wired pieces I read, it might have 173 00:10:43,920 --> 00:10:46,959 Speaker 1: even been the one you just referenced, uh, the 174 00:10:47,040 --> 00:10:49,840 Speaker 1: author of the piece made a really great point, which 175 00:10:49,880 --> 00:10:53,880 Speaker 1: was that you don't need any case of tampering or bias.
176 00:10:54,040 --> 00:10:58,199 Speaker 1: It doesn't have to exist, but the possibility 177 00:10:58,600 --> 00:11:02,240 Speaker 1: of it existing is enough to cause turmoil. Right, 178 00:11:02,440 --> 00:11:06,559 Speaker 1: you can cause real-world turmoil just from the possibility 179 00:11:06,559 --> 00:11:09,640 Speaker 1: of this being a thing without any evidence of it 180 00:11:09,720 --> 00:11:12,640 Speaker 1: actually happening. So you don't need to have evidence of 181 00:11:12,760 --> 00:11:16,120 Speaker 1: someone purposefully trying to rig the system in order for 182 00:11:16,160 --> 00:11:19,600 Speaker 1: people to not trust the system. As long as it's 183 00:11:19,679 --> 00:11:23,079 Speaker 1: clear that it's a possibility, that's enough to have people 184 00:11:23,120 --> 00:11:25,200 Speaker 1: say, like, well, how can I trust anything? How can 185 00:11:25,240 --> 00:11:28,640 Speaker 1: I trust the results? Because, you know, you're telling me 186 00:11:28,679 --> 00:11:31,120 Speaker 1: there's no evidence, but how do I know there's no evidence? 187 00:11:31,280 --> 00:11:36,079 Speaker 1: I know it's possible. Fool me twice... you can't get fooled again. Yeah, thanks, 188 00:11:36,280 --> 00:11:39,600 Speaker 1: it's a great quote that I'm not gonna 189 00:11:39,679 --> 00:11:41,840 Speaker 1: make any fun of. All right. So then we've got 190 00:11:41,880 --> 00:11:44,360 Speaker 1: the fear of hackers. We've got the fear of bias 191 00:11:44,440 --> 00:11:48,559 Speaker 1: from the standpoint of the manufacturers of the voting machines themselves, 192 00:11:48,800 --> 00:11:50,760 Speaker 1: and then we get the fear of hackers, third parties 193 00:11:50,800 --> 00:11:54,480 Speaker 1: that want to rig the results or prevent people from voting.
194 00:11:54,760 --> 00:12:01,240 Speaker 1: Completely possible, absolutely, completely possible, because, again, these systems are 195 00:12:01,280 --> 00:12:04,480 Speaker 1: not terribly secure. Um. Now you've got a couple of 196 00:12:04,480 --> 00:12:07,240 Speaker 1: different arguments about this, right? You've got people who support 197 00:12:07,360 --> 00:12:12,520 Speaker 1: DREs, and they say, listen, hacking, trying 198 00:12:12,640 --> 00:12:15,959 Speaker 1: to affect politics on a national level here in 199 00:12:15,960 --> 00:12:20,319 Speaker 1: the United States is a fool's errand for multiple reasons, 200 00:12:20,320 --> 00:12:23,040 Speaker 1: and a big one is that we have so many 201 00:12:23,080 --> 00:12:27,280 Speaker 1: different types of electronic voting machines out there, and they 202 00:12:27,360 --> 00:12:30,439 Speaker 1: are proprietary. They don't all work on the same software, 203 00:12:30,840 --> 00:12:34,560 Speaker 1: so you can't develop a universal approach to affect all 204 00:12:34,679 --> 00:12:37,960 Speaker 1: the machines. Now, if the entire United States, if all 205 00:12:38,000 --> 00:12:41,440 Speaker 1: of us used the exact same kind of machine across 206 00:12:41,520 --> 00:12:45,200 Speaker 1: all the states and the territories, then that could be 207 00:12:45,600 --> 00:12:49,760 Speaker 1: a potential vulnerability. If in fact they were also connected 208 00:12:49,800 --> 00:12:53,000 Speaker 1: to, like, the Internet, right, you would have an amazing 209 00:12:53,040 --> 00:12:55,520 Speaker 1: target, because you think, well, if I developed the right 210 00:12:55,559 --> 00:12:58,640 Speaker 1: kind of software, I could affect every single vote cast 211 00:12:58,720 --> 00:13:01,160 Speaker 1: in the United States. But that's not the reality of 212 00:13:01,160 --> 00:13:04,000 Speaker 1: our situation.
We have all these different types of machines, 213 00:13:04,280 --> 00:13:06,160 Speaker 1: some of which are connected to the Internet, some of 214 00:13:06,200 --> 00:13:10,280 Speaker 1: which aren't. You have them running on different types of software, 215 00:13:10,320 --> 00:13:13,800 Speaker 1: so you cannot create that one-size-fits-all approach 216 00:13:13,880 --> 00:13:16,280 Speaker 1: to affecting all of them. That being said, you could 217 00:13:16,280 --> 00:13:20,440 Speaker 1: still affect specific ones, like you could target specific regions 218 00:13:20,480 --> 00:13:24,960 Speaker 1: that you think are particularly important and try to 219 00:13:25,040 --> 00:13:31,120 Speaker 1: affect voting that way. In fact, critics say that because 220 00:13:31,240 --> 00:13:35,239 Speaker 1: the return on investment is so high, it's an incredibly 221 00:13:35,280 --> 00:13:38,440 Speaker 1: tempting target for hackers. Like, you might say, yeah, it's 222 00:13:38,480 --> 00:13:40,960 Speaker 1: a lot of work, but look at the outcome. The 223 00:13:41,000 --> 00:13:44,560 Speaker 1: outcome is affecting a national election. It's 224 00:13:44,600 --> 00:13:47,160 Speaker 1: hard to have a bigger set of stakes than that. 225 00:13:48,120 --> 00:13:52,240 Speaker 1: So you can say, like, yeah, it's hard, but 226 00:13:52,240 --> 00:13:56,120 Speaker 1: the goal is so huge that it justifies the hard work 227 00:13:56,440 --> 00:13:59,040 Speaker 1: on the part of the hackers. Ben and I have 228 00:13:59,080 --> 00:14:01,920 Speaker 1: a little bit more to say about the scary world 229 00:14:02,120 --> 00:14:05,720 Speaker 1: of EVMs, but first let's take a quick break. 230 00:14:14,800 --> 00:14:18,560 Speaker 1: Here's the thing. Yeah, you don't have to be, like, 231 00:14:18,920 --> 00:14:22,640 Speaker 1: the Liam Neeson Taken level of hacker.
You don't have 232 00:14:22,720 --> 00:14:27,280 Speaker 1: to be Anonymous or, you know, some secret operations faction. 233 00:14:27,320 --> 00:14:30,359 Speaker 1: You don't have to have a particular set of skills. Yeah, exactly. 234 00:14:30,440 --> 00:14:33,480 Speaker 1: And that was not very bad. I've had 235 00:14:33,520 --> 00:14:37,640 Speaker 1: a little thing stuck in my teeth, that's all. 236 00:14:37,640 --> 00:14:41,880 Speaker 1: But yeah, you don't have to be some amazing 237 00:14:42,360 --> 00:14:50,000 Speaker 1: savant hacker with these crazy credentials and qualifications. You don't 238 00:14:50,040 --> 00:14:53,160 Speaker 1: even have to be a state-sponsored hacker, like, 239 00:14:53,200 --> 00:14:55,880 Speaker 1: you know, from the Chinese military, right? No, you 240 00:14:55,960 --> 00:15:00,200 Speaker 1: can, if you're talking about systems running on Windows 241 00:15:00,320 --> 00:15:02,520 Speaker 1: XP that haven't been patched. At 242 00:15:02,720 --> 00:15:07,080 Speaker 1: best, they haven't been patched in two years. 243 00:15:07,360 --> 00:15:10,720 Speaker 1: At worst, it's been much longer than that. Uh, and 244 00:15:10,840 --> 00:15:13,440 Speaker 1: if there is a way to access the machines, either 245 00:15:13,560 --> 00:15:16,840 Speaker 1: over the Internet or through physical contact with the machines, 246 00:15:17,560 --> 00:15:21,200 Speaker 1: it's entirely possible for someone to use code developed by 247 00:15:21,360 --> 00:15:26,760 Speaker 1: somebody else and infect that machine. So yeah, so the 248 00:15:26,800 --> 00:15:31,040 Speaker 1: diminutive term for these people, uh, and my friend Shannon 249 00:15:31,080 --> 00:15:35,200 Speaker 1: hates this term, would be script kiddies. Script kiddies, meaning 250 00:15:35,200 --> 00:15:40,280 Speaker 1: a type of person who, uh, profits off of malicious code.
251 00:15:40,280 --> 00:15:42,920 Speaker 1: But they didn't develop the code themselves. They went to 252 00:15:43,040 --> 00:15:47,840 Speaker 1: some place where that code is available. They either 253 00:15:47,920 --> 00:15:50,160 Speaker 1: purchase the code or they download the code for free, 254 00:15:50,200 --> 00:15:53,240 Speaker 1: depending upon the place that they're going to, and then 255 00:15:53,280 --> 00:15:57,000 Speaker 1: they deploy that code. And in some cases the code 256 00:15:57,160 --> 00:15:59,600 Speaker 1: is pretty much automated. It does everything for you. You 257 00:15:59,720 --> 00:16:01,280 Speaker 1: just have to be the one, you just have to 258 00:16:01,280 --> 00:16:05,000 Speaker 1: be able to deploy it somehow, right? And you don't 259 00:16:05,040 --> 00:16:07,800 Speaker 1: have to have any knowledge of how it works. All 260 00:16:07,840 --> 00:16:09,800 Speaker 1: you need to do is just make sure the code 261 00:16:09,800 --> 00:16:11,520 Speaker 1: gets on the machine that you want it to get on. 262 00:16:12,200 --> 00:16:14,040 Speaker 1: Point it in the right direction. Yeah. So, imagine that you've got 263 00:16:14,080 --> 00:16:17,160 Speaker 1: a thumb drive with this code and it's in a file, 264 00:16:17,200 --> 00:16:19,720 Speaker 1: so it will auto-execute once it connects to a 265 00:16:19,880 --> 00:16:22,320 Speaker 1: USB port on a machine. You come up to a 266 00:16:22,760 --> 00:16:25,040 Speaker 1: voting machine that happens to have a USB port, which would be 267 00:16:25,080 --> 00:16:28,200 Speaker 1: a terrible idea. I'm not even suggesting that there are 268 00:16:28,320 --> 00:16:31,320 Speaker 1: electronic voting machines that have USB ports. I don't know 269 00:16:31,400 --> 00:16:33,680 Speaker 1: if there are. If there are, that's a terrible idea.
270 00:16:34,280 --> 00:16:38,000 Speaker 1: You plug in the thumb drive, it automatically executes, it runs the 271 00:16:38,000 --> 00:16:40,680 Speaker 1: malicious code, which becomes part of the voting machine's 272 00:16:41,080 --> 00:16:43,720 Speaker 1: programming from that point forward, and you can no longer 273 00:16:43,760 --> 00:16:49,160 Speaker 1: trust the results it generates. Uh, that is entirely possible, assuming, again, 274 00:16:49,200 --> 00:16:51,960 Speaker 1: you have some way of injecting the code into 275 00:16:52,000 --> 00:16:56,000 Speaker 1: the machine. So if you do have that possibility, that 276 00:16:56,040 --> 00:17:01,479 Speaker 1: would be bad. Um, and also, you don't even necessarily 277 00:17:01,760 --> 00:17:04,760 Speaker 1: have to set out to change votes. Right, like, 278 00:17:04,880 --> 00:17:06,760 Speaker 1: that's the way we mostly think of it. Like, again, 279 00:17:06,840 --> 00:17:09,960 Speaker 1: going back to the Reagan-Carter example, we would think, oh, well, 280 00:17:09,960 --> 00:17:12,320 Speaker 1: they've programmed it so that it took that Reagan vote 281 00:17:12,320 --> 00:17:15,320 Speaker 1: and flipped it to Carter. You could also just try 282 00:17:15,520 --> 00:17:19,520 Speaker 1: and overload the voting machine so that no votes could 283 00:17:19,560 --> 00:17:22,760 Speaker 1: be cast on it at all. You can take it down. Yeah, 284 00:17:22,880 --> 00:17:25,680 Speaker 1: you're just taking down the machine. You just want it 285 00:17:25,680 --> 00:17:29,520 Speaker 1: to crash and be unrecoverable, and then you really 286 00:17:29,560 --> 00:17:33,199 Speaker 1: impact the ability for people to cast votes.
If 287 00:17:33,240 --> 00:17:36,679 Speaker 1: you do that in enough places that have very particular 288 00:17:38,080 --> 00:17:41,920 Speaker 1: political leanings, you can affect the outcome of an election 289 00:17:42,240 --> 00:17:45,200 Speaker 1: simply because the people who wanted to vote were physically 290 00:17:45,280 --> 00:17:49,040 Speaker 1: unable to do so. And unless you make special allowances 291 00:17:49,080 --> 00:17:54,280 Speaker 1: for that, then those votes will never happen. Uh. 292 00:17:54,320 --> 00:17:58,600 Speaker 1: So that's kind of ugly too. Um. And also, finally, 293 00:17:58,760 --> 00:18:02,239 Speaker 1: uh, it turns out that the results are unencrypted, which is 294 00:18:02,440 --> 00:18:07,240 Speaker 1: crazy to me, which, yeah, is bizarre. Yeah, it doesn't 295 00:18:07,280 --> 00:18:10,840 Speaker 1: make any sense. Votes recorded in plain text in a 296 00:18:10,920 --> 00:18:14,960 Speaker 1: file where you haven't even encrypted the results, 297 00:18:15,720 --> 00:18:19,520 Speaker 1: so there's no protection there. Like, if someone does get 298 00:18:19,560 --> 00:18:22,840 Speaker 1: access to that data somehow, no matter, you know, whatever 299 00:18:23,000 --> 00:18:27,639 Speaker 1: methodology they might use, they can make changes. And you 300 00:18:27,680 --> 00:18:31,879 Speaker 1: know this is not just theoretical.
And the state of 301 00:18:31,920 --> 00:18:35,040 Speaker 1: Virginia decertified hundreds of WINVote EVM 302 00:18:35,160 --> 00:18:38,840 Speaker 1: s because they were insecure, and, according to the officials, 303 00:18:39,359 --> 00:18:44,000 Speaker 1: you could change the votes in these WINVote machines 304 00:18:44,600 --> 00:18:48,320 Speaker 1: undetected if you were anywhere within a half-mile radius 305 00:18:48,320 --> 00:18:51,960 Speaker 1: of the machine, even if you weren't, like, a knowledgeable 306 00:18:52,000 --> 00:18:53,520 Speaker 1: hacker. Like, you don't even have to get out 307 00:18:53,520 --> 00:19:01,200 Speaker 1: of the car. It's like Pokemon Go, but for elections. Yeah. Um, 308 00:19:01,240 --> 00:19:04,600 Speaker 1: that's crazy. Now, I should also add that according to 309 00:19:04,640 --> 00:19:07,160 Speaker 1: the Wired article where I pulled that from, uh, they 310 00:19:07,200 --> 00:19:11,520 Speaker 1: never found any evidence that someone had tampered with the votes, 311 00:19:11,840 --> 00:19:14,200 Speaker 1: but they did discover that the vulnerability was there, which 312 00:19:14,280 --> 00:19:16,879 Speaker 1: meant that there was always the possibility someone could tamper 313 00:19:16,920 --> 00:19:20,080 Speaker 1: with the votes, which is why they were decertified. Um. Luckily, 314 00:19:20,600 --> 00:19:24,119 Speaker 1: at least according to the officials, it doesn't look like 315 00:19:24,160 --> 00:19:27,200 Speaker 1: anyone actually managed to do that or took the time 316 00:19:27,280 --> 00:19:31,480 Speaker 1: to do that. But that's a possibility that has 317 00:19:31,520 --> 00:19:34,800 Speaker 1: to be addressed, right? Um. And that kind of brings 318 00:19:34,840 --> 00:19:38,359 Speaker 1: us up to the hacking threat.
So I don't know 319 00:19:38,400 --> 00:19:40,280 Speaker 1: if you guys have been paying attention here in the 320 00:19:40,359 --> 00:19:45,280 Speaker 1: United States. Earlier in twenty sixteen, hackers gained access to the Democratic 321 00:19:45,359 --> 00:19:48,439 Speaker 1: National Committee databases and they stole a whole bunch of 322 00:19:48,440 --> 00:19:52,639 Speaker 1: emails and files, more than twenty thousand of them, and, uh, 323 00:19:52,680 --> 00:19:55,480 Speaker 1: a lot of them were released by WikiLeaks. Around 324 00:19:55,520 --> 00:19:57,879 Speaker 1: twenty thousand were released by WikiLeaks, but that doesn't 325 00:19:57,920 --> 00:20:00,639 Speaker 1: mean that that's all of them. And there were a 326 00:20:00,680 --> 00:20:03,240 Speaker 1: lot of questions, like, who the heck hacked into the 327 00:20:03,320 --> 00:20:06,800 Speaker 1: DNC? Yeah, and we were pointing out a 328 00:20:06,800 --> 00:20:12,480 Speaker 1: specific country, and that country is Russia. Thank you. 329 00:20:12,480 --> 00:20:18,200 Speaker 1: But, okay, yeah, all right. Well, so here's 330 00:20:18,240 --> 00:20:21,639 Speaker 1: one of the things that makes this an interesting story, 331 00:20:21,760 --> 00:20:25,720 Speaker 1: and to some degree the reports of it remind me 332 00:20:25,880 --> 00:20:30,120 Speaker 1: a little bit of the famous Sony hack, right, where 333 00:20:30,119 --> 00:20:33,119 Speaker 1: they were talking about North Korea being the culprit, and you 334 00:20:33,200 --> 00:20:37,359 Speaker 1: and I explored that together as well. So one of 335 00:20:37,400 --> 00:20:42,000 Speaker 1: the big questions is, to what end? What 336 00:20:42,119 --> 00:20:50,000 Speaker 1: value would Russia's state-sponsored hacking branch, which does exist, 337 00:20:50,400 --> 00:20:54,760 Speaker 1: what value would it see in interfering on such 338 00:20:54,880 --> 00:20:59,800 Speaker 1: a minimal level at this point?
Is this a buy 339 00:21:00,000 --> 00:21:03,560 Speaker 1: in for blackmail further down the line? Is this to 340 00:21:04,320 --> 00:21:11,639 Speaker 1: discredit something? Because the DNC is a domestic- 341 00:21:11,720 --> 00:21:16,720 Speaker 1: facing organization, so it's, um, 342 00:21:16,760 --> 00:21:19,680 Speaker 1: it's an entirely different animal in comparison to something like 343 00:21:20,080 --> 00:21:24,440 Speaker 1: a State Department server, which would have stuff about Russia. Yeah, yeah, 344 00:21:24,440 --> 00:21:28,680 Speaker 1: any of those things, like the CIA, you 345 00:21:28,760 --> 00:21:31,240 Speaker 1: know, that would be another, or the NSA. Right, 346 00:21:31,280 --> 00:21:33,399 Speaker 1: like the targets that you would think of if you 347 00:21:33,400 --> 00:21:36,000 Speaker 1: wanted to gather a lot of intelligence. Yeah, and so 348 00:21:36,440 --> 00:21:39,520 Speaker 1: the issue here: initially, I don't know about you, 349 00:21:39,560 --> 00:21:42,640 Speaker 1: but initially I was skeptical, because I thought, all 350 00:21:42,680 --> 00:21:45,879 Speaker 1: this bear poking that's going on now is not gonna... 351 00:21:45,960 --> 00:21:50,760 Speaker 1: it's not... Yeah, yeah, it's 352 00:21:50,800 --> 00:21:54,040 Speaker 1: not gonna pay off. Because there is a reason that 353 00:21:54,160 --> 00:21:57,440 Speaker 1: the Putin government has been in power for as long 354 00:21:57,560 --> 00:22:00,600 Speaker 1: as it has, and it's because, as you know, 355 00:22:01,280 --> 00:22:05,240 Speaker 1: this government has seen American presidents come and go, 356 00:22:05,960 --> 00:22:10,360 Speaker 1: and they have maybe a longer horizon politically, sure, 357 00:22:10,440 --> 00:22:14,919 Speaker 1: when they contemplate these things.
So it's strange to 358 00:22:14,920 --> 00:22:18,480 Speaker 1: see what appears to be harmless rhetoric trotted out against them. However, 359 00:22:19,119 --> 00:22:23,400 Speaker 1: I had to rethink my skepticism on this when more 360 00:22:23,480 --> 00:22:28,120 Speaker 1: than one security firm came out saying evidence indicates it's 361 00:22:28,160 --> 00:22:32,600 Speaker 1: probably these guys. Yeah. As it turns out, a security 362 00:22:32,640 --> 00:22:37,359 Speaker 1: firm, pretty much immediately after the leak was 363 00:22:37,640 --> 00:22:41,600 Speaker 1: made public, said that the signs were pointing towards a 364 00:22:41,720 --> 00:22:47,400 Speaker 1: Russian actor, that the methodologies and software used were 365 00:22:47,440 --> 00:22:50,680 Speaker 1: the same that had been employed by Russian state-sponsored 366 00:22:50,720 --> 00:22:54,680 Speaker 1: hacking groups in the past. And there are some 367 00:22:54,720 --> 00:22:57,359 Speaker 1: folks who said, you know, I don't think so. 368 00:22:57,400 --> 00:23:02,440 Speaker 1: In fact, one hacker, Guccifer 2.0, came 369 00:23:02,480 --> 00:23:04,760 Speaker 1: out and said, no, no, no, it was me. I'm 370 00:23:04,800 --> 00:23:07,200 Speaker 1: a lone wolf. I did it. I was 371 00:23:07,240 --> 00:23:08,879 Speaker 1: the one who stole all that and gave it 372 00:23:08,920 --> 00:23:12,600 Speaker 1: over to WikiLeaks.
Although they also claimed 373 00:23:12,640 --> 00:23:15,720 Speaker 1: that he was Romanian, or rather, the hacker 374 00:23:15,760 --> 00:23:18,760 Speaker 1: claimed that he or she was Romanian. But then when 375 00:23:19,240 --> 00:23:23,760 Speaker 1: people were trying to communicate with him or her in Romanian, 376 00:23:23,840 --> 00:23:27,720 Speaker 1: the responses did not seem particularly coherent, which seems to 377 00:23:27,760 --> 00:23:31,040 Speaker 1: give the lie to the claim that they are in fact Romanian. But 378 00:23:31,720 --> 00:23:35,439 Speaker 1: other security firms ended up corroborating the findings of the 379 00:23:35,480 --> 00:23:38,880 Speaker 1: first one. They said, yeah, the exploits that were used, 380 00:23:38,880 --> 00:23:41,679 Speaker 1: the malware that was used, is identical to the type 381 00:23:41,680 --> 00:23:44,399 Speaker 1: of malware that has been used by this Russian hacking group. 382 00:23:45,000 --> 00:23:48,199 Speaker 1: It is exactly the M.O. that we've seen in 383 00:23:48,320 --> 00:23:52,320 Speaker 1: other cases, including a case that involved the German parliament. 384 00:23:52,320 --> 00:23:59,600 Speaker 1: So either it's again another Russian hacking group, 385 00:23:59,800 --> 00:24:02,800 Speaker 1: or it's someone who learned everything they know from a 386 00:24:02,920 --> 00:24:07,480 Speaker 1: Russian hacking group, because it was identical in nature. Yes. 387 00:24:08,040 --> 00:24:10,560 Speaker 1: So why would Russia want to interfere with the DNC? 388 00:24:10,760 --> 00:24:13,240 Speaker 1: And I think the most succinct way of putting that 389 00:24:13,320 --> 00:24:18,320 Speaker 1: is that Putin doesn't like Clinton. Huh. Kind 390 00:24:18,320 --> 00:24:21,000 Speaker 1: of what it boils down to is that Putin 391 00:24:21,040 --> 00:24:26,600 Speaker 1: would not want Clinton in the White House... well, 392 00:24:26,680 --> 00:24:29,960 Speaker 1: as president.
Yeah, also as an active Secretary of State 393 00:24:30,080 --> 00:24:35,000 Speaker 1: at times when NATO and Russia, as well 394 00:24:35,040 --> 00:24:40,880 Speaker 1: as its allies, were engaged in geopolitical tensions. 395 00:24:40,880 --> 00:24:44,760 Speaker 1: Right now, there's an ongoing proxy war in Syria, 396 00:24:45,440 --> 00:24:48,159 Speaker 1: and it's between Russia and the US. Now, 397 00:24:48,200 --> 00:24:51,520 Speaker 1: of course, it's not being reported that way or marketed 398 00:24:51,560 --> 00:24:57,359 Speaker 1: that way here, unless you're reading any media outside of 399 00:24:57,400 --> 00:25:00,000 Speaker 1: the US or Russia, in which case everyone says Russia 400 00:25:00,000 --> 00:25:02,320 Speaker 1: backs the Syrian government and the US 401 00:25:02,359 --> 00:25:06,639 Speaker 1: backs the rebels, and that goes across the political spectrum. 402 00:25:06,840 --> 00:25:08,760 Speaker 1: I'm glad you pointed that out. So you know, 403 00:25:08,880 --> 00:25:16,080 Speaker 1: very historically conservative papers of note in England, 404 00:25:16,320 --> 00:25:19,400 Speaker 1: like the magazine The Economist, or the much more 405 00:25:19,480 --> 00:25:23,880 Speaker 1: liberal paper The Independent, will both pretty much say what 406 00:25:23,920 --> 00:25:26,560 Speaker 1: it is. It's just we don't talk about it here 407 00:25:26,600 --> 00:25:30,480 Speaker 1: on CNN. There is an antagonistic thing. 408 00:25:30,640 --> 00:25:34,880 Speaker 1: So I would say there's probably resentment or concern about 409 00:25:35,240 --> 00:25:38,160 Speaker 1: having a commander in chief who knows all the State 410 00:25:38,160 --> 00:25:43,480 Speaker 1: Department's skeletons, including the Russian ones.
And then additionally, 411 00:25:45,080 --> 00:25:46,920 Speaker 1: there always has to be a question when there's 412 00:25:47,280 --> 00:25:51,280 Speaker 1: this level of nepotism in American politics, because 413 00:25:51,280 --> 00:25:53,680 Speaker 1: we had the son of a former president become president 414 00:25:53,840 --> 00:25:58,760 Speaker 1: somehow in a meritocracy, and then actually, right, right, multiple times, 415 00:25:58,800 --> 00:26:02,560 Speaker 1: and then having the spouse of a former 416 00:26:02,600 --> 00:26:06,600 Speaker 1: president be in the running for president. What's really crazy 417 00:26:06,680 --> 00:26:08,600 Speaker 1: is if you ever see a family tree of all 418 00:26:08,640 --> 00:26:11,600 Speaker 1: the presidents and see how much, except for that one, 419 00:26:11,680 --> 00:26:14,479 Speaker 1: there's like one guy. Yeah, but everyone is pretty much 420 00:26:14,480 --> 00:26:18,120 Speaker 1: everyone else's cousin at most. And there's an interesting 421 00:26:18,200 --> 00:26:22,160 Speaker 1: mathematical aspect to that, because if you go far 422 00:26:22,280 --> 00:26:25,240 Speaker 1: enough out, the further you go, the more people 423 00:26:25,240 --> 00:26:29,480 Speaker 1: you're related to. Like, you're probably sixteenth 424 00:26:29,640 --> 00:26:34,440 Speaker 1: cousins with someone that it would completely baffle you. Yeah, 425 00:26:35,040 --> 00:26:38,280 Speaker 1: and so that, I mean, that's true, but I could 426 00:26:38,320 --> 00:26:41,840 Speaker 1: see that.
I could see the Russian government having 427 00:26:43,160 --> 00:26:49,800 Speaker 1: that kind of concern. But also, funding-wise 428 00:26:49,840 --> 00:26:52,600 Speaker 1: and infrastructure-wise, they're pretty strapped for cash right now, 429 00:26:53,280 --> 00:26:58,960 Speaker 1: and strapped for manpower as well. So it's a question 430 00:26:59,000 --> 00:27:01,360 Speaker 1: of how quick they are, as well as how 431 00:27:01,440 --> 00:27:04,600 Speaker 1: motivated they are. Right, and let's also point out 432 00:27:04,600 --> 00:27:07,200 Speaker 1: that hacking the DNC is different than hacking 433 00:27:07,240 --> 00:27:10,679 Speaker 1: an election, because, like I said earlier, with the electronic 434 00:27:10,720 --> 00:27:13,239 Speaker 1: voting machines, you have all these different targets, with all 435 00:27:13,240 --> 00:27:17,200 Speaker 1: these different softwares, whereas hacking the DNC means 436 00:27:17,400 --> 00:27:20,760 Speaker 1: focusing on a single target, not looking at a bunch 437 00:27:20,800 --> 00:27:26,000 Speaker 1: of different targets across the United States. So that requires 438 00:27:26,080 --> 00:27:28,679 Speaker 1: less of an investment. I mean, it still, you know, 439 00:27:28,720 --> 00:27:32,560 Speaker 1: obviously requires a great deal of work in order 440 00:27:32,600 --> 00:27:37,159 Speaker 1: to find or create a vulnerability, or to exploit a vulnerability, 441 00:27:37,359 --> 00:27:39,680 Speaker 1: but it's different than having to target a whole bunch of 442 00:27:39,680 --> 00:27:42,000 Speaker 1: different machines.
But it has raised the question, a lot 443 00:27:42,000 --> 00:27:44,960 Speaker 1: of people have asked: well, if they hacked the DNC, 444 00:27:45,560 --> 00:27:49,960 Speaker 1: could Russia also interfere with the actual election of 2016? 445 00:27:50,840 --> 00:27:54,199 Speaker 1: And Wired's Brian Barrett, who wrote one of the 446 00:27:54,240 --> 00:27:57,760 Speaker 1: articles I read for this episode, specifically says he does 447 00:27:57,800 --> 00:28:01,840 Speaker 1: not think so, mostly because if in fact Russia 448 00:28:01,920 --> 00:28:04,959 Speaker 1: wanted to interfere with the election, it would not have 449 00:28:05,160 --> 00:28:11,000 Speaker 1: also targeted the DNC database and hacked into it, because 450 00:28:11,680 --> 00:28:15,520 Speaker 1: once everyone finds out about that, then you are on 451 00:28:15,600 --> 00:28:20,199 Speaker 1: high alert for the election. And so if you 452 00:28:20,240 --> 00:28:22,600 Speaker 1: wanted to hack the election, you would not want people 453 00:28:22,640 --> 00:28:26,119 Speaker 1: to know that you were capable of doing something on 454 00:28:26,160 --> 00:28:29,600 Speaker 1: that scale before you did it. You don't... 455 00:28:29,720 --> 00:28:32,080 Speaker 1: You don't monologue in front of James Bond before 456 00:28:32,080 --> 00:28:35,680 Speaker 1: you push the button. Exactly, exactly. And it's not their 457 00:28:35,800 --> 00:28:38,920 Speaker 1: first time at the geopolitical rodeo. So they've done this. 458 00:28:39,000 --> 00:28:42,240 Speaker 1: Yeah, they're old hands at this, as 459 00:28:42,440 --> 00:28:44,720 Speaker 1: is the United States. I'm not giving the US 460 00:28:44,760 --> 00:28:49,680 Speaker 1: a pass. Yeah, with any major player in geopolitics.
461 00:28:49,720 --> 00:28:52,760 Speaker 1: There's some dark stuff going on, which changes regimes the 462 00:28:52,800 --> 00:28:56,560 Speaker 1: way that people named Chad change cargo pants, right. So 463 00:28:58,600 --> 00:29:01,840 Speaker 1: probably Russia is not going to be involved. And 464 00:29:01,840 --> 00:29:05,240 Speaker 1: plus, again, it would require a much broader 465 00:29:05,280 --> 00:29:09,239 Speaker 1: attack that would require multiple strategies and wherewithal 466 00:29:09,280 --> 00:29:12,560 Speaker 1: to play it out. But the fact that the 467 00:29:12,680 --> 00:29:18,480 Speaker 1: possibility has been raised again creates uncertainty among the US public, 468 00:29:18,720 --> 00:29:24,200 Speaker 1: and that alone is enough to cause a lack of 469 00:29:24,240 --> 00:29:28,560 Speaker 1: confidence in results. Right, we've already seen, I mean, Trump 470 00:29:28,600 --> 00:29:31,120 Speaker 1: has even said, like, I expect there's gonna be a 471 00:29:31,120 --> 00:29:34,360 Speaker 1: lot of tampering involved in this election, 472 00:29:34,680 --> 00:29:37,880 Speaker 1: already bringing into question the results, which haven't happened yet. 473 00:29:38,560 --> 00:29:41,360 Speaker 1: It's like he's already said we can't 474 00:29:41,400 --> 00:29:44,760 Speaker 1: trust what the results will say, and we haven't even 475 00:29:44,800 --> 00:29:48,000 Speaker 1: gone into the voting booths. We're just about to wrap 476 00:29:48,080 --> 00:29:50,920 Speaker 1: up this discussion about electronic voting machines. But before we 477 00:29:50,960 --> 00:30:00,840 Speaker 1: get to that, let's take another quick break. What do 478 00:30:00,920 --> 00:30:04,480 Speaker 1: we think? How do we wrap this all up?
Well, 479 00:30:05,280 --> 00:30:08,640 Speaker 1: first we should remember, and while here in Georgia it's 480 00:30:08,680 --> 00:30:12,000 Speaker 1: easy for us to forget, we're in the minority: 481 00:30:12,080 --> 00:30:15,440 Speaker 1: not everyone is using an electronic voting machine to cast 482 00:30:15,480 --> 00:30:19,040 Speaker 1: their vote. In fact, according to Pamela Smith 483 00:30:19,200 --> 00:30:23,320 Speaker 1: of Verified Voting, most of the nation will vote on some 484 00:30:23,400 --> 00:30:25,960 Speaker 1: form of paper ballot in the two thousand sixteen election, 485 00:30:26,080 --> 00:30:30,640 Speaker 1: with only a minority of us using other methods. So there's not some maniacal 486 00:30:30,960 --> 00:30:36,480 Speaker 1: corporate supervillain who's out to rig the election across 487 00:30:36,560 --> 00:30:40,720 Speaker 1: the nation for the Clinton campaign, the Trump campaign, or 488 00:30:40,800 --> 00:30:44,600 Speaker 1: Gary Busey. Right, right. Gary Busey can try and infect 489 00:30:44,600 --> 00:30:47,080 Speaker 1: as many computers as he wants, but as it turns out, 490 00:30:47,240 --> 00:30:51,040 Speaker 1: unless he's figured out a way to replace all the choices 491 00:30:51,040 --> 00:30:54,000 Speaker 1: on a paper ballot with Gary Busey, which would be amazing, 492 00:30:55,280 --> 00:30:57,960 Speaker 1: I kind of want to see an official ballot that 493 00:30:58,080 --> 00:31:01,400 Speaker 1: just has Gary Busey as all the different options, like, 494 00:31:01,480 --> 00:31:07,200 Speaker 1: including the resolutions. Exactly. Unless he's able 495 00:31:07,240 --> 00:31:09,640 Speaker 1: to do that, then it's not going to have that 496 00:31:09,840 --> 00:31:12,640 Speaker 1: big of a scale of impact.
So this is 497 00:31:12,680 --> 00:31:15,840 Speaker 1: also some of that fear, uncertainty, and doubt, right? Like, 498 00:31:17,360 --> 00:31:20,280 Speaker 1: on the one hand, yes, these systems are not as 499 00:31:20,280 --> 00:31:23,440 Speaker 1: secure as they should be. They aren't encrypting data, which 500 00:31:23,480 --> 00:31:26,880 Speaker 1: they should be, or at least not all of them are. 501 00:31:26,960 --> 00:31:30,560 Speaker 1: Not all of them have paper trails. Even with the 502 00:31:30,560 --> 00:31:33,040 Speaker 1: ones that do have paper trails, the state 503 00:31:33,320 --> 00:31:37,120 Speaker 1: may not require a post-election audit, which it absolutely 504 00:31:37,360 --> 00:31:39,880 Speaker 1: should. Also, the paper trails may not be voter- 505 00:31:39,960 --> 00:31:43,720 Speaker 1: verifiable, which they absolutely should be, so that we 506 00:31:43,800 --> 00:31:46,600 Speaker 1: can be... well, so that we can have confidence that 507 00:31:46,680 --> 00:31:50,960 Speaker 1: the results announced are in fact reflective of the actual 508 00:31:51,040 --> 00:31:54,880 Speaker 1: choices that people made. Whether it's in favor of your 509 00:31:54,960 --> 00:31:59,280 Speaker 1: candidate or against your candidate, you want to know that 510 00:31:59,360 --> 00:32:03,280 Speaker 1: the results that happened were real and not reflective of 511 00:32:03,360 --> 00:32:08,920 Speaker 1: someone else's Machiavellian plan to put a specific person 512 00:32:08,920 --> 00:32:11,600 Speaker 1: in power, or a specific group in power, or a specific set 513 00:32:11,600 --> 00:32:15,680 Speaker 1: of laws into play. You want to feel like the 514 00:32:15,720 --> 00:32:18,080 Speaker 1: process works, and in order to do that, you need 515 00:32:18,200 --> 00:32:20,720 Speaker 1: this voter-verifiable trail. You need to have these post- 516 00:32:20,760 --> 00:32:25,640 Speaker 1: election audits.
And not everywhere has that ability or option. 517 00:32:26,120 --> 00:32:30,160 Speaker 1: So that's one thing. Another thing to remember 518 00:32:31,480 --> 00:32:36,840 Speaker 1: is that it's hard to get these systems up 519 00:32:36,920 --> 00:32:39,719 Speaker 1: to date and at that level where we could be 520 00:32:39,760 --> 00:32:43,800 Speaker 1: confident in their results, because, again, it costs money, and 521 00:32:43,880 --> 00:32:50,360 Speaker 1: it's politically difficult to convince groups to spend that money 522 00:32:50,400 --> 00:32:54,800 Speaker 1: specifically to upgrade election equipment, keeping in mind that this 523 00:32:54,840 --> 00:32:56,600 Speaker 1: is stuff that's used only a couple of times a 524 00:32:56,680 --> 00:33:00,160 Speaker 1: year at most. Right? Like, if 525 00:33:00,160 --> 00:33:02,520 Speaker 1: you are very active in your local community, you may 526 00:33:02,560 --> 00:33:05,200 Speaker 1: be voting maybe a couple of times per year, depending 527 00:33:05,280 --> 00:33:10,320 Speaker 1: upon how things are run in your area. Some 528 00:33:10,360 --> 00:33:13,240 Speaker 1: people only come out to vote for the presidential elections; 529 00:33:13,240 --> 00:33:15,320 Speaker 1: they don't vote in any of the other ones. And 530 00:33:15,360 --> 00:33:18,880 Speaker 1: so it's hard to justify, like, you're gonna spend X 531 00:33:18,960 --> 00:33:22,040 Speaker 1: million dollars this year for equipment that you're gonna be 532 00:33:22,080 --> 00:33:25,120 Speaker 1: able to use in maybe two elections, 533 00:33:25,160 --> 00:33:28,680 Speaker 1: two general elections, before you need to upgrade 534 00:33:28,720 --> 00:33:31,560 Speaker 1: the equipment again. And there's a great argument about open- 535 00:33:31,640 --> 00:33:38,480 Speaker 1: sourcing the software. Exactly.
So I mean, I think I would 536 00:33:38,480 --> 00:33:43,680 Speaker 1: love to see more transparency, both on that open-source approach, 537 00:33:43,720 --> 00:33:46,239 Speaker 1: so people can make sure there's no inherent bias in 538 00:33:46,280 --> 00:33:51,160 Speaker 1: the code, again, conscious or not, because there are 539 00:33:51,200 --> 00:33:53,480 Speaker 1: times where people can insert a bias without even 540 00:33:53,480 --> 00:33:55,920 Speaker 1: thinking they are, right? Like, that just happens because 541 00:33:55,920 --> 00:33:58,440 Speaker 1: we're human. You don't have to be making a conscious 542 00:33:58,440 --> 00:34:01,560 Speaker 1: decision to be a jerk to create one. Yeah. And there's 543 00:34:01,600 --> 00:34:06,080 Speaker 1: also a perception problem that comes out post-event, 544 00:34:06,600 --> 00:34:11,120 Speaker 1: which is, you know, humans are intensely tribalistic. We're 545 00:34:11,160 --> 00:34:16,680 Speaker 1: not particularly fact-based creatures, unfortunately, because 546 00:34:16,880 --> 00:34:21,759 Speaker 1: we evolved to live off of perceptions. So what 547 00:34:22,000 --> 00:34:25,680 Speaker 1: inevitably happens is, no matter what side of the political 548 00:34:25,719 --> 00:34:29,960 Speaker 1: aisle you find yourself kicking it on, the people who 549 00:34:30,440 --> 00:34:34,000 Speaker 1: feel that their candidate has won, or when a candidate wins, 550 00:34:34,040 --> 00:34:39,040 Speaker 1: the supporters will say the system works. These people who 551 00:34:39,160 --> 00:34:44,320 Speaker 1: object to my candidate are just... yeah, just sour grapes, 552 00:34:44,480 --> 00:34:47,600 Speaker 1: these nincompoops. They don't understand.
And of course 553 00:34:47,719 --> 00:34:53,279 Speaker 1: I would be a dignified, 554 00:34:53,320 --> 00:34:57,359 Speaker 1: dissenting voice if I, you know, if I had picked 555 00:34:57,440 --> 00:35:00,040 Speaker 1: obviously the best candidate, and 556 00:35:00,160 --> 00:35:02,680 Speaker 1: the other one had won, which they didn't, but if they had, 557 00:35:02,760 --> 00:35:05,719 Speaker 1: then I would have been a gracious loser. However, in 558 00:35:05,800 --> 00:35:09,400 Speaker 1: the alternative universe, Schrödinger's-cat this 559 00:35:09,520 --> 00:35:13,400 Speaker 1: for a second: what would happen if the same 560 00:35:13,480 --> 00:35:18,840 Speaker 1: people lost? They would say, well, the system's rigged. It 561 00:35:18,920 --> 00:35:21,480 Speaker 1: was rigged from the beginning. It was rigged because the media, 562 00:35:21,560 --> 00:35:23,960 Speaker 1: the media reported things in such a way as to 563 00:35:24,960 --> 00:35:30,440 Speaker 1: minimize the opponent's flaws and maximize my candidate's flaws, 564 00:35:30,760 --> 00:35:33,279 Speaker 1: which didn't even really exist. They manufactured flaws and then 565 00:35:33,320 --> 00:35:37,000 Speaker 1: they maximized them, etcetera, etcetera. And this is 566 00:35:37,040 --> 00:35:38,920 Speaker 1: like a natural sort of thing that happens. We're not 567 00:35:38,960 --> 00:35:42,200 Speaker 1: trying to vilify anyone in particular or any political party. 568 00:35:42,360 --> 00:35:43,960 Speaker 1: To be honest, we've already seen it here in the 569 00:35:44,080 --> 00:35:47,960 Speaker 1: US this year, just through the primaries. Primaries happened, and you 570 00:35:48,000 --> 00:35:54,879 Speaker 1: saw people who were, like, progressives who were supporting Bernie Sanders 571 00:35:55,239 --> 00:36:00,680 Speaker 1: refusing to believe that Clinton had won the primary.
Now 572 00:36:00,840 --> 00:36:05,560 Speaker 1: there's another, uh, there's a whole other situation there. However, yes, 573 00:36:06,360 --> 00:36:11,200 Speaker 1: where you had the chairwoman of the Democratic National Committee, 574 00:36:11,719 --> 00:36:17,920 Speaker 1: who, in some of those leaked messages that WikiLeaks released, 575 00:36:18,000 --> 00:36:22,240 Speaker 1: it was clear that she personally 576 00:36:22,320 --> 00:36:26,600 Speaker 1: favored Clinton over Sanders. Whether that actually ever had any 577 00:36:27,000 --> 00:36:31,279 Speaker 1: effect beyond her personal beliefs is less clear. In fact, 578 00:36:31,680 --> 00:36:34,600 Speaker 1: the evidence doesn't really support that there was, like, 579 00:36:34,760 --> 00:36:41,120 Speaker 1: a concentrated effort to diminish Sanders to a point where 580 00:36:41,120 --> 00:36:43,719 Speaker 1: he would not be allowed to be 581 00:36:43,760 --> 00:36:46,400 Speaker 1: a candidate. That doesn't mean that it didn't happen. 582 00:36:46,440 --> 00:36:48,799 Speaker 1: It just means that there wasn't, like, a smoking gun 583 00:36:48,840 --> 00:36:51,400 Speaker 1: in that WikiLeaks release. Yeah. But again, just like 584 00:36:51,440 --> 00:36:55,520 Speaker 1: a PR company would see it, Debbie Wasserman Schultz, who was 585 00:36:55,560 --> 00:36:57,719 Speaker 1: the head of the 586 00:36:57,719 --> 00:37:02,160 Speaker 1: committee at the time, outright and vociferously supported 587 00:37:02,320 --> 00:37:09,200 Speaker 1: the Clinton campaign. So whether or not there was any shenaniganry, yeah, 588 00:37:09,200 --> 00:37:12,319 Speaker 1: I just barely stumbled through that word.
Whether or not 589 00:37:12,440 --> 00:37:15,560 Speaker 1: there was, the fact of the matter is that it 590 00:37:16,520 --> 00:37:21,359 Speaker 1: increases the distrust and fear and uncertainty you mentioned earlier, right? 591 00:37:21,440 --> 00:37:25,280 Speaker 1: So you get this lack of confidence in the system. 592 00:37:25,320 --> 00:37:27,279 Speaker 1: And obviously that's not the direction we want to move in. 593 00:37:27,360 --> 00:37:30,239 Speaker 1: We want to move in a direction where everyone is 594 00:37:30,560 --> 00:37:34,040 Speaker 1: at least confident that their voice is being heard. It may 595 00:37:34,080 --> 00:37:37,239 Speaker 1: turn out that their voice isn't powerful enough to 596 00:37:37,320 --> 00:37:41,040 Speaker 1: make the change they want, which is frustrating obviously, but 597 00:37:41,200 --> 00:37:43,880 Speaker 1: less so than feeling like you are being ignored or 598 00:37:43,920 --> 00:37:48,359 Speaker 1: purposefully silenced. Right, you don't want that at all. But 599 00:37:48,480 --> 00:37:53,120 Speaker 1: is direct democracy really an answer? Because if 600 00:37:53,160 --> 00:37:57,520 Speaker 1: we define direct democracy as, well, let's see, for example, 601 00:37:58,440 --> 00:38:03,080 Speaker 1: one of the best arguments against direct democracy: alright, 4chan, okay, 602 00:38:03,440 --> 00:38:06,920 Speaker 1: anyone with an Internet connection can go on 603 00:38:06,960 --> 00:38:09,840 Speaker 1: there and say whatever they want. And now we're starting 604 00:38:09,840 --> 00:38:13,439 Speaker 1: to see the priorities of a mass of people without some 605 00:38:13,560 --> 00:38:19,839 Speaker 1: kind of structure to their interaction.
Well, I'm 606 00:38:19,880 --> 00:38:24,400 Speaker 1: really in favor of approaching this from a plurality standpoint, 607 00:38:24,440 --> 00:38:27,920 Speaker 1: where, at least in most areas of government, you 608 00:38:28,000 --> 00:38:33,360 Speaker 1: make sure that the representatives of government reflect the levels 609 00:38:33,440 --> 00:38:37,080 Speaker 1: of support in the general population. So instead of 610 00:38:37,080 --> 00:38:43,360 Speaker 1: it being like every single position is determined by a majority, 611 00:38:43,600 --> 00:38:46,000 Speaker 1: if you have a plurality, then you say, all right, 612 00:38:46,040 --> 00:38:50,480 Speaker 1: well, X percentage of Congress must be made up of 613 00:38:50,520 --> 00:38:55,720 Speaker 1: this party, because that reflects the American voters. 614 00:38:56,280 --> 00:38:59,040 Speaker 1: And then this Y percentage must be of this other 615 00:38:59,080 --> 00:39:02,600 Speaker 1: party, because that represents... and then these independents also won 616 00:39:02,719 --> 00:39:05,120 Speaker 1: this amount, so they should take up these seats. But 617 00:39:05,160 --> 00:39:07,719 Speaker 1: that would be a totally different approach to government than 618 00:39:07,760 --> 00:39:09,839 Speaker 1: the way the United States is structured, and it could 619 00:39:09,880 --> 00:39:14,000 Speaker 1: get complicated. Who, from where, from what? But this is 620 00:39:14,040 --> 00:39:18,520 Speaker 1: already complicated. Could I plug some stuff real quickly? Alright. So, 621 00:39:19,920 --> 00:39:21,759 Speaker 1: first off, thank you so much for having me on. 622 00:39:21,840 --> 00:39:25,080 Speaker 1: And ladies and gentlemen of the jury, I almost 623 00:39:25,120 --> 00:39:28,640 Speaker 1: said of the podcast jury, thank you 624 00:39:28,719 --> 00:39:32,320 Speaker 1: so much for checking out this episode.
If you're interested 625 00:39:32,520 --> 00:39:35,840 Speaker 1: in this process, um, Jonathan and I have 626 00:39:36,040 --> 00:39:41,120 Speaker 1: looked at some other tech-related conspiracy theories, 627 00:39:41,160 --> 00:39:46,320 Speaker 1: including that Sony hack, and we've been on several episodes together. 628 00:39:46,600 --> 00:39:49,279 Speaker 1: So you can find all of those in either one 629 00:39:49,320 --> 00:39:55,520 Speaker 1: of our various internet presences. There are videos or 630 00:39:55,680 --> 00:39:59,680 Speaker 1: podcasts, or, you know, sometimes the mad scribblings on a 631 00:39:59,719 --> 00:40:05,160 Speaker 1: blog somewhere. Yes, yes, the whispers 632 00:40:05,200 --> 00:40:08,560 Speaker 1: of a person passing you by on a moonless 633 00:40:08,640 --> 00:40:11,359 Speaker 1: night at the crossroads. Occasionally you'll just be in the 634 00:40:11,360 --> 00:40:14,440 Speaker 1: woods and you'll hear the sound of a baby laughing, 635 00:40:14,480 --> 00:40:17,760 Speaker 1: and if you listen very carefully, just before the Blair 636 00:40:17,800 --> 00:40:24,239 Speaker 1: Witch gets you, you'll hear... at any rate. Right, yes, 637 00:40:24,280 --> 00:40:27,799 Speaker 1: that's how we met. So there 638 00:40:27,800 --> 00:40:29,560 Speaker 1: are another couple of things that I think would be 639 00:40:29,600 --> 00:40:31,799 Speaker 1: worth your time to check out if you're interested in 640 00:40:31,840 --> 00:40:35,279 Speaker 1: this subject as it specifically applies to the US 641 00:40:35,360 --> 00:40:38,719 Speaker 1: voting system, and Stuff They Don't Want You to Know has 642 00:40:38,760 --> 00:40:41,000 Speaker 1: a couple of videos about this.
We have one about 643 00:40:41,000 --> 00:40:43,920 Speaker 1: the Federal Election Commission, one that's a grab bag of 644 00:40:43,960 --> 00:40:47,640 Speaker 1: political conspiracies for 2016, and then one thing called Five 645 00:40:47,680 --> 00:40:51,000 Speaker 1: Things You Should Know About Primary Elections. So do check 646 00:40:51,000 --> 00:40:53,640 Speaker 1: those out. They are free on the internet. Well, I 647 00:40:53,640 --> 00:40:55,719 Speaker 1: mean they're free, you know, assuming that you have an 648 00:40:55,719 --> 00:40:59,320 Speaker 1: Internet connection, right? Yeah, the content is free. The manner 649 00:40:59,480 --> 00:41:04,080 Speaker 1: of transmission may be somewhat less so, depending 650 00:41:04,120 --> 00:41:06,200 Speaker 1: on whether or not you're at the local library. And 651 00:41:06,239 --> 00:41:09,520 Speaker 1: now, I wonder, and this is maybe a forward-thinking question, because 652 00:41:09,880 --> 00:41:13,440 Speaker 1: you know Jonathan has another show called Forward Thinking: the 653 00:41:13,480 --> 00:41:15,279 Speaker 1: future of voting. So go ahead and hit me. Yeah, 654 00:41:15,280 --> 00:41:18,920 Speaker 1: it's a future-facing show about the 655 00:41:18,960 --> 00:41:23,240 Speaker 1: evolution of tech. Did you touch on anything 656 00:41:23,280 --> 00:41:28,640 Speaker 1: about machine consciousness or artificial intelligence and voting? A couple 657 00:41:28,640 --> 00:41:31,239 Speaker 1: of times, actually. We did an episode specifically about what 658 00:41:31,280 --> 00:41:33,480 Speaker 1: it would be like to have a robot as president. 659 00:41:33,480 --> 00:41:35,719 Speaker 1: Would you ever want a future where a robot could 660 00:41:35,719 --> 00:41:38,560 Speaker 1: be president?
The idea being that if you had 661 00:41:38,719 --> 00:41:43,520 Speaker 1: a truly impartial, artificially intelligent creature that would be able 662 00:41:43,560 --> 00:41:48,040 Speaker 1: to make choices that are of the greatest benefit and the 663 00:41:48,120 --> 00:41:52,560 Speaker 1: least detriment to the population, would you want to do that? 664 00:41:52,760 --> 00:41:55,759 Speaker 1: Or do you think humanity is absolutely essential in order 665 00:41:55,800 --> 00:41:58,120 Speaker 1: for you to have a leader? We did an episode 666 00:41:58,120 --> 00:42:00,920 Speaker 1: about that. We also did an episode about the future 667 00:42:00,920 --> 00:42:04,319 Speaker 1: of voting and some of the pitfalls, including things like 668 00:42:04,360 --> 00:42:08,120 Speaker 1: machine intelligence, uh, in the future. That one was not 669 00:42:08,200 --> 00:42:11,279 Speaker 1: too long ago. That published probably about a 670 00:42:11,360 --> 00:42:14,839 Speaker 1: month or two ago, maybe. So, yeah, we've got some 671 00:42:14,880 --> 00:42:19,719 Speaker 1: stuff that we've talked about. I personally think that there 672 00:42:19,719 --> 00:42:23,200 Speaker 1: are a lot of problems we need to solve in 673 00:42:23,239 --> 00:42:27,800 Speaker 1: the immediate future, uh, some of which would 674 00:42:27,840 --> 00:42:32,279 Speaker 1: avoid the machine learning and artificial intelligence question for quite some time, 675 00:42:32,360 --> 00:42:35,560 Speaker 1: because, um, there have been people who have asked, well, 676 00:42:35,560 --> 00:42:39,239 Speaker 1: what about the possibility of voting via the internet? The 677 00:42:39,280 --> 00:42:42,200 Speaker 1: idea being that if you could vote via the internet, 678 00:42:42,280 --> 00:42:46,759 Speaker 1: you could drive up voter participation dramatically.
You see, the 679 00:42:46,880 --> 00:42:52,520 Speaker 1: more you reduce the investment required to participate, the more 680 00:42:52,560 --> 00:42:56,160 Speaker 1: people will participate. That's the idea, right? So if you 681 00:42:56,200 --> 00:42:58,560 Speaker 1: make it less of a chore to vote, more people 682 00:42:58,560 --> 00:43:01,800 Speaker 1: will vote. Because that's one of those ongoing narratives 683 00:43:01,880 --> 00:43:04,520 Speaker 1: in the United States, right? That such a small, relatively 684 00:43:04,520 --> 00:43:07,520 Speaker 1: small percentage of our population actually participates in the process, 685 00:43:08,160 --> 00:43:13,680 Speaker 1: particularly if it's not a presidential election. Yeah. I think, though, 686 00:43:13,719 --> 00:43:16,319 Speaker 1: I think that people should, I know this is controversial, 687 00:43:16,320 --> 00:43:18,680 Speaker 1: but I think people should be required to vote. I 688 00:43:18,719 --> 00:43:22,359 Speaker 1: think it should be part of the stuff you have 689 00:43:22,480 --> 00:43:25,680 Speaker 1: to do, and there's not, there's not very much stuff 690 00:43:25,719 --> 00:43:28,600 Speaker 1: you have to do, uh, if you were born here, 691 00:43:28,640 --> 00:43:32,680 Speaker 1: to be a citizen. Versus, there's actually way more stuff 692 00:43:32,719 --> 00:43:36,799 Speaker 1: that you aren't supposed to do. Yeah, there's a much 693 00:43:36,840 --> 00:43:40,359 Speaker 1: bigger list of do-nots. But I think, while 694 00:43:40,400 --> 00:43:45,680 Speaker 1: it's controversial, it's a good system to consider, because then 695 00:43:45,800 --> 00:43:49,200 Speaker 1: we will see all of the people who for one 696 00:43:49,239 --> 00:43:51,399 Speaker 1: reason or another were not able to get time off 697 00:43:51,440 --> 00:43:54,600 Speaker 1: to vote, were not able to get... which is, which is, 698 00:43:54,640 --> 00:43:56,880 Speaker 1: by the way, illegal.
You are supposed to... your 699 00:43:56,920 --> 00:43:59,040 Speaker 1: employer is supposed to allow you the time to vote. 700 00:43:59,120 --> 00:44:01,600 Speaker 1: That's... it's illegal. In theory. It's illegal in theory; 701 00:44:01,640 --> 00:44:05,600 Speaker 1: in practice it is much different, right? Yeah, I mean, again, 702 00:44:06,200 --> 00:44:08,600 Speaker 1: just like we were talking about with the electronic voting machines: 703 00:44:08,760 --> 00:44:11,719 Speaker 1: ideally it works one way. In reality it may work 704 00:44:11,760 --> 00:44:15,880 Speaker 1: a totally different way. There are countries that have compulsory voting. 705 00:44:16,000 --> 00:44:19,160 Speaker 1: Because often when you see the United States' voting numbers, 706 00:44:19,239 --> 00:44:22,440 Speaker 1: the percentage of people participating, it will be compared against 707 00:44:22,480 --> 00:44:25,320 Speaker 1: other nations, and they'll say, look how terrible the US turnout 708 00:44:25,400 --> 00:44:27,839 Speaker 1: is compared to these countries. And then you start looking, like, yeah, 709 00:44:27,840 --> 00:44:31,640 Speaker 1: but four of those countries you listed require everyone to vote. 710 00:44:32,080 --> 00:44:34,239 Speaker 1: So therefore, first of all, none of them are 711 00:44:34,280 --> 00:44:38,560 Speaker 1: at a hundred percent, so someone's slacking. And secondly, that's not 712 00:44:38,680 --> 00:44:41,440 Speaker 1: fair, to hold up a country that doesn't have 713 00:44:41,480 --> 00:44:45,759 Speaker 1: compulsory voting against ones that do.
Um, I don't know 714 00:44:45,800 --> 00:44:47,920 Speaker 1: that I would ever go with compulsory. I kind of, I 715 00:44:48,000 --> 00:44:51,200 Speaker 1: kind of feel you. I mean, I kind of, I 716 00:44:51,239 --> 00:44:53,759 Speaker 1: want, I want to see more people involved in it, 717 00:44:53,960 --> 00:44:57,480 Speaker 1: whether it's... but I also don't know the reasons behind 718 00:44:58,440 --> 00:45:04,160 Speaker 1: people not voting, right? If they're not voting because there 719 00:45:04,280 --> 00:45:07,960 Speaker 1: is, uh, a hardship on them in order to 720 00:45:08,040 --> 00:45:11,640 Speaker 1: participate in the system, it's not like I can blame them, right? 721 00:45:11,680 --> 00:45:14,719 Speaker 1: If there's some form of hardship, whether it's economic or 722 00:45:15,000 --> 00:45:17,080 Speaker 1: it's just, you know, practical, like how do I get 723 00:45:17,120 --> 00:45:20,920 Speaker 1: to the voting station, whatever it may be, I have 724 00:45:20,960 --> 00:45:22,879 Speaker 1: a lot of sympathy for those people. And I even 725 00:45:22,880 --> 00:45:26,640 Speaker 1: have sympathy for people who have lost confidence in the 726 00:45:26,719 --> 00:45:28,839 Speaker 1: system, and the reason they don't vote is because they 727 00:45:28,880 --> 00:45:33,680 Speaker 1: feel like their vote doesn't really have, um, an impact. 728 00:45:33,800 --> 00:45:38,120 Speaker 1: And I can certainly see, especially based upon the rhetoric 729 00:45:38,480 --> 00:45:41,880 Speaker 1: that you tend to be subjected to in the election seasons, 730 00:45:42,880 --> 00:45:47,040 Speaker 1: how you can get disillusioned and sad, and therefore you're like, 731 00:45:47,080 --> 00:45:52,920 Speaker 1: I kind of just want to disappear until this is over. Um.
732 00:45:52,960 --> 00:45:57,400 Speaker 1: Although, this... every time there's an election, it's really important, 733 00:45:58,000 --> 00:46:01,080 Speaker 1: very, very important, including at the local level, maybe especially 734 00:46:01,120 --> 00:46:03,840 Speaker 1: at the local level. Um, here in the US, we 735 00:46:03,920 --> 00:46:07,120 Speaker 1: give a lot of attention to the presidential elections. Truth 736 00:46:07,160 --> 00:46:10,080 Speaker 1: be known, your local elections are going to have 737 00:46:10,160 --> 00:46:12,520 Speaker 1: a much larger impact on your day-to-day life 738 00:46:12,560 --> 00:46:17,920 Speaker 1: than your national ones. Um, but obviously that's not the 739 00:46:17,920 --> 00:46:20,960 Speaker 1: big story. You're looking at who's, who's sitting on 740 00:46:21,000 --> 00:46:24,320 Speaker 1: the, on the throne... I'm sorry, in the Oval Office. 741 00:46:25,320 --> 00:46:27,160 Speaker 1: I'm the royalist, so I have a little bit of 742 00:46:27,200 --> 00:46:31,000 Speaker 1: a different view on things. Um, I'm not really a royalist. 743 00:46:32,000 --> 00:46:34,960 Speaker 1: Well, that was a great discussion about it. 744 00:46:34,960 --> 00:46:38,200 Speaker 1: Obviously we got a little philosophical, and, um, you know, 745 00:46:38,320 --> 00:46:43,000 Speaker 1: I don't want anyone to base their political opinion on 746 00:46:43,080 --> 00:46:45,200 Speaker 1: anything I say. I think it's very important for you 747 00:46:45,239 --> 00:46:47,920 Speaker 1: to form your own. Even if I fully disagree with it, 748 00:46:47,960 --> 00:46:51,400 Speaker 1: I respect the fact that you guys have a political opinion. 749 00:46:51,840 --> 00:46:54,799 Speaker 1: So I'm not trying to sway anyone to my side. Um... 750 00:46:55,400 --> 00:46:58,000 Speaker 1: which is good, because your side is crazy. My side 751 00:46:58,200 --> 00:47:03,600 Speaker 1: votes Reagan, uh, in this election.
Yeah, he's, he's, he's 752 00:47:03,640 --> 00:47:06,920 Speaker 1: passed on, and yet I write him in. Also, by 753 00:47:06,920 --> 00:47:08,279 Speaker 1: the way, you may not be able to write in 754 00:47:09,239 --> 00:47:13,160 Speaker 1: a presidential candidate. Not all states allow it. Yeah, yeah, 755 00:47:13,200 --> 00:47:15,800 Speaker 1: that's a bummer. I think it's like forty-seven states 756 00:47:15,840 --> 00:47:18,320 Speaker 1: allow it, so there's like three that do not, something 757 00:47:18,360 --> 00:47:20,960 Speaker 1: like that. You know what probably happened? Somebody was screwing 758 00:47:20,960 --> 00:47:23,359 Speaker 1: around too much with what they thought was a great bit, 759 00:47:23,760 --> 00:47:27,760 Speaker 1: and they were like, guys, we cannot make, we cannot 760 00:47:27,800 --> 00:47:31,919 Speaker 1: make the Pink Panther the state senator. Or Bill the Cat. 761 00:47:32,120 --> 00:47:35,440 Speaker 1: Or Bill the Cat! Bloom County has run his presidential 762 00:47:35,480 --> 00:47:39,520 Speaker 1: candidacy quite a few times, right? Yeah. So, uh, maybe 763 00:47:39,560 --> 00:47:41,719 Speaker 1: it's just a joke gone wrong. But if you're in 764 00:47:41,719 --> 00:47:44,239 Speaker 1: one of the vast majority of states and you want 765 00:47:44,239 --> 00:47:49,680 Speaker 1: to write a vote in, then exercise your civic right as 766 00:47:49,719 --> 00:47:52,480 Speaker 1: a citizen of the United States. Some would argue your 767 00:47:52,520 --> 00:47:57,840 Speaker 1: moral responsibility... depends on your perspective on the candidates, 768 00:47:57,840 --> 00:48:00,680 Speaker 1: I suppose. But, at any rate... there we go, 769 00:48:00,760 --> 00:48:02,680 Speaker 1: see, "at any rate." That's how you know it's me 770 00:48:02,960 --> 00:48:06,120 Speaker 1: when you hear that voice in the woods. Uh, thank 771 00:48:06,160 --> 00:48:08,919 Speaker 1: you again, Ben, for being on the show.
And that 772 00:48:09,000 --> 00:48:12,799 Speaker 1: wraps up this pair of classic episodes. Thank you so 773 00:48:12,880 --> 00:48:15,439 Speaker 1: much for listening. I think these are very important things 774 00:48:15,440 --> 00:48:17,759 Speaker 1: for us to be aware of, things that we need 775 00:48:17,760 --> 00:48:20,959 Speaker 1: to pay very close attention to. We can't just take 776 00:48:21,080 --> 00:48:24,160 Speaker 1: for granted that the systems that we have in place 777 00:48:24,360 --> 00:48:28,600 Speaker 1: to allow us to participate in the democratic process 778 00:48:29,160 --> 00:48:32,040 Speaker 1: just work, or that they just don't work. We need 779 00:48:32,040 --> 00:48:36,400 Speaker 1: to really examine them, ask questions, make ourselves accountable, and 780 00:48:36,440 --> 00:48:40,439 Speaker 1: make certain that we still have a system that people 781 00:48:40,480 --> 00:48:43,960 Speaker 1: can believe in. It's hard enough out there right now 782 00:48:44,040 --> 00:48:48,600 Speaker 1: to find representatives, sometimes, that actually reflect our own values. 783 00:48:48,680 --> 00:48:51,440 Speaker 1: That can be a real challenge. We don't need to 784 00:48:51,440 --> 00:48:54,839 Speaker 1: make it even harder by making the process one that 785 00:48:54,920 --> 00:48:58,240 Speaker 1: we feel we cannot rely upon. We need that process 786 00:48:58,280 --> 00:49:01,799 Speaker 1: to be rock solid. I hope you guys enjoyed these 787 00:49:01,840 --> 00:49:04,880 Speaker 1: classic episodes. We'll be back with a new episode in 788 00:49:05,320 --> 00:49:09,600 Speaker 1: Wednesday's show, so stick around for that.
And, uh, yeah, 789 00:49:09,640 --> 00:49:13,279 Speaker 1: I apologize for running reruns, in a way, but at 790 00:49:13,280 --> 00:49:15,120 Speaker 1: the same time, I just felt like this is something 791 00:49:16,000 --> 00:49:18,759 Speaker 1: that we need to think about, especially in advance 792 00:49:19,040 --> 00:49:21,840 Speaker 1: of another election year here in the United States. We 793 00:49:21,960 --> 00:49:25,600 Speaker 1: really need to pay attention to this stuff and be 794 00:49:25,760 --> 00:49:30,320 Speaker 1: active participants if we want to make any real, substantive change. 795 00:49:31,080 --> 00:49:33,400 Speaker 1: All right, guys, if you want to reach out to me, 796 00:49:33,600 --> 00:49:35,760 Speaker 1: you can do so on Facebook or Twitter. The handle 797 00:49:35,800 --> 00:49:39,000 Speaker 1: for both of those is TechStuff HSW, and 798 00:49:39,000 --> 00:49:47,840 Speaker 1: I'll talk to you again really soon. TechStuff is 799 00:49:47,880 --> 00:49:51,000 Speaker 1: an iHeartRadio production. For more podcasts from iHeartRadio, 800 00:49:51,120 --> 00:49:54,759 Speaker 1: visit the iHeartRadio app, Apple Podcasts, 801 00:49:54,880 --> 00:49:56,840 Speaker 1: or wherever you listen to your favorite shows.