Welcome to TechStuff, a production from iHeartRadio.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech. It is time for another classic episode of TechStuff. This one originally published in June two thousand thirteen; that would have been my birthday. This is called The WikiLeaks Story, and obviously it's another one of those topics where I could do a full update, probably a full redo of this episode, but it's always interesting to go back and listen to how the show has evolved over the years. So let's listen to The WikiLeaks Story. Also, real quick, before we get into this, I do want to give a brief trigger warning. It's a light trigger warning, really. Julian Assange, um, one of the so-called founders of WikiLeaks, has been brought up on some rape charges, and so there will be a non-graphic discussion of that later on in the podcast.
If that is the kind of thing that you would rather avoid hearing, then perhaps you would like to skip the rest of this episode now. Spoiler alert: both Lauren and myself, with our particular political views, feel that no one in this story really comes out being a complete hero. It's just humans being humans, which means it's complex. It's complex and gets kind of messy. Yeah, so we're gonna get messy. But before we get messy, let's just clear things up a little bit; let's talk about what WikiLeaks is. Now, first of all, the name wiki gives you this sense that it has this sort of collaborative, open structure that allows people to come in and add things and tweak things. Not so much. They tried that early on in the timeline of WikiLeaks, but eventually moved away from it.
The name has stuck, however. Right, and I think that really what they wanted to do was use that easy-reading format of wikis, and then they decided to really close down the part where, hey, anyone can edit it, because that's bad times, right, when you're talking about what it is WikiLeaks is trying to do and, uh, be. Yes, the sourcing of information in order to remain anonymous and pertinent. Yeah, yeah, these are all things that make it... you know, if you were to just open it up, for what WikiLeaks does, it would just be a mess very quickly. It also has created a lot of confusion among people who are just casually following this, or have just heard about it, who think that WikiLeaks has some sort of connection to Wikipedia, which it does not. No, Wikipedia and WikiLeaks are not at all related. So what is WikiLeaks? Well, it's a not-for-profit organization, and the purpose of WikiLeaks is to publish information that would normally not be available to the public. Right.
Normally that would be stuff that's either been classified or... Yeah, it could be trade secret stuff; it could be corporate, uh, communications that normally would not be available to anyone outside a certain group of people within that corporation. Really, it's any information that would be of public interest but is not publicly available. And in fact, WikiLeaks goes so far as to say that the organization does not want anything that's been published elsewhere. That's not the purpose of WikiLeaks. They also take pains to say that they do not solicit any kind of information or files. They accept it, but they don't solicit, which is an important distinction they have to make, so that... yeah, because otherwise they could start getting charges about being a spy, or trying to bribe officials or people who have access to classified information to then share it.
If they merely accept donations of information, that absolves them of some of that responsibility. Kind of, technically, yeah, and it at least paints a better picture of them, right. As far as it makes it more difficult to make a case against them; it doesn't necessarily mean that a case would not be made against them, or that it wouldn't be successful. But anyway, we're kind of getting ahead of ourselves now. Um, according to Columbia University's journalism school — I read this — there was a very long article about WikiLeaks and its association with The Guardian, which is a newspaper in the UK, and, uh, in fact, there was quite a strong relationship between the two for a while, until an event that we will discuss later. Yeah, yeah, that will become important later. But according to that particular article, it said that Julian Assange, who is often referred to as a founder, or at the very least the spokesperson for WikiLeaks — he's certainly the most identifiable personality associated with it. A figurehead, if nothing else. Yeah. Yeah, and that gets complicated too, and we'll explain that in a bit.
But according to this article, he registered the domain name back in... However, every other source I could find, two thousand six was the earliest that it was actually, um, registered. This is another part that makes talking about WikiLeaks challenging, in that there's a lot of misinformation about the site itself — not just about all the shenanigans that went on both within WikiLeaks and surrounding WikiLeaks, but just when it really got started. Right. And part of the problem, we should mention now, is that a lot of the information out there about WikiLeaks comes from either Julian Assange himself or other personalities that have been involved with the organization and have left it violently and bitterly. Yeah, and so all of this is, uh... it's difficult to track down the exact truth. Yeah, there's a lot of subjectivity here. I mean, everyone involved has their own agenda, and, uh, that doesn't necessarily align with the organization's stated mission. Right, right. So this is what makes this complicated. Now, what is that stated mission?
Well, the whole idea here is that what WikiLeaks is trying to do is make available information that the organization feels the public should have access to but otherwise would not, because of the secrecy of either governmental or corporate organizations, or even just other organizations in general, and that it's an attempt to make these organizations more transparent. Right. They cite the Universal Declaration of Human Rights, um, in particular Article nineteen, um, which just says that everyone has the right to freedom of opinion and expression, right, and that they should be able to pursue information regardless of frontiers, which is kind of like saying, you know, a person in, say, China, who otherwise might not have access to certain information, completely has the universal right to that, even if the Chinese government says they don't. So in other words, it's kind of this idea that everyone has this right regardless of what your government says.
Well, this is the United Nations that has the Universal Declaration of Human Rights, and it's not like this is somehow legally binding for all. Right, right, it's a very nice idea. Yeah, it's an ideal, right; it's not so much legal wording. It's saying, ideally, we would say all people have the right to these expectations, and one of them would be this right to freedom of expression. Now, depending on where you are in the world, your right to freedom of expression may be fairly generous or it might be really restrictive. In the United States, we like to think that we have the freedom of speech, but there are certain limitations. I mean, obviously, if you are using that freedom of speech to inflict harm upon someone and it is completely unjustified, you know, you could be taken to court for things like libel. Um, you know, there's the whole argument about how you can't yell fire in a crowded theater because it could cause people harm. Your right of freedom of expression does not override someone else's right to safety. Safety, right, or at least the expectation of safety. Um.
And so there are a lot of complications here. But even in the United States, you know, it's fairly wide open. People will argue as to how wide open it is because of their own personal views. If you go to someplace like the United Kingdom, their rights of free speech are more narrow. There are very specific laws that prevent newspapers, for example, from violating secrecy agreements. So something like The Guardian had to make some really tough decisions when they got information from WikiLeaks as to whether or not to publish it, because it could result in injunctions from the government. So that gets really complex as well. Anyway, WikiLeaks essentially says: we don't subscribe to any of these regional concepts. We subscribe to this universal concept, and that's what it's all about. And so they try to provide the material that journalists can use to tell the stories that otherwise would be untold. Right. Um, Assange said in one interview with Time that this organization practices civil obedience.
It tries to make the world more civil and act against abusive organizations that are pushing it in the opposite direction. So it's kind of funny to call it civil obedience considering that, uh, you know, they are directly disobeying. Yes, they're being disobedient by their very nature. But it's all an argument about who has the power and responsibility. If you have a lot of... you know, as Uncle Ben would say, with great power comes great responsibility. That's one of those little cliché sayings, but there is truth to that. It's very true that if you are a governmental agency or a corporation that has a lot of power, then there is a certain expectation that you will use that power in a responsible way, and not in such a way that you are going to violate people's rights. Right. You know.
At the same time, frequently information is withheld by governments in particular because of measures of national security — because they're trying to protect, you know, either their military interests or their scientific interests, or, you know, the people and the research that go into making that country a safe place to be. Right. So there is a very delicate balance here, right. I mean, you could argue that there's certain information out there that it's good that it's secret, because that means that the people who are doing valuable work — and it may be valuable work that in no way is violent towards any other person; it may be completely humanitarian work — that their lives could be in jeopardy if the information were made public, and that is something that, you know, needs to be taken into consideration.
Now, what WikiLeaks would say is, on the other hand, you also have these instances where, uh, agencies or the military or a corporation are behaving in an unethical or corrupt or illegal manner, and without this information becoming public, they can never be held accountable for that. And so, you know, this transparency gives accountability to those agencies. And in fact, they go so far as to say it's the media's role to reveal this kind of information to the public, to make the public informed enough to be good citizens. Right, because this is that big relationship. You know, we often think of the government as being this kind of separate entity that, uh, in some cases we think of as sort of that Big Brother, Orwellian idea of this other that dictates what we do. But in reality, at least, you can argue, we're supposed to watch it, right. You know, there are definitely fringe theories about how much your average citizen does or doesn't have power over these things.
But saying that, you know, if we take democracy in the United States for what it's supposed to be, if we assume that that in fact is the case, then WikiLeaks' role is to tell the public: hey, this is what your officials are up to, and if you don't approve of that, you need to be aware of it so that, the next time voting comes around, you can behave in a way that allows you to get the right people in charge — assuming, of course, that the right people are the ones who are running for office. That's a different political discussion. That's a whole other can of worms. Honestly, I'm probably coming across as really cynical. I'm actually not that cynical. I'm just aware of a lot of cynicism. So it's one of those things where you kind of dance around it. Anyway.
Well, yeah, and you know, the thing is that I read one thing about how the US Information Security Oversight Office reported that, um, the number of new secrets designated as such by the government rose from... And so partially what this has to do with is the way that governments are processing information and, um, releasing it to the public. Right. And to be fair, I mean, the Obama administration in the United States specifically said that one of its cornerstones was going to be transparency, and so in a way you can think of the media in general, and WikiLeaks in particular in this case, trying to hold them accountable for that claim, saying: you said you're going to be transparent; here are all these things that have not been reported. What is your stance on this? Absolutely. And so, I mean, you know, I can't disagree with that motivation. I think that that's very important.
I think that in any nation where you have a free press, it's that press's job to keep an eye out and make sure that people are not behaving in an unethical manner or taking advantage of a situation, and if they are, that that's reported. I think that's very important. It's a key role in journalism, and it's something that some people would say has really been slipping from journalism over the last couple of decades. Again, I'm not a journalist, so I'm not going to comment on that. Well, that's another... you know, there are also a lot of ideas out there about how entertainment-based the journalism industry is these days. Yes, yeah, the whole idea of the commercialization of journalism. Uh, well, anyway, WikiLeaks does have some policies that try to guide the organization and how it behaves. One of those is that they do sometimes delay or remove some details in stories in order to protect people from immediate harm — especially either current victims of harm somewhere, or the whistleblowers who are forwarding this information. Right.
And then on top of that, they also try to verify every single leaked report that they receive — which, by the way, they've set up an electronic drop box; their preferred method of receiving information is electronically, and this drop box is encrypted so that, again in theory, you wouldn't be able to trace the origin of it, other than being able to see that this is original source material; it's not tied to any particular person. So the electronic drop box is supposed to be a secure way of getting files to WikiLeaks. And they have said that they would also receive files or documents in other forms, including through the mail, but they prefer not to, because that kind of stuff can be intercepted and security's in question, and also, if they have a physical mailing address, that makes it more difficult for them to, uh, avoid entanglements. All right. So anyway, they do refer to their electronic drop box as being the preferred method, and they say that for everything they get, they use traditional investigative journalism techniques, as well as more modern, technology-based methods, to verify
that that information is, in fact, accurate and true, and that it really is from a verifiable source — right, that the documents are actually from wherever the person, the anonymous source, says they're from. So if an anonymous source drops off an enormous file that is supposed to contain all these documents from the Department of Justice in the United States, for example, they would go through and try to verify that those were in fact DOJ documents, and not just someone using some logos or whatever to fabricate something. Right. This is through a largely volunteer-based system, they've reported — and again, you know, these numbers are kind of wibbly wobbly — but, um, they reported having up to eight hundred volunteer members, mostly journalists, I think, helping them out with stuff like this. Right, and some of those volunteers have a lot of stuff to say about WikiLeaks, but again, we'll cover that probably in the second half,
I think, of this episode. So their policies are to verify everything and to keep it secure, and then, once they've verified it, the next step is deciding on when to publish this information. And WikiLeaks has done... well, they've done their own journalism. They've had their own journalists write stories that were based off of the documents that they had found. I think you mentioned earlier that part of what WikiLeaks holds very dear is that they want to publish original material. They don't want to republish anything. And so, yes, so, um, actually doing the writing and investigation is a big part of that. They also will partner with existing media outlets, although there's been sort of a contentious relationship between WikiLeaks and several high-profile newspapers, uh, in multiple nations, and part of that I think has to do with Assange's personal handling of situations.
But anyway, their goal is to have stories written about the information they have, because it's kind of like if you were to walk into a laboratory that's doing scientific experiments and you see the experiment going on, but you can't really necessarily make any meaningful conclusion based on that. You need to have the report written up, where you can see what the actual methodology was, what the results were, and what the conclusions are. That's what is digestible to the average person. Same thing with these WikiLeaks reports. Usually the raw material is something that they hand over to journalists, and the journalists end up writing the stories, and then the raw material is published along with the story. So that way there's the complete transparency that they're really going for. Right, that way you can check their sources. Exactly, exactly.
Like if 323 00:18:56,600 --> 00:18:59,520 Speaker 1: Lauren reads the story, and then, you know, she reads 324 00:18:59,640 --> 00:19:02,920 Speaker 1: one journalist's perspective on that story, the way that 325 00:19:03,040 --> 00:19:06,720 Speaker 1: they have taken this information and then communicated it, she 326 00:19:06,840 --> 00:19:10,080 Speaker 1: can then go to the source material, read the 327 00:19:10,119 --> 00:19:14,080 Speaker 1: source material, and see if her own interpretation of that 328 00:19:14,160 --> 00:19:17,359 Speaker 1: source material is aligned with what the journalist said. Because remember, 329 00:19:17,440 --> 00:19:20,640 Speaker 1: no matter what, we're all humans, so we all view 330 00:19:20,760 --> 00:19:23,639 Speaker 1: things through our own kind of lens on the world, 331 00:19:23,760 --> 00:19:25,720 Speaker 1: and we try, and, you know, a journalist 332 00:19:25,760 --> 00:19:28,600 Speaker 1: tries to be as objective as possible. At least most 333 00:19:28,640 --> 00:19:33,240 Speaker 1: journalists try. That's the hypothetical ideal. Yeah. But 334 00:19:33,400 --> 00:19:36,480 Speaker 1: it's, you know, there's no way to ever truly 335 00:19:36,880 --> 00:19:39,600 Speaker 1: attain that. Sure. And in some of these 336 00:19:39,680 --> 00:19:43,359 Speaker 1: cases, Assange and other folks at WikiLeaks 337 00:19:43,440 --> 00:19:48,920 Speaker 1: have admitted to wanting to market some of these 338 00:19:48,960 --> 00:19:52,399 Speaker 1: bigger releases, uh, to really get things into the 339 00:19:52,440 --> 00:19:56,480 Speaker 1: public eye.
Right. I haven't personally read of any 340 00:19:56,640 --> 00:20:00,359 Speaker 1: ill intent to change details, but just to 341 00:20:00,520 --> 00:20:03,199 Speaker 1: bring the most important details to the forefront, because they 342 00:20:03,240 --> 00:20:06,560 Speaker 1: are a political organization, right. And when you look at it, 343 00:20:06,680 --> 00:20:10,480 Speaker 1: you're talking about, sometimes they're getting files that are enormous. 344 00:20:10,800 --> 00:20:13,880 Speaker 1: They're talking about pages and pages and pages of documentation. 345 00:20:14,520 --> 00:20:17,840 Speaker 1: It's like over three hundred thousand documents at 346 00:20:17,880 --> 00:20:20,600 Speaker 1: a go. That's, you know, obviously way too much for 347 00:20:20,720 --> 00:20:23,880 Speaker 1: anyone to just wade through beginning to end and find 348 00:20:24,000 --> 00:20:26,960 Speaker 1: the nuggets that are meaningful, because a lot of this 349 00:20:27,040 --> 00:20:31,520 Speaker 1: stuff has information that's not really, um, you know, important 350 00:20:31,680 --> 00:20:33,480 Speaker 1: in the grand scheme of things. It might have been 351 00:20:33,520 --> 00:20:37,119 Speaker 1: important for a very specific purpose, but, you know, beyond that, 352 00:20:37,280 --> 00:20:40,400 Speaker 1: it doesn't really matter so much. Well, we'll mention 353 00:20:40,480 --> 00:20:42,920 Speaker 1: a few examples of stuff like that a little bit later. Yeah. 354 00:20:43,000 --> 00:20:45,520 Speaker 1: So, you know, this is their basic 355 00:20:46,280 --> 00:20:50,240 Speaker 1: purpose and their basic approach. To go into more detail, 356 00:20:50,320 --> 00:20:52,560 Speaker 1: we'll have to kind of look at the timeline and 357 00:20:52,720 --> 00:20:56,680 Speaker 1: discuss what had happened throughout the history of WikiLeaks, 358 00:20:56,720 --> 00:20:58,679 Speaker 1: which of course is still in existence today.
I don't mean 359 00:20:58,760 --> 00:21:03,000 Speaker 1: to suggest that its history is over; it 360 00:21:03,119 --> 00:21:05,440 Speaker 1: is a continuing story. But, um, most of the 361 00:21:05,680 --> 00:21:07,879 Speaker 1: action within WikiLeaks was going on 362 00:21:08,000 --> 00:21:12,840 Speaker 1: around two thousand ten. Yeah, that's when the big, big stuff was going on, 363 00:21:13,040 --> 00:21:16,600 Speaker 1: although it's played a part in other stories since then, 364 00:21:16,680 --> 00:21:20,399 Speaker 1: and of course before then too. Hey, guys, Jonathan here. 365 00:21:20,560 --> 00:21:31,280 Speaker 1: We're going to take a quick break to thank our sponsor. Alright, 366 00:21:31,400 --> 00:21:34,240 Speaker 1: so we've had an overview of what WikiLeaks is, 367 00:21:34,320 --> 00:21:37,680 Speaker 1: what its purpose is, what role, you know, it ideally 368 00:21:37,800 --> 00:21:41,960 Speaker 1: would fill within the world of journalism and transparency and accountability. 369 00:21:42,480 --> 00:21:45,879 Speaker 1: Let's talk a little bit about the timeline. So it 370 00:21:46,000 --> 00:21:50,040 Speaker 1: was around two thousand six that WikiLeaks began to coalesce, 371 00:21:50,280 --> 00:21:53,520 Speaker 1: and it was officially launched in two thousand seven, right, 372 00:21:53,960 --> 00:21:56,760 Speaker 1: either December two thousand six or early two thousand seven. Yeah, 373 00:21:56,840 --> 00:22:00,440 Speaker 1: it's right around there anyway. The first publications were coming 374 00:22:00,480 --> 00:22:02,680 Speaker 1: out in December of two thousand and six.
But 375 00:22:03,200 --> 00:22:06,160 Speaker 1: most folks just say that the official launch, where 376 00:22:06,560 --> 00:22:10,280 Speaker 1: it really became a thing, was in early two thousand seven, right. 377 00:22:10,400 --> 00:22:12,119 Speaker 1: And I did want to mention that this is 378 00:22:12,160 --> 00:22:14,359 Speaker 1: coming on the heels of, in two thousand five, the 379 00:22:14,440 --> 00:22:18,399 Speaker 1: commission that had been investigating the nine eleven terrorist attacks 380 00:22:18,440 --> 00:22:21,040 Speaker 1: here in the United States had found that poor 381 00:22:21,119 --> 00:22:24,720 Speaker 1: information sharing was a huge failure of the government in 382 00:22:24,800 --> 00:22:26,720 Speaker 1: the lead-up to the attacks, in 383 00:22:26,840 --> 00:22:29,720 Speaker 1: preventing this sort of thing from happening, and 384 00:22:29,800 --> 00:22:35,199 Speaker 1: that had led internally to a lot of reorganization 385 00:22:35,400 --> 00:22:40,360 Speaker 1: of how information is shared between different departments. Yeah, 386 00:22:40,400 --> 00:22:42,760 Speaker 1: this is something that we see at all levels 387 00:22:42,800 --> 00:22:45,920 Speaker 1: of organization where you have multiple departments, and the 388 00:22:46,200 --> 00:22:50,480 Speaker 1: communication between departments, sometimes the communication within a department, tends 389 00:22:50,520 --> 00:22:54,000 Speaker 1: to get bogged down by red tape. And, uh, you know, 390 00:22:54,080 --> 00:22:55,320 Speaker 1: you see this all the time. Like, if you ever 391 00:22:55,720 --> 00:23:02,360 Speaker 1: read any stories about, uh, a police investigation that spanned multiple jurisdictions, 392 00:23:02,920 --> 00:23:05,639 Speaker 1: that's always part of the story, how complicated it 393 00:23:05,800 --> 00:23:10,280 Speaker 1: was to get the cooperation of one police office versus another.
394 00:23:10,480 --> 00:23:14,000 Speaker 1: And if there are federal investigators, then that adds another layer 395 00:23:14,080 --> 00:23:16,560 Speaker 1: of complexity. And it wasn't until around that 396 00:23:16,640 --> 00:23:19,320 Speaker 1: time that, um, I think a lot of agencies 397 00:23:19,359 --> 00:23:21,920 Speaker 1: had been kind of resisting going digital just because of 398 00:23:22,040 --> 00:23:25,200 Speaker 1: the ease of flow of information, which isn't really considered 399 00:23:25,240 --> 00:23:27,040 Speaker 1: a good thing when you're trying to keep things secret. 400 00:23:27,520 --> 00:23:30,080 Speaker 1: Um. But it was after two thousand five, 401 00:23:30,119 --> 00:23:33,320 Speaker 1: after that report from that commission, that things started 402 00:23:33,520 --> 00:23:37,360 Speaker 1: going online. And the first story that WikiLeaks partnered 403 00:23:37,640 --> 00:23:41,600 Speaker 1: with The Guardian on was from around August thirty-first, 404 00:23:41,720 --> 00:23:43,879 Speaker 1: two thousand seven. It was a story about the alleged 405 00:23:43,960 --> 00:23:47,960 Speaker 1: corruption of Daniel arap Moi, the former president of Kenya, 406 00:23:48,720 --> 00:23:51,239 Speaker 1: and it was a leaked report. The Kenyan government had 407 00:23:51,280 --> 00:23:55,480 Speaker 1: elected to keep this report secret, and, uh, someone leaked 408 00:23:55,640 --> 00:23:59,560 Speaker 1: the information to WikiLeaks. WikiLeaks then approached The Guardian, 409 00:23:59,680 --> 00:24:04,000 Speaker 1: and The Guardian was very cautious about this kind of relationship. 410 00:24:04,040 --> 00:24:06,240 Speaker 1: It was one of those things where they saw the 411 00:24:06,400 --> 00:24:09,680 Speaker 1: value in what WikiLeaks was doing.
WikiLeaks was setting itself 412 00:24:09,800 --> 00:24:13,760 Speaker 1: up to be a completely independent safe house for 413 00:24:14,040 --> 00:24:17,680 Speaker 1: leaked information. Uh, the idea being that of course anyone 414 00:24:17,720 --> 00:24:20,720 Speaker 1: who would submit to it would remain anonymous, WikiLeaks would 415 00:24:20,760 --> 00:24:23,680 Speaker 1: not point the finger at anybody, and that information 416 00:24:23,720 --> 00:24:26,800 Speaker 1: could then get to some sort of outlet that could 417 00:24:26,800 --> 00:24:30,520 Speaker 1: communicate it to the wider world. So this was the 418 00:24:30,640 --> 00:24:33,920 Speaker 1: first attempt of The Guardian and WikiLeaks to work together. 419 00:24:34,520 --> 00:24:37,120 Speaker 1: From the story I read from Columbia University, it really 420 00:24:37,200 --> 00:24:40,320 Speaker 1: did sound like Julian Assange was a big part of 421 00:24:40,400 --> 00:24:45,119 Speaker 1: this early on, and that was both a 422 00:24:45,240 --> 00:24:47,800 Speaker 1: good thing in the early days and turned into a 423 00:24:47,960 --> 00:24:52,040 Speaker 1: complicated thing as time went on. Um, from what I understand, 424 00:24:52,440 --> 00:24:55,240 Speaker 1: he can have an effect on people, uh, 425 00:24:55,960 --> 00:24:59,879 Speaker 1: and make them feel like they're not really being listened to. 426 00:25:00,359 --> 00:25:04,320 Speaker 1: He's frequently described as being a very, um, dynamic personality. 427 00:25:04,600 --> 00:25:07,440 Speaker 1: That's a very generous way of putting it. Uh, yeah, 428 00:25:08,040 --> 00:25:10,520 Speaker 1: people who have talked to him have had some pretty 429 00:25:10,840 --> 00:25:13,920 Speaker 1: contentious things to say about him.
I've never met the man, 430 00:25:14,080 --> 00:25:16,840 Speaker 1: so I know nothing about him personally, nor have I 431 00:25:16,920 --> 00:25:19,280 Speaker 1: had any interactions with him. But just going from what 432 00:25:19,359 --> 00:25:22,280 Speaker 1: other people have said, it seems like he's 433 00:25:22,400 --> 00:25:26,600 Speaker 1: very intense. Yeah, he can be a handful, yes. Um, 434 00:25:26,800 --> 00:25:30,119 Speaker 1: WikiLeaks would later be presented with an 435 00:25:30,119 --> 00:25:34,440 Speaker 1: award from Amnesty International, um, on behalf of the Kenyan 436 00:25:34,760 --> 00:25:39,320 Speaker 1: folks who leaked this information. Um, now that's one 437 00:25:39,359 --> 00:25:41,879 Speaker 1: of the awards that WikiLeaks has won. WikiLeaks, 438 00:25:41,960 --> 00:25:44,879 Speaker 1: of course, you know, it is very controversial, but 439 00:25:45,119 --> 00:25:49,560 Speaker 1: there are organizations that have recognized its role, uh, 440 00:25:49,800 --> 00:25:54,040 Speaker 1: in uncovering corruption, and even going so far as to, 441 00:25:54,760 --> 00:25:59,199 Speaker 1: uh, give enough information so that authorities could end up 442 00:25:59,440 --> 00:26:03,840 Speaker 1: pursuing and correcting problems, um, or that people could end 443 00:26:03,960 --> 00:26:07,680 Speaker 1: up going out and voting what some would 444 00:26:07,720 --> 00:26:10,400 Speaker 1: call a corrupt administration out of power. This is, of course, 445 00:26:10,520 --> 00:26:13,639 Speaker 1: you know, on a global scale, not limited to 446 00:26:13,720 --> 00:26:17,919 Speaker 1: just one country.
In February of two thousand eight, a WikiLeaks 447 00:26:18,080 --> 00:26:22,320 Speaker 1: report, along with The Guardian, exposed a Swiss bank 448 00:26:22,640 --> 00:26:28,000 Speaker 1: called Julius Baer for money laundering, uh, and that 449 00:26:28,359 --> 00:26:30,679 Speaker 1: ended up getting WikiLeaks hit with one of their 450 00:26:30,800 --> 00:26:35,520 Speaker 1: first major lawsuits, which is not something unusual for the organization. 451 00:26:35,560 --> 00:26:38,520 Speaker 1: They've been hit with many of them. Right. Also in 452 00:26:38,560 --> 00:26:42,920 Speaker 1: two thousand eight, they posted a bunch of Scientology's secret 453 00:26:43,440 --> 00:26:47,640 Speaker 1: membership manuals. Yeah. Uh, and it's funny because there 454 00:26:47,680 --> 00:26:51,760 Speaker 1: are some stories that talk about how WikiLeaks expected 455 00:26:51,960 --> 00:26:54,760 Speaker 1: certain things to get a lot more attention than 456 00:26:54,920 --> 00:26:57,880 Speaker 1: they actually did, and these manuals were 457 00:26:58,080 --> 00:26:59,760 Speaker 1: one of those things. They thought that a 458 00:26:59,800 --> 00:27:02,919 Speaker 1: lot more attention would be devoted to the Scientology manuals 459 00:27:02,960 --> 00:27:06,359 Speaker 1: than what actually happened. And that's another reason why WikiLeaks 460 00:27:06,440 --> 00:27:10,280 Speaker 1: was looking to partner with various newspapers around the world, 461 00:27:10,760 --> 00:27:13,840 Speaker 1: because they were discovering that trying to direct people to 462 00:27:14,000 --> 00:27:18,000 Speaker 1: WikiLeaks to find out about the information was tricky. 463 00:27:18,320 --> 00:27:21,560 Speaker 1: Anyone who makes a website learns pretty quickly it's hard 464 00:27:21,680 --> 00:27:23,920 Speaker 1: to get folks to go to your website.
It is. 465 00:27:24,240 --> 00:27:27,040 Speaker 1: Folks need to want to go to your website; you 466 00:27:27,119 --> 00:27:31,760 Speaker 1: can't really make them go. So, uh, you know, if 467 00:27:31,800 --> 00:27:34,159 Speaker 1: WikiLeaks was going to fulfill its mission of being this, 468 00:27:34,440 --> 00:27:37,680 Speaker 1: uh, depository of secret information that could then be 469 00:27:37,800 --> 00:27:41,000 Speaker 1: used by journalists, they determined that they had to reach 470 00:27:41,040 --> 00:27:44,080 Speaker 1: out to more journalists to make that happen. Um, the 471 00:27:44,200 --> 00:27:46,000 Speaker 1: next thing I have is November two thousand nine. Do 472 00:27:46,040 --> 00:27:49,080 Speaker 1: you have anything before that? Okay. That's when they published 473 00:27:49,119 --> 00:27:53,080 Speaker 1: a comprehensive list of text pager messages that were sent 474 00:27:53,400 --> 00:27:56,119 Speaker 1: during September eleventh, two thousand one, which of course was 475 00:27:56,200 --> 00:27:59,680 Speaker 1: the date of the terrorist attacks that ended 476 00:28:00,000 --> 00:28:03,680 Speaker 1: thousands of lives here in the United States. Um, and 477 00:28:03,840 --> 00:28:06,240 Speaker 1: they were criticized for this, because a lot of those 478 00:28:06,320 --> 00:28:11,320 Speaker 1: text messages were just simple messages between family members 479 00:28:11,440 --> 00:28:13,520 Speaker 1: or co-workers to let people know that they were 480 00:28:13,520 --> 00:28:16,960 Speaker 1: all right, that they had, um, managed to stay 481 00:28:17,040 --> 00:28:21,439 Speaker 1: clear of the impact zones and that sort of thing, 482 00:28:21,920 --> 00:28:24,200 Speaker 1: and there was a real question of, is this 483 00:28:24,440 --> 00:28:29,360 Speaker 1: actually newsworthy? Is this valuable information to publish?
484 00:28:29,640 --> 00:28:33,360 Speaker 1: Is this getting a little too voyeuristic 485 00:28:33,520 --> 00:28:37,359 Speaker 1: into the personal tragedies that happened during this day? And 486 00:28:37,520 --> 00:28:39,160 Speaker 1: there was a little bit of a question, um, 487 00:28:39,520 --> 00:28:43,080 Speaker 1: and this ties back into the Scientology bit, of whether 488 00:28:43,480 --> 00:28:47,000 Speaker 1: WikiLeaks was publishing things to get attention and 489 00:28:47,040 --> 00:28:51,400 Speaker 1: to be sensational, um, rather than on mission, right. Yeah, and 490 00:28:51,520 --> 00:28:54,800 Speaker 1: that's plagued WikiLeaks throughout its history as well. 491 00:28:54,920 --> 00:28:59,360 Speaker 1: Like, again, the goal that it states seems very 492 00:28:59,480 --> 00:29:03,560 Speaker 1: noble in a journalistic sense, but the behavior of at 493 00:29:03,640 --> 00:29:05,520 Speaker 1: least some of the people in WikiLeaks may not have 494 00:29:05,640 --> 00:29:09,160 Speaker 1: been quite aligned with that. It 495 00:29:09,240 --> 00:29:11,800 Speaker 1: seemed, like you said, a little more sensationalist. And that 496 00:29:11,920 --> 00:29:14,880 Speaker 1: might not have ever been the intent of the people 497 00:29:15,120 --> 00:29:17,920 Speaker 1: in WikiLeaks. It just may be the impression that everyone 498 00:29:18,040 --> 00:29:21,400 Speaker 1: got from the way it was handled. Now, 499 00:29:21,440 --> 00:29:25,160 Speaker 1: one last little thing about these text pager messages, um, 500 00:29:25,480 --> 00:29:29,280 Speaker 1: WikiLeaks' response to the criticism 501 00:29:29,520 --> 00:29:32,440 Speaker 1: of, is this actually newsworthy? Their response was, this helps 502 00:29:32,640 --> 00:29:37,000 Speaker 1: create a more complete picture of what happened on that day.
503 00:29:37,880 --> 00:29:41,800 Speaker 1: And, I mean, yeah, it's more complete. I 504 00:29:41,920 --> 00:29:46,560 Speaker 1: just don't know that it's more relevant. Sure, sure. 505 00:29:46,560 --> 00:29:48,880 Speaker 1: I could argue that one. You know, 506 00:29:49,000 --> 00:29:51,320 Speaker 1: that could easily be argued either way. Yeah. But, um, 507 00:29:51,560 --> 00:29:53,479 Speaker 1: I did want to mention, also in two 508 00:29:53,520 --> 00:29:55,960 Speaker 1: thousand nine, um, that was the year that President Obama 509 00:29:56,120 --> 00:29:59,360 Speaker 1: signed the executive order requiring a whole bunch of people 510 00:29:59,400 --> 00:30:03,080 Speaker 1: who hold classified status here in the States, um, to 511 00:30:03,320 --> 00:30:09,200 Speaker 1: receive extra training on what actually needs classification, um, 512 00:30:10,080 --> 00:30:13,720 Speaker 1: and it also forced people who are creating these classified 513 00:30:13,760 --> 00:30:17,200 Speaker 1: documents to identify themselves on those documents. And again, 514 00:30:17,360 --> 00:30:21,640 Speaker 1: this is that effort for official transparency to say, 515 00:30:21,800 --> 00:30:25,800 Speaker 1: you know, let's not just blanket-whitewash everything as classified, 516 00:30:26,360 --> 00:30:28,960 Speaker 1: because all that really does is engender a 517 00:30:29,120 --> 00:30:31,760 Speaker 1: spirit of distrust in the government, to say, like, 518 00:30:32,040 --> 00:30:34,120 Speaker 1: why are they hiding this? What else do they have 519 00:30:34,320 --> 00:30:36,760 Speaker 1: to hide? And, uh, of course, I mean, this 520 00:30:36,880 --> 00:30:39,600 Speaker 1: is an ongoing story, stuff that we're finding out about now.
521 00:30:39,640 --> 00:30:43,280 Speaker 1: About things like clandestine surveillance, which, these are stories 522 00:30:43,320 --> 00:30:45,800 Speaker 1: that are breaking as we are recording this podcast in 523 00:30:46,480 --> 00:30:48,840 Speaker 1: early June two thousand thirteen. There are all these stories 524 00:30:48,880 --> 00:30:51,680 Speaker 1: about the NSA and surveillance and cell phones 525 00:30:51,720 --> 00:30:54,040 Speaker 1: and things like that. And some might even argue that 526 00:30:54,200 --> 00:30:56,960 Speaker 1: the things that happened with WikiLeaks are what kind 527 00:30:57,040 --> 00:31:00,920 Speaker 1: of led to those policies, which is the antithesis of 528 00:31:01,040 --> 00:31:03,640 Speaker 1: what they were supposed to be doing. I think that 529 00:31:03,680 --> 00:31:05,600 Speaker 1: there's been a lot of that, actually; I 530 00:31:05,680 --> 00:31:07,760 Speaker 1: think a lot of governments have kind of cracked down 531 00:31:08,840 --> 00:31:11,000 Speaker 1: on openness in response to this sort of thing. Yeah, 532 00:31:11,040 --> 00:31:14,240 Speaker 1: it's, you know, and some might argue they're cracking 533 00:31:14,320 --> 00:31:17,920 Speaker 1: down on one side while really doubling down on the 534 00:31:18,000 --> 00:31:20,440 Speaker 1: shady stuff on the other side. So it's almost like 535 00:31:20,800 --> 00:31:24,560 Speaker 1: the Wookiee defense. Hey, it's Chewbacca. So, um, yeah, I mean, 536 00:31:24,640 --> 00:31:30,120 Speaker 1: this is a complicated issue. April two thousand ten, that's when Assange goes to 537 00:31:30,200 --> 00:31:33,760 Speaker 1: the National Press Club in DC and shows a 538 00:31:33,920 --> 00:31:37,040 Speaker 1: video of a two thousand seven incident.
Now, this is 539 00:31:37,280 --> 00:31:43,480 Speaker 1: probably the most notorious WikiLeaks release. Yeah, 540 00:31:44,000 --> 00:31:46,440 Speaker 1: I mean, there have been a lot of high profile ones, 541 00:31:46,480 --> 00:31:48,680 Speaker 1: but this one is like the defining one, I think. 542 00:31:49,480 --> 00:31:52,000 Speaker 1: And this is one that showed an incident in two 543 00:31:52,040 --> 00:31:57,280 Speaker 1: thousand seven in which, uh, two US Apache helicopter pilots 544 00:31:57,280 --> 00:32:01,880 Speaker 1: allegedly executed, uh, innocent people on the ground in Iraq, 545 00:32:02,440 --> 00:32:05,400 Speaker 1: including two Reuters correspondents. The whole thing was caught on video, 546 00:32:05,480 --> 00:32:09,200 Speaker 1: and Assange called it Collateral Murder. That's what he named 547 00:32:09,240 --> 00:32:12,600 Speaker 1: the video. And again, that seems very sensationalist. Now, at 548 00:32:12,640 --> 00:32:17,200 Speaker 1: the same time, the video itself really did portray a 549 00:32:17,320 --> 00:32:20,520 Speaker 1: horrific act. Yeah, and, I mean, 550 00:32:20,560 --> 00:32:23,680 Speaker 1: there's full audio in it, so there's this very chillingly 551 00:32:23,800 --> 00:32:27,160 Speaker 1: calm discussion of everything that they're doing, and, um, 552 00:32:27,280 --> 00:32:29,120 Speaker 1: yeah, you can see, 553 00:32:29,360 --> 00:32:32,440 Speaker 1: I think, at least eighteen people are killed during the 554 00:32:32,480 --> 00:32:36,520 Speaker 1: course of the video, and two of them were Reuters reporters, 555 00:32:36,720 --> 00:32:38,320 Speaker 1: right, right.
This had been a project that had been 556 00:32:38,400 --> 00:32:41,719 Speaker 1: going on since March of two thousand ten. They had 557 00:32:41,760 --> 00:32:44,960 Speaker 1: received these files and, uh, kind of went into a 558 00:32:45,040 --> 00:32:48,520 Speaker 1: frenzy of work in Iceland putting them together. It was 559 00:32:48,560 --> 00:32:53,200 Speaker 1: called Project B, and they knew that it was big. Yeah, 560 00:32:53,200 --> 00:32:55,320 Speaker 1: they knew it was big. And again, Assange was 561 00:32:55,400 --> 00:32:59,120 Speaker 1: trying to find a way to get more visibility for 562 00:32:59,200 --> 00:33:03,080 Speaker 1: WikiLeaks. So part of that sensationalism was in fact 563 00:33:03,240 --> 00:33:06,280 Speaker 1: intended, because it was meant to get as much 564 00:33:06,360 --> 00:33:09,000 Speaker 1: press as possible. But also, I mean, it was an 565 00:33:09,040 --> 00:33:13,120 Speaker 1: absolutely pressworthy release. Yes, I mean, 566 00:33:13,320 --> 00:33:15,480 Speaker 1: it was a legitimate, it 567 00:33:15,560 --> 00:33:18,160 Speaker 1: was a completely legitimate story. It was something that needed 568 00:33:18,200 --> 00:33:21,520 Speaker 1: to be broken, because, you know, people needed 569 00:33:21,520 --> 00:33:24,280 Speaker 1: to be held accountable. Although it should be said that 570 00:33:24,400 --> 00:33:28,080 Speaker 1: the military never did charge those helicopter pilots with anything 571 00:33:28,600 --> 00:33:32,680 Speaker 1: illegal. The official statement was essentially that the people 572 00:33:32,800 --> 00:33:36,240 Speaker 1: were in an area where there was a suspected, um, 573 00:33:36,600 --> 00:33:40,360 Speaker 1: uh, ambush that was going to attack US forces, and 574 00:33:40,560 --> 00:33:45,240 Speaker 1: that the helicopter pilots acted responsibly. That's the official response.
575 00:33:45,920 --> 00:33:49,880 Speaker 1: So anyway, it definitely was newsworthy. Uh, again, it was the 576 00:33:49,960 --> 00:33:52,120 Speaker 1: handling of it that I think was sensational, not the 577 00:33:52,200 --> 00:33:57,200 Speaker 1: material itself, which was, again, completely valid. Uh, then, in 578 00:33:59,520 --> 00:34:03,200 Speaker 1: two thousand ten, the Pentagon arrested US Army Private Bradley Manning 579 00:34:03,280 --> 00:34:07,040 Speaker 1: on charges of downloading and then leaking thousands of classified 580 00:34:07,200 --> 00:34:12,640 Speaker 1: US documents, including this video. And, uh, 581 00:34:12,760 --> 00:34:16,800 Speaker 1: the video and the handling of Bradley 582 00:34:16,840 --> 00:34:20,359 Speaker 1: Manning's case have been this sort 583 00:34:20,400 --> 00:34:24,640 Speaker 1: of defining element to what WikiLeaks is and the 584 00:34:24,719 --> 00:34:28,479 Speaker 1: way that it's portrayed in general. I mean, this story 585 00:34:28,560 --> 00:34:30,920 Speaker 1: is still ongoing to this day. Right, I believe that 586 00:34:31,000 --> 00:34:33,520 Speaker 1: the trial for Manning is going on right now as 587 00:34:33,560 --> 00:34:36,960 Speaker 1: we're recording this podcast. Yes, it is, although apparently 588 00:34:37,000 --> 00:34:40,160 Speaker 1: it's going much more quickly than what they had originally planned. 589 00:34:40,160 --> 00:34:41,879 Speaker 1: They thought it was going to 590 00:34:41,920 --> 00:34:44,600 Speaker 1: be a three-month trial, but apparently they're moving through 591 00:34:44,640 --> 00:34:49,640 Speaker 1: witnesses much more quickly than they had previously thought. And, uh, 592 00:34:49,719 --> 00:34:52,680 Speaker 1: there were thousands of documents involved in this, not 593 00:34:52,960 --> 00:34:56,080 Speaker 1: just the video, right, right. Uh.
WikiLeaks ended up 594 00:34:56,120 --> 00:35:00,640 Speaker 1: publishing over two hundred fifty thousand documents from Manning. He had downloaded 595 00:35:00,840 --> 00:35:03,279 Speaker 1: all of these diplomatic cables while he was at an 596 00:35:03,560 --> 00:35:06,600 Speaker 1: Army outpost in Iraq between November two thousand nine and April two thousand ten, 597 00:35:07,800 --> 00:35:10,279 Speaker 1: um, reportedly having burned them to a CD-R 598 00:35:10,360 --> 00:35:14,000 Speaker 1: labeled "Lady Gaga," and told a hacker friend that 599 00:35:14,080 --> 00:35:17,320 Speaker 1: he had them. The hacker turned him in. And apparently 600 00:35:17,400 --> 00:35:22,040 Speaker 1: that hacker has since felt a great deal of conflict 601 00:35:22,160 --> 00:35:26,480 Speaker 1: about that act. Uh, the main charge against 602 00:35:26,520 --> 00:35:30,279 Speaker 1: Bradley Manning is that he knowingly gave intelligence to the 603 00:35:30,440 --> 00:35:35,200 Speaker 1: enemy through indirect means, uh, because before he was 604 00:35:35,400 --> 00:35:38,160 Speaker 1: arrested and put on, you know, put on trial, he 605 00:35:38,360 --> 00:35:41,759 Speaker 1: had obviously leaked these documents to WikiLeaks, right. And 606 00:35:41,920 --> 00:35:44,800 Speaker 1: so in the trial, and I can say this because 607 00:35:45,000 --> 00:35:47,480 Speaker 1: this was reported very recently as of the recording of 608 00:35:47,520 --> 00:35:52,200 Speaker 1: this podcast, uh, the defense called the senior intelligence, or 609 00:35:52,320 --> 00:35:56,400 Speaker 1: one senior intelligence analyst, from Manning's unit. Her name is 610 00:35:56,520 --> 00:36:00,120 Speaker 1: Casey Fulton, and Fulton said that the unit received no 611 00:36:00,280 --> 00:36:03,400 Speaker 1: specific warning about sites visited by al Qaeda.
She did 612 00:36:03,480 --> 00:36:06,560 Speaker 1: say that al Qaeda would visit sites like Facebook or 613 00:36:06,640 --> 00:36:09,520 Speaker 1: Google or even Google Maps to gather information, but she 614 00:36:09,719 --> 00:36:12,759 Speaker 1: did not mention WikiLeaks among them. And the defense's 615 00:36:12,800 --> 00:36:16,480 Speaker 1: case is saying, uh, well, that al Qaeda was not 616 00:36:16,560 --> 00:36:20,040 Speaker 1: using WikiLeaks specifically to gather 617 00:36:20,160 --> 00:36:23,000 Speaker 1: information, that that was not 618 00:36:23,320 --> 00:36:26,200 Speaker 1: Manning's intent. And in fact, the judge in the case 619 00:36:26,280 --> 00:36:29,160 Speaker 1: has said that the prosecution has to show that Manning 620 00:36:29,200 --> 00:36:33,080 Speaker 1: had actual knowledge that he was actually giving intelligence to 621 00:36:33,120 --> 00:36:35,960 Speaker 1: the enemy through a third party, an intermediary, or in 622 00:36:36,160 --> 00:36:39,600 Speaker 1: some other indirect way, and that the soldier must have 623 00:36:39,680 --> 00:36:43,680 Speaker 1: had, quote, a general evil intent, unquote, and have 624 00:36:43,800 --> 00:36:46,319 Speaker 1: known he was dealing directly or indirectly with an enemy 625 00:36:46,360 --> 00:36:49,200 Speaker 1: of the United States in order for this particular charge 626 00:36:49,760 --> 00:36:52,520 Speaker 1: to hold. Now, keep in mind, this is one charge out 627 00:36:52,600 --> 00:36:57,080 Speaker 1: of the whole trial. But it sounds to me 628 00:36:57,600 --> 00:37:01,319 Speaker 1: like it's a really tough case to make, 629 00:37:02,360 --> 00:37:06,280 Speaker 1: you know, to prove the intent part specifically. Proving 630 00:37:06,320 --> 00:37:09,759 Speaker 1: intent is one of those things that's, uh, basically impossible.
631 00:37:10,280 --> 00:37:13,719 Speaker 1: That doesn't mean that people don't do it or 632 00:37:13,840 --> 00:37:17,000 Speaker 1: don't achieve it. Now, I should also point 633 00:37:17,040 --> 00:37:19,480 Speaker 1: out that this is a military trial. It's different from 634 00:37:19,520 --> 00:37:21,719 Speaker 1: other trials in the United States. Typically in the United 635 00:37:21,719 --> 00:37:24,040 Speaker 1: States you have a trial in front of a 636 00:37:24,120 --> 00:37:26,920 Speaker 1: jury of your peers. But in this case, it is 637 00:37:27,080 --> 00:37:29,680 Speaker 1: a judge that is overseeing the case, and her word 638 00:37:29,800 --> 00:37:33,800 Speaker 1: is final, at least unless you go through an appeals process. 639 00:37:33,880 --> 00:37:35,560 Speaker 1: And you do not have exactly the 640 00:37:35,600 --> 00:37:37,320 Speaker 1: same rights as a normal citizen in 641 00:37:37,360 --> 00:37:40,040 Speaker 1: a court of law. So, getting back to WikiLeaks: 642 00:37:40,160 --> 00:37:44,719 Speaker 1: back in, uh, July of two thousand and ten, 643 00:37:44,880 --> 00:37:47,560 Speaker 1: so Bradley Manning had been arrested by this point, 644 00:37:48,080 --> 00:37:51,400 Speaker 1: three news organizations released separate accounts of the war logs 645 00:37:51,680 --> 00:37:55,200 Speaker 1: gathered from WikiLeaks regarding the war in Afghanistan. And 646 00:37:55,280 --> 00:38:01,520 Speaker 1: in August, uh, that's when another big controversial event in 647 00:38:01,800 --> 00:38:04,400 Speaker 1: the history of WikiLeaks in general, and Julian Assange 648 00:38:04,440 --> 00:38:08,640 Speaker 1: in particular, happens. Uh.
Two former employees of WikiLeaks 649 00:38:08,760 --> 00:38:12,520 Speaker 1: filed rape charges against Julian Assange, and Assange has essentially 650 00:38:12,520 --> 00:38:16,200 Speaker 1: spent the rest of his life from that point evading 651 00:38:16,360 --> 00:38:20,200 Speaker 1: any extradition to Sweden to stand trial for this. All right, right, 652 00:38:20,239 --> 00:38:22,640 Speaker 1: these charges were brought up in Sweden. The 653 00:38:22,800 --> 00:38:24,839 Speaker 1: two women who brought up the charges, their names 654 00:38:24,880 --> 00:38:27,759 Speaker 1: have not been revealed to the press. And 655 00:38:27,840 --> 00:38:30,480 Speaker 1: in Sweden there are laws about using condoms. If 656 00:38:30,520 --> 00:38:32,320 Speaker 1: a partner tells you to use one and you do not, 657 00:38:33,120 --> 00:38:36,040 Speaker 1: then you can be brought up on what is called 658 00:38:36,080 --> 00:38:39,399 Speaker 1: a rape charge, and the technical definition I shall 659 00:38:39,480 --> 00:38:45,120 Speaker 1: leave up to others. Yeah, hey guys, it's Jonathan. 660 00:38:45,480 --> 00:38:47,439 Speaker 1: I gotta run out and take a quick WikiLeaks, 661 00:38:47,520 --> 00:38:58,360 Speaker 1: so we're gonna take a break. At any rate, he 662 00:38:58,640 --> 00:39:02,279 Speaker 1: has been fighting extradition since then, and at this 663 00:39:02,440 --> 00:39:06,600 Speaker 1: point he has actually been granted asylum 664 00:39:07,040 --> 00:39:10,480 Speaker 1: by Ecuador. I'll tell you more 665 00:39:10,520 --> 00:39:12,520 Speaker 1: about that when we get a little further down. But yeah, 666 00:39:13,320 --> 00:39:16,440 Speaker 1: that whole drama is a story in and of itself.
667 00:39:16,800 --> 00:39:19,200 Speaker 1: But because again it's so hard to get to the 668 00:39:19,239 --> 00:39:21,920 Speaker 1: truth of the matter. We actually talked about 669 00:39:22,000 --> 00:39:24,759 Speaker 1: doing an episode just about Julian Assange instead 670 00:39:24,920 --> 00:39:27,400 Speaker 1: of about WikiLeaks. But the more we looked into it, 671 00:39:27,480 --> 00:39:30,759 Speaker 1: the more we realized that there's no way to 672 00:39:31,040 --> 00:39:34,759 Speaker 1: verify half the information that's out there, right, you know, 673 00:39:35,200 --> 00:39:36,880 Speaker 1: as I said at the beginning of the episode, 674 00:39:37,280 --> 00:39:40,120 Speaker 1: all of the information about Julian Assange comes from either 675 00:39:40,239 --> 00:39:45,560 Speaker 1: Assange himself or from formerly close personal 676 00:39:45,760 --> 00:39:48,760 Speaker 1: compatriots who are pretty angry at him, right. So there 677 00:39:48,760 --> 00:39:51,400 Speaker 1: are a lot of biases, and neither of those sources 678 00:39:51,440 --> 00:39:58,200 Speaker 1: is really necessarily reliable. So in July you had the 679 00:39:58,840 --> 00:40:02,920 Speaker 1: Afghanistan reports released. Now that was one third 680 00:40:03,160 --> 00:40:06,279 Speaker 1: of three big blocks of information that were going to 681 00:40:06,360 --> 00:40:11,680 Speaker 1: be released that year. The other two were about the 682 00:40:11,840 --> 00:40:16,239 Speaker 1: Iraq War, which, that information came out in October, and 683 00:40:16,360 --> 00:40:21,279 Speaker 1: then there was a huge block of classified US diplomatic 684 00:40:21,560 --> 00:40:25,200 Speaker 1: cables, and a cable is essentially a message. So these 685 00:40:25,239 --> 00:40:28,200 Speaker 1: were all these thousands and thousands of diplomatic messages that 686 00:40:28,440 --> 00:40:32,359 Speaker 1: came out in November.
That was the trifecta of big, 687 00:40:32,600 --> 00:40:36,359 Speaker 1: big bombshell releases that came out in twenty ten that really 688 00:40:36,480 --> 00:40:38,880 Speaker 1: established what WikiLeaks was all about. I 689 00:40:38,920 --> 00:40:42,759 Speaker 1: think the ones in November were the ones specifically from Manning. Yeah. Yeah, 690 00:40:42,960 --> 00:40:46,640 Speaker 1: And then in September of twenty ten, a little bit earlier, that's 691 00:40:46,680 --> 00:40:50,839 Speaker 1: when Daniel Domscheit-Berg left WikiLeaks, right, that's when 692 00:40:50,880 --> 00:40:52,640 Speaker 1: he walked. He had joined in two thousand eight and 693 00:40:52,760 --> 00:40:55,839 Speaker 1: had become sort of Assange's right hand man. Yeah, 694 00:40:55,880 --> 00:40:58,839 Speaker 1: he was, specifically, a spokesperson for 695 00:40:58,880 --> 00:41:03,160 Speaker 1: WikiLeaks in Europe, mainly in Germany, and he 696 00:41:03,280 --> 00:41:06,279 Speaker 1: had a major falling out with Assange, to the 697 00:41:06,320 --> 00:41:09,440 Speaker 1: point where it went from he and Assange sharing the 698 00:41:09,520 --> 00:41:12,759 Speaker 1: same ideals to both of them demonizing one another whenever 699 00:41:12,800 --> 00:41:14,800 Speaker 1: they had the opportunity to speak to the media. And 700 00:41:14,840 --> 00:41:17,759 Speaker 1: they were roommates somewhere in between there. So this is 701 00:41:17,880 --> 00:41:19,920 Speaker 1: like talking about that band you used to love and 702 00:41:20,000 --> 00:41:21,880 Speaker 1: then they broke up, and now no one has anything 703 00:41:21,920 --> 00:41:24,320 Speaker 1: good to say about each other.
Multiplied by about a 704 00:41:24,440 --> 00:41:28,719 Speaker 1: billion. Right. Later on, he would publish a book 705 00:41:28,800 --> 00:41:31,520 Speaker 1: called Inside WikiLeaks: My Time with Julian Assange at the 706 00:41:31,560 --> 00:41:35,440 Speaker 1: World's Most Dangerous Website. Yeah, he was not particularly complimentary 707 00:41:35,520 --> 00:41:38,560 Speaker 1: of Assange in that book or in any of his interviews. 708 00:41:38,840 --> 00:41:42,320 Speaker 1: He left to try... his goal was to found 709 00:41:42,520 --> 00:41:46,239 Speaker 1: a competing leak site. He felt that WikiLeaks had 710 00:41:46,680 --> 00:41:48,920 Speaker 1: lost sight of its ideals, that it was not 711 00:41:48,960 --> 00:41:50,680 Speaker 1: behaving in a way that was 712 00:41:50,719 --> 00:41:53,759 Speaker 1: aligned with those ideals. Perhaps he felt that it was 713 00:41:53,840 --> 00:41:57,320 Speaker 1: more sensationalized, the kind of way that we've been talking about. 714 00:41:57,360 --> 00:42:00,680 Speaker 1: You know, that's always been my personal impression, 715 00:42:00,800 --> 00:42:04,839 Speaker 1: but again, that's a personal impression. So he wanted 716 00:42:04,880 --> 00:42:08,120 Speaker 1: to found a company called OpenLeaks that would be 717 00:42:08,360 --> 00:42:11,960 Speaker 1: essentially a competitor to WikiLeaks, and he wanted 718 00:42:12,000 --> 00:42:15,680 Speaker 1: it to be more transparent than WikiLeaks was.
So 719 00:42:15,960 --> 00:42:18,800 Speaker 1: he didn't want to have these kinds of deals with 720 00:42:18,920 --> 00:42:23,160 Speaker 1: various news organizations where it was almost exclusive, like a 721 00:42:23,239 --> 00:42:25,560 Speaker 1: partnership saying, hey, I'm going to give you this information 722 00:42:25,640 --> 00:42:27,359 Speaker 1: and then you can run it, and you know, we'll 723 00:42:27,440 --> 00:42:30,160 Speaker 1: have this buddy buddy relationship. Just link back to 724 00:42:30,280 --> 00:42:31,920 Speaker 1: my site. Make sure you do that. If you link 725 00:42:32,000 --> 00:42:34,719 Speaker 1: back to my site, we're all good. He didn't want 726 00:42:34,719 --> 00:42:38,400 Speaker 1: to do any of that. But eventually he ended up 727 00:42:38,520 --> 00:42:42,160 Speaker 1: changing his tune for OpenLeaks, because it just didn't 728 00:42:42,600 --> 00:42:46,200 Speaker 1: go well. Part of it was that when he left, 729 00:42:46,320 --> 00:42:49,080 Speaker 1: one of the things he did was he copied about 730 00:42:49,120 --> 00:42:52,560 Speaker 1: thirty five hundred files and then deleted them from 731 00:42:52,640 --> 00:42:58,279 Speaker 1: WikiLeaks's database, and he left with those files.
Depending upon 732 00:42:58,360 --> 00:43:02,000 Speaker 1: whom you ask, he was either trying to partially sabotage 733 00:43:02,040 --> 00:43:05,879 Speaker 1: WikiLeaks and establish OpenLeaks by getting a jump 734 00:43:06,000 --> 00:43:10,640 Speaker 1: start with these files, or, from his point of view, 735 00:43:11,040 --> 00:43:13,920 Speaker 1: he felt that what WikiLeaks was doing was irresponsible 736 00:43:14,000 --> 00:43:17,480 Speaker 1: and endangering the information that was within these thirty-five hundred files, 737 00:43:17,520 --> 00:43:20,080 Speaker 1: and he was only copying it so that that information 738 00:43:20,160 --> 00:43:23,240 Speaker 1: would remain safe until such time as he could return 739 00:43:23,400 --> 00:43:26,520 Speaker 1: those files to WikiLeaks. This is why this gets 740 00:43:26,560 --> 00:43:29,360 Speaker 1: really complicated, because people get, you know, a 741 00:43:29,400 --> 00:43:32,560 Speaker 1: little irritated at each other and they act out a bit. 742 00:43:32,719 --> 00:43:35,800 Speaker 1: Anyway, OpenLeaks never really took off. It 743 00:43:35,960 --> 00:43:39,640 Speaker 1: ended up sort of transforming into more of a site 744 00:43:40,239 --> 00:43:42,600 Speaker 1: that's designed to teach other people how to set up 745 00:43:42,640 --> 00:43:47,640 Speaker 1: sites that can accept and publish leaked material, so 746 00:43:48,120 --> 00:43:50,440 Speaker 1: it really did change quite a bit. So then we 747 00:43:50,520 --> 00:43:53,719 Speaker 1: had October and November, where those other big releases came out. 748 00:43:54,000 --> 00:43:56,359 Speaker 1: In two thousand eleven, that's when WikiLeaks was hit 749 00:43:56,520 --> 00:44:00,280 Speaker 1: by a pretty hard blow, and it didn't have anything 750 00:44:00,320 --> 00:44:04,000 Speaker 1: to do with a lawsuit. This was a blockade, a 751 00:44:04,120 --> 00:44:10,720 Speaker 1: financial blockade.
This was when several major financial companies, banks 752 00:44:10,840 --> 00:44:16,480 Speaker 1: and credit card companies, all decided to end any transfer 753 00:44:16,719 --> 00:44:23,080 Speaker 1: of funds towards WikiLeaks. WikiLeaks existed solely upon 754 00:44:23,239 --> 00:44:26,279 Speaker 1: monetary donations to the site. You know, people 755 00:44:26,320 --> 00:44:28,960 Speaker 1: were donating money in order for WikiLeaks to keep going. 756 00:44:29,640 --> 00:44:32,920 Speaker 1: This was essentially all those methods of transmission saying, we 757 00:44:33,040 --> 00:44:35,960 Speaker 1: are no longer allowing payments to go to WikiLeaks. So 758 00:44:36,040 --> 00:44:39,640 Speaker 1: even if you wanted to donate, there was no route 759 00:44:39,760 --> 00:44:43,360 Speaker 1: for you to go, right. And it actually got to 760 00:44:43,360 --> 00:44:46,280 Speaker 1: a point where Assange said that WikiLeaks had burned 761 00:44:46,320 --> 00:44:50,759 Speaker 1: through much of its assets, and that it wasn't 762 00:44:50,800 --> 00:44:53,439 Speaker 1: able to really regenerate them in any meaningful way, because 763 00:44:53,480 --> 00:44:57,480 Speaker 1: this blockade was preventing payments from going to WikiLeaks. Absolutely. 764 00:44:57,600 --> 00:44:59,319 Speaker 1: And this was also going on at the same time 765 00:44:59,440 --> 00:45:02,279 Speaker 1: that, or a little bit before this, 766 00:45:03,200 --> 00:45:07,920 Speaker 1: service providers had started withdrawing their services from WikiLeaks. Amazon 767 00:45:08,200 --> 00:45:13,080 Speaker 1: dumped them, EveryDNS had terminated their service, right. It 768 00:45:13,160 --> 00:45:14,960 Speaker 1: got to a point where, you know, you could still 769 00:45:15,040 --> 00:45:18,120 Speaker 1: get there.
They would have workarounds 770 00:45:18,160 --> 00:45:21,640 Speaker 1: so that people could still get to 771 00:45:21,760 --> 00:45:25,680 Speaker 1: WikiLeaks, but a lot of that 772 00:45:25,840 --> 00:45:29,360 Speaker 1: support was going away. And you know, there were theories 773 00:45:29,640 --> 00:45:31,800 Speaker 1: on all sides of this as well, saying there was 774 00:45:31,840 --> 00:45:35,280 Speaker 1: a conspiracy, that these companies had been pressured by various 775 00:45:35,360 --> 00:45:38,520 Speaker 1: governments around the world to end any support so that 776 00:45:38,600 --> 00:45:41,799 Speaker 1: way WikiLeaks would essentially kind of starve to death, 777 00:45:42,360 --> 00:45:44,200 Speaker 1: that no one would be able to get there, that 778 00:45:44,680 --> 00:45:46,840 Speaker 1: no one would be able to financially support it, 779 00:45:46,920 --> 00:45:49,520 Speaker 1: and that it would just go away, and then hey, 780 00:45:49,719 --> 00:45:52,200 Speaker 1: no more problems with all this leaked information, because no 781 00:45:52,239 --> 00:45:54,120 Speaker 1: one has any place for this leaked information to go. 782 00:45:55,560 --> 00:45:58,279 Speaker 1: Whether or not that's true, or if it 783 00:45:58,360 --> 00:46:00,279 Speaker 1: was just the companies saying, you know, this is something 784 00:46:00,360 --> 00:46:02,440 Speaker 1: that is going to cause us problems and we just 785 00:46:02,480 --> 00:46:04,000 Speaker 1: don't want to be a part of it, and they 786 00:46:04,040 --> 00:46:07,719 Speaker 1: were independently coming to that decision, I don't know.
I 787 00:46:07,800 --> 00:46:09,839 Speaker 1: guess if there was a leak about it, we'd find 788 00:46:09,880 --> 00:46:12,399 Speaker 1: out. This was also around the same time, 789 00:46:12,480 --> 00:46:15,600 Speaker 1: in September of two 790 00:46:15,640 --> 00:46:20,560 Speaker 1: thousand eleven, that they had published a huge batch of cables that did 791 00:46:20,680 --> 00:46:24,840 Speaker 1: not contain redactions, which means that the sources 792 00:46:25,000 --> 00:46:29,080 Speaker 1: of this information, and in some places victims' names, etcetera, 793 00:46:29,440 --> 00:46:32,440 Speaker 1: had not been blacked out. Yeah. In other words, there 794 00:46:32,480 --> 00:46:36,640 Speaker 1: were names of individuals within these cables that someone who 795 00:46:36,920 --> 00:46:40,719 Speaker 1: is reading over this could then target, either politically or 796 00:46:41,719 --> 00:46:45,000 Speaker 1: literally target these folks that are mentioned in these cables, 797 00:46:45,400 --> 00:46:47,520 Speaker 1: and so there was a lot of criticism leveled 798 00:46:47,520 --> 00:46:50,160 Speaker 1: against WikiLeaks and against Assange, saying that it was 799 00:46:50,239 --> 00:46:54,759 Speaker 1: being irresponsible and endangering people's lives. Assange has not been 800 00:46:54,960 --> 00:46:59,319 Speaker 1: the most compassionate person in regards to this. He's 801 00:46:59,400 --> 00:47:03,520 Speaker 1: often said that his goal is to save innocent lives, 802 00:47:03,560 --> 00:47:05,560 Speaker 1: but if it endangers a few people because of the 803 00:47:05,719 --> 00:47:09,239 Speaker 1: information that's revealed in these cables, then that's, 804 00:47:09,440 --> 00:47:13,200 Speaker 1: you know, that's acceptable. I'm paraphrasing.
That's not exactly what 805 00:47:13,320 --> 00:47:15,800 Speaker 1: he said, but it's more or less the message 806 00:47:15,840 --> 00:47:19,440 Speaker 1: that has been given. Collateral murder. Yeah, that's 807 00:47:19,440 --> 00:47:21,640 Speaker 1: the problem, is that Lauren seems to feel that 808 00:47:21,760 --> 00:47:25,799 Speaker 1: this might be a touch hypocritical, considering the 809 00:47:26,000 --> 00:47:30,160 Speaker 1: criticisms that WikiLeaks has levied against, you know, governments 810 00:47:30,200 --> 00:47:33,600 Speaker 1: and corporations, that WikiLeaks itself seems to be engaging in 811 00:47:33,680 --> 00:47:36,880 Speaker 1: the same sort of cavalier behavior towards people's safety. And 812 00:47:37,200 --> 00:47:39,520 Speaker 1: to be fair, it is absolutely not a parallel to 813 00:47:39,800 --> 00:47:44,640 Speaker 1: compare putting a source in hypothetical danger 814 00:47:44,680 --> 00:47:48,719 Speaker 1: of persecution versus an Apache pilot killing a child in 815 00:47:48,800 --> 00:47:51,839 Speaker 1: a car. That's different. Yeah, those are obviously very 816 00:47:51,920 --> 00:47:55,680 Speaker 1: different things. But it does seem to suggest that there's 817 00:47:55,680 --> 00:47:58,520 Speaker 1: a little bit of hypocrisy going on. I don't disagree 818 00:47:58,560 --> 00:48:01,080 Speaker 1: with Lauren, is what I'm saying. But 819 00:48:01,200 --> 00:48:03,440 Speaker 1: that is my personal opinion, 820 00:48:03,680 --> 00:48:06,520 Speaker 1: and I apologize a little bit for injecting it. 821 00:48:06,600 --> 00:48:09,480 Speaker 1: This move did create a great rift between 822 00:48:09,560 --> 00:48:12,800 Speaker 1: WikiLeaks and
several of those newspapers or 823 00:48:12,920 --> 00:48:16,279 Speaker 1: reporting organizations that we had been talking about being 824 00:48:16,360 --> 00:48:19,600 Speaker 1: in cahoots with it earlier: The Guardian, the New York Times, 825 00:48:20,320 --> 00:48:22,560 Speaker 1: a bunch of papers around the world. Der Spiegel was 826 00:48:22,560 --> 00:48:26,239 Speaker 1: another one. Yeah. And the way 827 00:48:26,320 --> 00:48:30,120 Speaker 1: that Assange was handling the relationship between WikiLeaks and 828 00:48:30,320 --> 00:48:34,120 Speaker 1: these news organizations was starting to cheese them off. Like, 829 00:48:34,560 --> 00:48:38,680 Speaker 1: the Guardian had certain expectations, the New York Times had 830 00:48:38,680 --> 00:48:42,040 Speaker 1: certain expectations, and the Guardian had felt that the 831 00:48:42,080 --> 00:48:44,280 Speaker 1: New York Times was going to be able to report 832 00:48:44,360 --> 00:48:47,240 Speaker 1: on certain things, which the Guardian actually wanted to have happen, 833 00:48:47,280 --> 00:48:50,480 Speaker 1: because again, the Guardian is in the UK, and the secrecy 834 00:48:50,560 --> 00:48:53,680 Speaker 1: laws there are such that there were these partnerships 835 00:48:53,719 --> 00:48:56,080 Speaker 1: between the Guardian and the New York Times where the New York 836 00:48:56,120 --> 00:48:58,920 Speaker 1: Times could publish some stuff that would possibly get the 837 00:48:58,960 --> 00:49:01,880 Speaker 1: Guardian into trouble but would still benefit the Guardian in 838 00:49:02,080 --> 00:49:05,120 Speaker 1: some way. So it was this kind of weird 839 00:49:05,239 --> 00:49:08,279 Speaker 1: relationship that was going on. But then Assange 840 00:49:08,400 --> 00:49:10,719 Speaker 1: essentially got ticked off at the New York Times for 841 00:49:10,800 --> 00:49:13,440 Speaker 1: the way that the New York Times handled its information.
842 00:49:13,520 --> 00:49:15,239 Speaker 1: Because one thing that the New York Times would do 843 00:49:15,760 --> 00:49:18,759 Speaker 1: is approach the government and say, hey, we received these 844 00:49:18,840 --> 00:49:21,000 Speaker 1: cables and we plan on running with the story, but 845 00:49:21,080 --> 00:49:23,279 Speaker 1: we're letting you know ahead of time, whereas in the 846 00:49:23,400 --> 00:49:26,160 Speaker 1: UK that's generally not done. Generally in the UK they 847 00:49:26,320 --> 00:49:29,480 Speaker 1: run the story. So that put pressure on the Guardian 848 00:49:29,560 --> 00:49:31,279 Speaker 1: and also put pressure on Assange. And 849 00:49:31,320 --> 00:49:35,080 Speaker 1: Assange was apparently very much upset about this and wanted 850 00:49:35,120 --> 00:49:37,000 Speaker 1: to sever the relationship with The New York Times, but 851 00:49:37,040 --> 00:49:39,319 Speaker 1: the Guardian still wanted this relationship with the New York Times. 852 00:49:40,040 --> 00:49:43,040 Speaker 1: If this is starting to sound like the relationships in 853 00:49:43,400 --> 00:49:46,560 Speaker 1: middle school drama club, that's kind of what it comes 854 00:49:46,600 --> 00:49:51,239 Speaker 1: out to being, except the stakes are obviously way higher. Absolutely. 855 00:49:51,440 --> 00:49:55,680 Speaker 1: And yeah, five global newspapers 856 00:49:55,719 --> 00:49:57,840 Speaker 1: wound up putting out a joint statement that said, 857 00:49:57,960 --> 00:50:00,160 Speaker 1: we deplore the decision of WikiLeaks to publish 858 00:50:00,200 --> 00:50:03,480 Speaker 1: the unredacted State Department cables, which may put sources at risk.
859 00:50:03,719 --> 00:50:06,160 Speaker 1: The decision to publish by Julian Assange was his and 860 00:50:06,280 --> 00:50:09,120 Speaker 1: his alone. Right, yeah, because they all had the policy 861 00:50:09,239 --> 00:50:14,279 Speaker 1: of actually going through and very carefully redacting any 862 00:50:14,760 --> 00:50:18,719 Speaker 1: identifiable information to protect sources and to protect people who 863 00:50:18,760 --> 00:50:22,719 Speaker 1: could potentially become a victim of some agency. 864 00:50:22,840 --> 00:50:25,880 Speaker 1: And you know, that was an important part of their process, 865 00:50:25,920 --> 00:50:28,239 Speaker 1: to the point where you had entire departments in 866 00:50:28,400 --> 00:50:32,680 Speaker 1: charge of reviewing every single cable to make certain that 867 00:50:33,040 --> 00:50:35,400 Speaker 1: all the stuff they published was going to be safe. 868 00:50:36,000 --> 00:50:38,520 Speaker 1: And so for this move to happen on WikiLeaks's 869 00:50:38,600 --> 00:50:41,480 Speaker 1: end meant that a lot of that work was just nullified. 870 00:50:42,000 --> 00:50:46,120 Speaker 1: And obviously that is a good reason to become upset. Well, 871 00:50:47,480 --> 00:50:50,360 Speaker 1: once you get into about May two thousand twelve: so 872 00:50:50,520 --> 00:50:55,080 Speaker 1: Assange had been fighting extradition attempts, and he had been 873 00:50:55,120 --> 00:50:58,680 Speaker 1: living in London while Swedish authorities were trying to extradite 874 00:50:58,680 --> 00:51:00,640 Speaker 1: him to Sweden. He had spent about a week 875 00:51:00,640 --> 00:51:05,359 Speaker 1: in jail in December on those extradition charges. Yeah, 876 00:51:05,560 --> 00:51:07,800 Speaker 1: and after being released on bail, he had been in 877 00:51:07,960 --> 00:51:12,960 Speaker 1: and out of court trying to appeal extradition. And by
And by 878 00:51:13,080 --> 00:51:16,319 Speaker 1: made two thousand twelve, the British Supreme Court, it had 879 00:51:16,360 --> 00:51:18,359 Speaker 1: gone all the way up to the British Supreme Court 880 00:51:18,440 --> 00:51:21,800 Speaker 1: said no, we're not going to prevent your extradition. You 881 00:51:21,880 --> 00:51:23,960 Speaker 1: are going to have to go to Sweden to stand trial. 882 00:51:24,520 --> 00:51:28,160 Speaker 1: That's when a sane then started to appeal to Ecuador 883 00:51:28,320 --> 00:51:31,120 Speaker 1: the embassy in the UK and actually saying will you 884 00:51:31,200 --> 00:51:35,000 Speaker 1: grant me asylum? And it was at a point where 885 00:51:35,080 --> 00:51:37,440 Speaker 1: he was in the he was staying in the embassy. 886 00:51:37,840 --> 00:51:41,520 Speaker 1: He had not officially been granted asylum by Ecuador, and 887 00:51:41,600 --> 00:51:44,480 Speaker 1: then there was going to be apparently a raid on 888 00:51:44,640 --> 00:51:47,920 Speaker 1: the Ecuador embassy in order to get a Sange out, 889 00:51:47,960 --> 00:51:50,640 Speaker 1: and that's when Ecuador said, we're giving you asylum. So 890 00:51:50,920 --> 00:51:54,200 Speaker 1: it's almost when it's possible that that that raid was 891 00:51:54,360 --> 00:51:57,719 Speaker 1: the precipitated Yeah, that was exactly the moment where the 892 00:51:57,880 --> 00:51:59,920 Speaker 1: Ecuador said, you know, we weren't going to but now 893 00:52:00,080 --> 00:52:03,839 Speaker 1: we totally are because now it's political. So it's still 894 00:52:04,120 --> 00:52:08,760 Speaker 1: very complicated issue, um and you know, guilt or innocence aside. 895 00:52:09,200 --> 00:52:13,960 Speaker 1: It's it's one of those stories that is really complicated 896 00:52:14,000 --> 00:52:17,360 Speaker 1: and tough to to kind of unwind and follow. So anyway, 897 00:52:17,400 --> 00:52:20,960 Speaker 1: Wiki leaks is still still a thing still around. 
It's 898 00:52:20,960 --> 00:52:25,920 Speaker 1: still, you know, accepting leaked information. It has uncovered 899 00:52:26,040 --> 00:52:28,719 Speaker 1: lots of stories, not just with governments, but like we said, 900 00:52:28,760 --> 00:52:33,200 Speaker 1: with corporations, some of which have caused newspapers to get 901 00:52:33,239 --> 00:52:35,759 Speaker 1: into trouble for running the stories. Things like, you know, 902 00:52:35,880 --> 00:52:40,120 Speaker 1: major corporations that may have fallen short on promises for 903 00:52:40,200 --> 00:52:42,560 Speaker 1: doing things like cleaning up a gasoline spill. There was 904 00:52:42,600 --> 00:52:46,920 Speaker 1: a specific example of that, and it continues to 905 00:52:47,120 --> 00:52:51,320 Speaker 1: fulfill that role. Some argue that it's even possible that 906 00:52:51,440 --> 00:52:56,080 Speaker 1: the whole Assange story could just be smoke and 907 00:52:56,160 --> 00:52:58,560 Speaker 1: mirrors for WikiLeaks to continue doing what it does 908 00:52:58,680 --> 00:53:01,480 Speaker 1: without having to worry about so much focus, because there's this 909 00:53:01,680 --> 00:53:04,640 Speaker 1: fall guy. Right. Yeah. You've got this very flash-bang kind 910 00:53:04,640 --> 00:53:06,800 Speaker 1: of person over here who's going, like, look at me, 911 00:53:07,040 --> 00:53:10,960 Speaker 1: extradition charges. Yeah, and then they can continue doing 912 00:53:11,360 --> 00:53:15,880 Speaker 1: their political work. Right. So, I doubt that 913 00:53:16,000 --> 00:53:20,560 Speaker 1: anything is that planned out, because life is just complicated 914 00:53:20,600 --> 00:53:23,480 Speaker 1: and messy, and it's tough to ever have a plan that's, 915 00:53:23,920 --> 00:53:27,960 Speaker 1: like, James Bond-ish. I'd be very impressed. But you 916 00:53:28,040 --> 00:53:32,800 Speaker 1: know, it's possible.
You know, if they did do that, 917 00:53:32,960 --> 00:53:36,040 Speaker 1: then it's the stuff of movie legend. Speaking of 918 00:53:36,120 --> 00:53:39,000 Speaker 1: movie legends, there is a film being made. Well, 919 00:53:39,160 --> 00:53:42,560 Speaker 1: there are two films that have been in 920 00:53:42,760 --> 00:53:47,440 Speaker 1: production recently. One is a documentary that was just released, 921 00:53:48,000 --> 00:53:52,719 Speaker 1: I believe: Alex Gibney's We Steal Secrets, right, and that 922 00:53:52,840 --> 00:53:56,280 Speaker 1: one focused both on Assange and on Manning. Correct. 923 00:53:56,400 --> 00:53:59,680 Speaker 1: And it was supposed to be 924 00:53:59,760 --> 00:54:02,840 Speaker 1: a warts-and-all kind of portrayal. And some 925 00:54:02,960 --> 00:54:06,640 Speaker 1: people say that his portrayal of Assange does give sort 926 00:54:06,640 --> 00:54:08,600 Speaker 1: of a warts-and-all approach. In fact, I think, 927 00:54:08,760 --> 00:54:12,640 Speaker 1: you know, Assange definitely did not make that relationship a 928 00:54:12,760 --> 00:54:17,960 Speaker 1: sweet one, because apparently he demanded outright that if Gibney 929 00:54:18,000 --> 00:54:19,759 Speaker 1: were to talk to Assange, it would cost him a 930 00:54:19,760 --> 00:54:23,719 Speaker 1: million bucks. And with a documentary filmmaker, that's probably not the 931 00:54:23,800 --> 00:54:27,560 Speaker 1: best way to win that person over to your side. No. Yeah, 932 00:54:27,840 --> 00:54:32,279 Speaker 1: Assange has come out very vocally against this documentary, which 933 00:54:32,320 --> 00:54:35,920 Speaker 1: is funny, because other people, on both sides, 934 00:54:36,080 --> 00:54:39,000 Speaker 1: have also come down upon this documentary, and 935 00:54:39,160 --> 00:54:42,200 Speaker 1: some people are proponents of it.
One former WikiLeaks 936 00:54:42,239 --> 00:54:46,960 Speaker 1: employee in particular, James Ball, reported it 937 00:54:47,080 --> 00:54:50,040 Speaker 1: being a very accurate portrayal, like to the point that 938 00:54:50,120 --> 00:54:53,000 Speaker 1: it was déjà vu seeing the film. Now, there are 939 00:54:53,040 --> 00:54:59,200 Speaker 1: some who say that Manning's portrayal was overly sympathetic. 940 00:54:59,400 --> 00:55:03,880 Speaker 1: You know, it's hard to say that 941 00:55:04,080 --> 00:55:07,759 Speaker 1: it's unjustified, because you're talking about one person who may 942 00:55:07,960 --> 00:55:10,160 Speaker 1: very well have been acting in what he thought was 943 00:55:10,920 --> 00:55:13,319 Speaker 1: the right way to expose what he saw as 944 00:55:13,400 --> 00:55:18,200 Speaker 1: unethical behavior. And there was no official 945 00:55:18,280 --> 00:55:21,560 Speaker 1: way of doing it with any hope of it being addressed. 946 00:55:21,920 --> 00:55:23,880 Speaker 1: And so he went outside the system in order to 947 00:55:23,960 --> 00:55:26,680 Speaker 1: try and have this done. That's kind of the 948 00:55:26,760 --> 00:55:29,080 Speaker 1: story that's being told. There are other people who say 949 00:55:29,120 --> 00:55:31,279 Speaker 1: it's more complicated than that, and that, you know, it 950 00:55:31,400 --> 00:55:35,160 Speaker 1: wasn't truly altruistic motivations that had him do what he did. 951 00:55:35,719 --> 00:55:39,080 Speaker 1: But again, that's a really complicated story. When you're doing 952 00:55:39,080 --> 00:55:41,759 Speaker 1: a documentary film, you have to simplify things so that 953 00:55:41,840 --> 00:55:46,400 Speaker 1: it can be, you know, only two hours long.
And 954 00:55:46,719 --> 00:55:48,520 Speaker 1: that was one of the criticisms I saw, that 955 00:55:48,760 --> 00:55:52,200 Speaker 1: it made Manning out to be more sympathetic than 956 00:55:52,560 --> 00:55:56,600 Speaker 1: the person felt that he should be. I 957 00:55:56,680 --> 00:55:59,919 Speaker 1: think it's really hard to say that you can't feel 958 00:56:00,000 --> 00:56:02,560 Speaker 1: sympathy for someone who's being held for a military trial 959 00:56:02,680 --> 00:56:07,400 Speaker 1: that doesn't have the same protections as a criminal 960 00:56:07,560 --> 00:56:10,080 Speaker 1: or civil case would in a normal court of law. 961 00:56:10,719 --> 00:56:16,080 Speaker 1: But you know, again, I don't know. We don't 962 00:56:16,080 --> 00:56:18,400 Speaker 1: know the guy, don't know all the details. So, 963 00:56:19,040 --> 00:56:21,520 Speaker 1: you know, that will play out and the judge will 964 00:56:21,719 --> 00:56:24,680 Speaker 1: come to a decision. So the other one is, 965 00:56:24,960 --> 00:56:27,480 Speaker 1: it's a fictional film, or, I mean, fictional... it's 966 00:56:27,520 --> 00:56:35,440 Speaker 1: a biopic, maybe. Barnaby Cumberbund is in it. A Cumberbatch, 967 00:56:35,840 --> 00:56:40,880 Speaker 1: right, right. I'm sorry, John Harrison. Is that John Harrison? 968 00:56:41,000 --> 00:56:43,000 Speaker 1: This film is called The Fifth Estate. It's 969 00:56:43,040 --> 00:56:46,680 Speaker 1: being directed by Bill Condon, and it is slated 970 00:56:46,719 --> 00:56:52,000 Speaker 1: to be released on October eleventh. Cumberbatch 971 00:56:52,040 --> 00:56:54,920 Speaker 1: has said that Assange directly asked him to not 972 00:56:55,080 --> 00:56:59,640 Speaker 1: do the film, calling it a massive propaganda attack. Yeah, 973 00:57:00,680 --> 00:57:03,120 Speaker 1: and maybe it is. I haven't seen it. I don't know.
974 00:57:03,600 --> 00:57:06,400 Speaker 1: We'll have to wait and see. But if you want to 975 00:57:06,440 --> 00:57:11,640 Speaker 1: see some really cool pictures of Benedict Cumberbatch looking intense 976 00:57:11,760 --> 00:57:15,720 Speaker 1: with bleached hair, those are on the internet. 977 00:57:16,240 --> 00:57:19,240 Speaker 1: And that wraps up this episode, this classic episode of 978 00:57:19,280 --> 00:57:21,360 Speaker 1: TechStuff, the WikiLeaks story. Like I said, we 979 00:57:21,520 --> 00:57:24,440 Speaker 1: probably need to go back and revisit this one. A 980 00:57:24,520 --> 00:57:26,920 Speaker 1: lot has happened with WikiLeaks, a lot of ups 981 00:57:26,960 --> 00:57:29,920 Speaker 1: and downs with that story. So I'll probably do a 982 00:57:29,960 --> 00:57:32,600 Speaker 1: follow up, maybe a full redo. We'll see. But if 983 00:57:32,600 --> 00:57:34,920 Speaker 1: you guys have suggestions for topics I should cover on 984 00:57:35,080 --> 00:57:37,400 Speaker 1: TechStuff, reach out to me on Facebook or Twitter. 985 00:57:37,800 --> 00:57:40,600 Speaker 1: The handle at both is TechStuffHSW, and 986 00:57:40,680 --> 00:57:48,680 Speaker 1: I'll talk to you again really soon. TechStuff is 987 00:57:48,720 --> 00:57:51,840 Speaker 1: an iHeartRadio production. For more podcasts from iHeartRadio, 988 00:57:51,960 --> 00:57:55,520 Speaker 1: visit the iHeartRadio app, Apple Podcasts, 989 00:57:55,680 --> 00:58:00,080 Speaker 1: or wherever you listen to your favorite shows.