1 00:00:00,120 --> 00:00:03,600 Speaker 1: Welcome back to the show, fellow conspiracy realists. This is 2 00:00:03,640 --> 00:00:06,560 Speaker 1: a classic episode while we were on the high seas. 3 00:00:06,960 --> 00:00:10,240 Speaker 1: We have a lot of folks that we call returning 4 00:00:10,360 --> 00:00:13,040 Speaker 1: guests on Stuff They Don't Want You to Know, because 5 00:00:13,080 --> 00:00:15,680 Speaker 1: they come in and out of all kinds of stories, 6 00:00:16,160 --> 00:00:19,720 Speaker 1: and for several years over the course of this show, 7 00:00:19,960 --> 00:00:23,439 Speaker 1: one guy who absolutely captivated our attention was the creator 8 00:00:23,480 --> 00:00:25,960 Speaker 1: of WikiLeaks, Julian Assange. 9 00:00:26,760 --> 00:00:29,840 Speaker 2: He was amongst one of the first dozen episodes that 10 00:00:29,880 --> 00:00:32,479 Speaker 2: we ever made of this iteration of Stuff They Don't 11 00:00:32,520 --> 00:00:35,559 Speaker 2: Want You to Know, and he made several appearances on 12 00:00:35,640 --> 00:00:38,680 Speaker 2: our video series on YouTube. So do check out YouTube 13 00:00:38,680 --> 00:00:42,680 Speaker 2: dot com slash Conspiracy Stuff, or at Conspiracy Stuff if 14 00:00:42,680 --> 00:00:43,000 Speaker 2: you will. 15 00:00:43,120 --> 00:00:45,040 Speaker 3: Well, it's so funny that this just came up, because 16 00:00:45,080 --> 00:00:48,400 Speaker 3: I just got served a recommendation on one streaming app 17 00:00:48,440 --> 00:00:51,840 Speaker 3: or another for the Alex Gibney documentary We Steal Secrets: 18 00:00:52,120 --> 00:00:54,440 Speaker 3: The Story of WikiLeaks, which I had never gotten around 19 00:00:54,440 --> 00:00:57,600 Speaker 3: to watching, and it's on my list, and it reminded 20 00:00:57,640 --> 00:00:59,440 Speaker 3: me of this. And now, isn't there some kind of... 21 00:01:00,240 --> 00:01:02,000 Speaker 3: like, he's in the wind a little bit right now?
22 00:01:02,040 --> 00:01:03,680 Speaker 3: What's going on with old boy? 23 00:01:04,160 --> 00:01:08,120 Speaker 1: Well, we know that he made it back to Australia 24 00:01:09,080 --> 00:01:12,960 Speaker 1: after we did this episode in twenty twenty, and from 25 00:01:13,200 --> 00:01:17,480 Speaker 1: statements by his brother Gabriel Shipton at the time, the 26 00:01:17,520 --> 00:01:20,320 Speaker 1: word on the street was he was still adjusting, spending 27 00:01:20,360 --> 00:01:23,560 Speaker 1: time with his family, because he went through some horrific 28 00:01:23,600 --> 00:01:26,080 Speaker 1: stuff right when he was locked up in the embassy. 29 00:01:26,840 --> 00:01:32,200 Speaker 1: We know that he attended some pro-Palestinian protests in 30 00:01:32,280 --> 00:01:35,680 Speaker 1: the past year. He also went to some other notable events, 31 00:01:35,720 --> 00:01:39,880 Speaker 1: but he's been kind of keeping his head down. Yeah. Hey, do 32 00:01:39,880 --> 00:01:42,960 Speaker 2: you think Julian Assange would ever talk to us? Would 33 00:01:42,959 --> 00:01:43,720 Speaker 2: we want to talk to him? 34 00:01:43,760 --> 00:01:46,320 Speaker 4: I think I would want to talk to him, absolutely. 35 00:01:46,360 --> 00:01:51,680 Speaker 1: Yeah, I can text him. Okay, Ben. Well, well, it's 36 00:01:51,680 --> 00:01:54,240 Speaker 1: someone who says they're Julian Assange. I think I just 37 00:01:54,280 --> 00:01:58,360 Speaker 1: pulled that up on my phone. But well, yeah, let's 38 00:01:58,400 --> 00:02:00,920 Speaker 1: reach out, because I think it would be a fascinating 39 00:02:01,080 --> 00:02:04,960 Speaker 1: perspective and it's worth exploring, especially because we know the 40 00:02:05,040 --> 00:02:08,880 Speaker 1: reporting about this guy was so heavily tilted and influenced 41 00:02:08,880 --> 00:02:12,959 Speaker 1: by powerful forces when WikiLeaks was having its heyday in 42 00:02:13,000 --> 00:02:13,440 Speaker 1: the news.
43 00:02:13,600 --> 00:02:16,239 Speaker 3: He's practically referred to as like a terrorist, like a 44 00:02:16,840 --> 00:02:18,280 Speaker 3: cyber terrorist. 45 00:02:18,360 --> 00:02:21,080 Speaker 1: Right, and for some he is a soldier in the 46 00:02:21,120 --> 00:02:24,400 Speaker 1: war for transparency. To others, he is portrayed as a 47 00:02:24,520 --> 00:02:28,840 Speaker 1: super villain. In this episode from twenty twenty, we ask: 48 00:02:28,960 --> 00:02:31,119 Speaker 1: whatever happened to Julian Assange? 49 00:02:31,440 --> 00:02:33,320 Speaker 4: But first let's hear a quick word from our sponsor. 50 00:02:34,560 --> 00:02:39,040 Speaker 5: From UFOs to psychic powers and government conspiracies, history is 51 00:02:39,120 --> 00:02:43,440 Speaker 5: riddled with unexplained events. You can turn back now or 52 00:02:43,520 --> 00:02:46,520 Speaker 5: learn the stuff they don't want you to know. A 53 00:02:46,560 --> 00:02:49,120 Speaker 5: production of iHeartRadio. 54 00:02:58,600 --> 00:03:00,960 Speaker 2: Hello, welcome back to the show. My name is Matt, 55 00:03:01,120 --> 00:03:01,840 Speaker 2: my name is Noel. 56 00:03:02,000 --> 00:03:04,440 Speaker 1: They call me Ben. We are joined as always with 57 00:03:04,520 --> 00:03:08,919 Speaker 1: our super producer, Paul "Mission Control" Decant. Most importantly, you 58 00:03:09,000 --> 00:03:12,360 Speaker 1: are you. You are here, and that makes this Stuff 59 00:03:12,400 --> 00:03:16,560 Speaker 1: They Don't Want You to Know. Today, we're focusing on 60 00:03:16,639 --> 00:03:20,160 Speaker 1: the story of one man, a single individual, a larger 61 00:03:20,160 --> 00:03:25,120 Speaker 1: than life figure who once upon a time dominated global news. 62 00:03:25,480 --> 00:03:29,639 Speaker 1: To some, he is an inspiring icon of free speech 63 00:03:29,760 --> 00:03:32,760 Speaker 1: and transparency, the kind of figure you'd build a statue for.
64 00:03:33,680 --> 00:03:37,560 Speaker 1: To others, he's a power-mad super villain, an agent 65 00:03:37,640 --> 00:03:43,200 Speaker 1: of chaos, and to still others, many people, he's just dangerous. 66 00:03:43,720 --> 00:03:45,920 Speaker 1: His name is Julian Assange. 67 00:03:46,280 --> 00:03:48,520 Speaker 2: Our last update that we did on this show on 68 00:03:48,600 --> 00:03:52,800 Speaker 2: Julian Assange was almost exactly a year ago today. It 69 00:03:52,880 --> 00:03:56,680 Speaker 2: was May tenth, twenty nineteen, and today's May sixth, twenty twenty, 70 00:03:56,720 --> 00:03:59,760 Speaker 2: as we're recording this. So, you know, we're not gonna 71 00:03:59,760 --> 00:04:03,200 Speaker 2: spend too much time on the background, because previously we 72 00:04:03,280 --> 00:04:08,160 Speaker 2: have covered basically Julian Assange's earlier situations. If you really 73 00:04:08,160 --> 00:04:10,080 Speaker 2: want to learn that stuff, you can go back to 74 00:04:10,160 --> 00:04:13,280 Speaker 2: those episodes. But for today, we're gonna look at 75 00:04:13,280 --> 00:04:15,680 Speaker 2: where he is now, briefly, how he got there, and 76 00:04:15,760 --> 00:04:19,240 Speaker 2: the very, very real, big-picture issues that his current 77 00:04:19,279 --> 00:04:20,760 Speaker 2: situation is revealing. 78 00:04:21,240 --> 00:04:23,440 Speaker 3: Do you guys think what makes people think super villain 79 00:04:23,520 --> 00:04:26,840 Speaker 3: is his, like, stark white hair? You know, it seems 80 00:04:26,880 --> 00:04:30,279 Speaker 3: like a really good feature for, like, 81 00:04:30,360 --> 00:04:32,120 Speaker 3: a Superman villain of some kind. 82 00:04:32,440 --> 00:04:35,560 Speaker 2: Well, Ben has his background right now in the 83 00:04:35,600 --> 00:04:38,440 Speaker 2: video chat we're doing. He's got a picture of Julian 84 00:04:38,520 --> 00:04:43,480 Speaker 2: back there, and to me, it doesn't feel very super villainous.
85 00:04:43,839 --> 00:04:47,280 Speaker 1: I mean, that's the issue. You know, we're 86 00:04:48,279 --> 00:04:52,359 Speaker 1: recording shows while we're several weeks into quarantine. But this guy, 87 00:04:52,440 --> 00:04:56,159 Speaker 1: this picture that I have in the background, he's 88 00:04:56,160 --> 00:04:58,800 Speaker 1: a quarantine master, you know, he's been on lockdown for 89 00:04:59,040 --> 00:05:02,400 Speaker 1: nearly seven years, so when they pulled him out, he's 90 00:05:02,440 --> 00:05:05,080 Speaker 1: grown a beard that is as white 91 00:05:05,240 --> 00:05:08,920 Speaker 1: as his naturally white hair. And I think I could 92 00:05:08,960 --> 00:05:11,960 Speaker 1: agree with you, Noel, especially for people who aren't 93 00:05:12,360 --> 00:05:15,960 Speaker 1: reading the specifics of what he does. If you just 94 00:05:15,960 --> 00:05:17,640 Speaker 1: see a headline and the picture of the guy and 95 00:05:17,680 --> 00:05:21,480 Speaker 1: you like Bond films, you might be tempted to say, yeah, 96 00:05:21,600 --> 00:05:23,440 Speaker 1: that guy looks like a super villain. A lot of 97 00:05:23,440 --> 00:05:25,719 Speaker 1: it will go into dress too, like if he wore 98 00:05:25,760 --> 00:05:28,159 Speaker 1: a turtleneck more often, I think that would up the 99 00:05:28,200 --> 00:05:31,840 Speaker 1: Bond quotient. But maybe I'm thinking of the earlier Bond movies. 100 00:05:32,160 --> 00:05:34,880 Speaker 3: Maybe he had like a bionic hand or something, and 101 00:05:34,920 --> 00:05:37,760 Speaker 3: like a cat that he would stroke with that bionic hand. 102 00:05:39,279 --> 00:05:40,599 Speaker 1: Or a weird gun. 103 00:05:41,080 --> 00:05:44,000 Speaker 2: Is that, is that Inspector Gadget? No, what is that from? 104 00:05:44,240 --> 00:05:47,120 Speaker 4: Yeah, Doctor Claw, he stroked the weird cat with the 105 00:05:47,160 --> 00:05:48,920 Speaker 4: bionic arm. Yeah, okay, okay.
106 00:05:49,160 --> 00:05:52,280 Speaker 3: Then there's also Blofeld in the James Bond movies, who 107 00:05:52,279 --> 00:05:53,520 Speaker 3: also had a cat. 108 00:05:53,600 --> 00:05:56,719 Speaker 1: Or at least, at the very least, Doctor 109 00:05:56,800 --> 00:05:59,359 Speaker 1: Claw had a metal gauntlet, but since they called it 110 00:05:59,400 --> 00:06:01,400 Speaker 1: a claw, I'd like to think it was like a bionic 111 00:06:01,440 --> 00:06:05,960 Speaker 1: hand situation. So those are some speculations, right, 112 00:06:06,480 --> 00:06:09,880 Speaker 1: but let's get to the facts. So here are the facts, 113 00:06:09,920 --> 00:06:14,640 Speaker 1: the gist of Julian Assange. You know, Julian Assange was 114 00:06:14,680 --> 00:06:18,560 Speaker 1: born in Australia in nineteen seventy one, July third, specifically. 115 00:06:19,080 --> 00:06:22,640 Speaker 1: For a full look at his biography and his early 116 00:06:22,760 --> 00:06:26,600 Speaker 1: years, his ascendance, if you will, please check out our 117 00:06:26,880 --> 00:06:29,839 Speaker 1: previous episodes. What we'd like to give you by way 118 00:06:29,839 --> 00:06:33,640 Speaker 1: of background now is just a very high-level look 119 00:06:33,920 --> 00:06:37,560 Speaker 1: at what you could call his career and the consequences 120 00:06:37,640 --> 00:06:38,840 Speaker 1: of his career path. 121 00:06:39,040 --> 00:06:43,080 Speaker 3: So today, Assange is probably best known for his activism, 122 00:06:43,120 --> 00:06:46,440 Speaker 3: which inspired him to create the website WikiLeaks in 123 00:06:46,440 --> 00:06:49,359 Speaker 3: two thousand and six. I think we're all pretty familiar 124 00:06:49,360 --> 00:06:51,880 Speaker 3: with WikiLeaks, but let's think of it as kind 125 00:06:51,880 --> 00:06:54,400 Speaker 3: of a one-stop shop for would- 126 00:06:54,200 --> 00:06:57,120 Speaker 4: be whistleblowers and leakers.
127 00:06:56,920 --> 00:07:02,800 Speaker 3: of classified documents and footage from around the world. In theory, 128 00:07:03,200 --> 00:07:06,559 Speaker 3: sort of a repository for all the stuff they don't 129 00:07:06,600 --> 00:07:12,320 Speaker 3: want you to know, regardless of who they might specifically be. 130 00:07:12,880 --> 00:07:15,760 Speaker 3: And WikiLeaks gained international attention in April of twenty 131 00:07:15,800 --> 00:07:18,840 Speaker 3: ten, when the organization released footage showing members of the 132 00:07:18,920 --> 00:07:24,080 Speaker 3: US military fatally shooting eighteen civilians, including journalists, from a 133 00:07:24,120 --> 00:07:25,440 Speaker 3: helicopter in Iraq. 134 00:07:25,800 --> 00:07:29,360 Speaker 1: Yeah, that was sort of the leak heard around the world, 135 00:07:29,480 --> 00:07:31,320 Speaker 1: you know what I mean. 136 00:07:31,120 --> 00:07:34,040 Speaker 2: I've been... I still remember being in the Fishbowl office 137 00:07:34,240 --> 00:07:37,640 Speaker 2: at our previous office when that footage was released, and 138 00:07:37,680 --> 00:07:40,680 Speaker 2: you and I discussing it almost immediately after it came out. 139 00:07:40,880 --> 00:07:46,400 Speaker 2: It was a very intense thing to witness, especially being 140 00:07:46,440 --> 00:07:49,720 Speaker 2: that it was not something that was ever meant to 141 00:07:50,400 --> 00:07:52,880 Speaker 2: get to us as, you know, the American public or 142 00:07:52,920 --> 00:07:53,680 Speaker 2: the world public. 143 00:07:54,200 --> 00:07:58,120 Speaker 1: Right. Yeah. And on a personal note, I miss the 144 00:07:58,160 --> 00:08:02,280 Speaker 1: Fishbowl. Everyone listening, if you checked out our YouTube 145 00:08:02,320 --> 00:08:04,560 Speaker 1: videos, maybe we talked about it occasionally, but that 146 00:08:04,720 --> 00:08:07,800 Speaker 1: was just so cool.
Other than being nearly impossible to 147 00:08:07,840 --> 00:08:12,680 Speaker 1: film in, it was pretty cool. Long story short on 148 00:08:12,680 --> 00:08:16,480 Speaker 1: that one: it used to be a single person's office, 149 00:08:16,600 --> 00:08:19,400 Speaker 1: and then we moved in and there were like seven 150 00:08:19,440 --> 00:08:23,320 Speaker 1: of us at one point. Six, six or seven anyway. 151 00:08:22,960 --> 00:08:24,720 Speaker 2: At least six. Paul would know best. 152 00:08:24,920 --> 00:08:27,400 Speaker 1: Paul would know best. Oh yeah, that's right. Paul was 153 00:08:27,640 --> 00:08:32,760 Speaker 1: a Fishbowler as well. And this footage rocked, not just 154 00:08:32,840 --> 00:08:35,800 Speaker 1: the Fishbowl, but the world, and at this point 155 00:08:35,840 --> 00:08:38,520 Speaker 1: I want to pause and give you a heads up that 156 00:08:38,840 --> 00:08:40,920 Speaker 1: we are going to be talking about some things of 157 00:08:40,960 --> 00:08:44,360 Speaker 1: a sexual nature, so this may not be suitable for 158 00:08:44,440 --> 00:08:49,120 Speaker 1: all listeners. Later that same year, twenty ten, after he 159 00:08:49,240 --> 00:08:53,240 Speaker 1: released the footage, Assange was detained in the United Kingdom 160 00:08:53,360 --> 00:08:58,120 Speaker 1: because Sweden had issued an international arrest warrant over allegations 161 00:08:58,360 --> 00:09:02,080 Speaker 1: that he had sexually assaulted two women during his time 162 00:09:02,120 --> 00:09:03,439 Speaker 1: in the Scandinavian country.
163 00:09:03,679 --> 00:09:09,000 Speaker 3: Yeah. Specifically, the allegations concerned the idea that he had 164 00:09:09,120 --> 00:09:12,480 Speaker 3: raped one woman and then molested and coerced another, all 165 00:09:12,559 --> 00:09:15,120 Speaker 3: during August twenty ten, when he was in Stockholm to give 166 00:09:15,160 --> 00:09:20,199 Speaker 3: a lecture, and authorities were very motivated to question him 167 00:09:20,600 --> 00:09:25,400 Speaker 3: about these allegations. Assange maintained, and continues to maintain to 168 00:09:25,480 --> 00:09:29,760 Speaker 3: this day, that both encounters were consensual, and this triggered 169 00:09:29,800 --> 00:09:34,880 Speaker 3: an extensive court battle. Eventually, he ended up seeking asylum 170 00:09:35,160 --> 00:09:39,319 Speaker 3: in the Ecuadorian embassy in London in order to avoid 171 00:09:39,440 --> 00:09:43,600 Speaker 3: possible extradition to either Sweden or, ultimately, the US, where 172 00:09:43,640 --> 00:09:47,600 Speaker 3: his actions with WikiLeaks led him to believe he 173 00:09:47,640 --> 00:09:52,720 Speaker 3: would not receive a fair shake. Much like Chelsea Manning, 174 00:09:53,000 --> 00:09:56,640 Speaker 3: who partnered with WikiLeaks to leak that earlier footage 175 00:09:56,640 --> 00:09:59,120 Speaker 3: that we spoke about, and as we know, Chelsea Manning 176 00:09:59,679 --> 00:10:06,560 Speaker 3: was significantly demonized and bore the brunt of some very, very 177 00:10:06,600 --> 00:10:08,160 Speaker 3: serious consequences. 178 00:10:08,559 --> 00:10:13,640 Speaker 1: Yeah. Yeah. Part of Chelsea Manning's case, though, goes into 179 00:10:14,679 --> 00:10:18,760 Speaker 1: the fact that Chelsea was a member of the US Army, 180 00:10:19,240 --> 00:10:23,200 Speaker 1: so there's some heavier consequences. But still, you're right, the 181 00:10:23,240 --> 00:10:27,080 Speaker 1: writing's on the wall.
Julian Assange very rightly thought that 182 00:10:27,160 --> 00:10:30,040 Speaker 1: he would not get a fair trial, and so it 183 00:10:30,200 --> 00:10:33,560 Speaker 1: was that... He tried to get to Ecuador proper, 184 00:10:33,600 --> 00:10:37,200 Speaker 1: but he ended up spending almost seven years holed up 185 00:10:37,360 --> 00:10:39,880 Speaker 1: in the embassy. And there were a lot of media 186 00:10:39,960 --> 00:10:45,480 Speaker 1: reports about him. You can see some interviews. They depicted 187 00:10:45,559 --> 00:10:49,520 Speaker 1: him in ways that sometimes sounded like propaganda or character 188 00:10:50,080 --> 00:10:53,040 Speaker 1: witness stuff. Now, to be fair, we have not 189 00:10:53,240 --> 00:10:57,800 Speaker 1: ourselves spoken with mister Assange, but he does have a 190 00:10:57,840 --> 00:11:01,760 Speaker 1: reputation of being a bit, shall we say, cantankerous, like 191 00:11:01,840 --> 00:11:04,520 Speaker 1: he'll walk out of an interview if he feels it's unfair. 192 00:11:07,000 --> 00:11:10,319 Speaker 1: His bedside manner, some journalists find it lacking, but you'll 193 00:11:10,320 --> 00:11:14,040 Speaker 1: see reports of him, like, being a bad houseguest, basically 194 00:11:14,120 --> 00:11:16,800 Speaker 1: like he wasn't cleaning up after his cat, and that 195 00:11:16,880 --> 00:11:19,720 Speaker 1: became... How surreal is that? That became an issue in 196 00:11:19,760 --> 00:11:24,839 Speaker 1: the embassy. He also had two children during this time. 197 00:11:25,520 --> 00:11:29,040 Speaker 1: He was finally arrested by British police on April eleventh, 198 00:11:29,160 --> 00:11:34,000 Speaker 1: twenty nineteen, right after the Ecuadorian president, Lenín Moreno, announced 199 00:11:34,200 --> 00:11:39,479 Speaker 1: via Twitter, again, that the country had withdrawn his asylum 200 00:11:39,760 --> 00:11:45,040 Speaker 1: status. Along the way, Sweden withdrew their earlier allegations.
But 201 00:11:45,920 --> 00:11:48,280 Speaker 1: after he was dragged out of the embassy in that 202 00:11:48,480 --> 00:11:52,120 Speaker 1: picture that you guys can see behind me, which is 203 00:11:52,160 --> 00:11:56,400 Speaker 1: contrasted with, what, I think that's Game of Thrones? Which 204 00:11:56,480 --> 00:12:01,880 Speaker 1: character is that from Game of Thrones? Maester Luwin? Yeah, 205 00:12:01,880 --> 00:12:03,920 Speaker 1: that's a good call, I think so. Yeah, I think 206 00:12:03,920 --> 00:12:07,800 Speaker 1: you're right. So ultimately, the UK court sentenced him to 207 00:12:07,920 --> 00:12:11,400 Speaker 1: fifty weeks in jail on May first, twenty nineteen, not 208 00:12:11,480 --> 00:12:15,679 Speaker 1: for leaking documents, just for jumping bail, for breaching bail agreements. 209 00:12:16,040 --> 00:12:21,680 Speaker 1: And then that's when Swedish prosecutors reopened their investigation into 210 00:12:21,720 --> 00:12:24,600 Speaker 1: the allegations of sexual misconduct and assault. 211 00:12:25,200 --> 00:12:29,280 Speaker 2: Yeah, and you know, WikiLeaks, over the years that 212 00:12:29,360 --> 00:12:34,280 Speaker 2: it has been operating, both while Julian Assange was really 213 00:12:34,320 --> 00:12:37,520 Speaker 2: at the helm and, you know, doing day-to-day 214 00:12:37,559 --> 00:12:39,640 Speaker 2: operations, as well as when he was in the embassy 215 00:12:39,640 --> 00:12:42,840 Speaker 2: and there were others running WikiLeaks, which did 216 00:12:43,000 --> 00:12:49,200 Speaker 2: continue actively publishing releases for a while. There were a 217 00:12:49,280 --> 00:12:53,120 Speaker 2: ton of controversies that arose, you know, obviously from the 218 00:12:53,120 --> 00:12:57,360 Speaker 2: ones we've already outlined there, but countless others.
And one 219 00:12:57,360 --> 00:12:59,920 Speaker 2: of the main ones that we talked about, you know, 220 00:13:00,040 --> 00:13:03,360 Speaker 2: the Iraq War tape, that specific video that was released: 221 00:13:04,520 --> 00:13:08,880 Speaker 2: it was a part of a larger Iraq War release 222 00:13:09,000 --> 00:13:11,760 Speaker 2: that was called the Iraq War Logs. It was also 223 00:13:11,960 --> 00:13:15,840 Speaker 2: called, I think, the Baghdad War Diaries, something to that effect. 224 00:13:17,440 --> 00:13:21,880 Speaker 2: So that tape was released in April of twenty ten. 225 00:13:22,520 --> 00:13:25,800 Speaker 2: Then the full Iraq War Logs were released in October 226 00:13:25,840 --> 00:13:29,319 Speaker 2: of twenty ten, and this was a trove of almost 227 00:13:29,360 --> 00:13:34,400 Speaker 2: four hundred thousand classified reports that, according to WikiLeaks and 228 00:13:34,880 --> 00:13:38,760 Speaker 2: other people who have gone through these documents, quote, document 229 00:13:38,800 --> 00:13:41,600 Speaker 2: the war and occupation in Iraq from the first of 230 00:13:41,720 --> 00:13:44,240 Speaker 2: January two thousand and four to the thirty first of 231 00:13:44,280 --> 00:13:47,199 Speaker 2: December two thousand and nine. And this is the thing 232 00:13:47,240 --> 00:13:51,200 Speaker 2: we were talking about, the massive event.
It was... 233 00:13:52,160 --> 00:13:56,719 Speaker 2: everyone was talking about this. Multiple countries and institutions weighed in 234 00:13:56,840 --> 00:13:59,800 Speaker 2: about what needed to happen because of this release, and 235 00:14:00,120 --> 00:14:05,600 Speaker 2: really everyone, especially the United States, went into damage control, 236 00:14:05,640 --> 00:14:09,360 Speaker 2: because not only did this affect how the US government 237 00:14:09,440 --> 00:14:12,480 Speaker 2: and military were going to be looked at, it also 238 00:14:12,640 --> 00:14:14,880 Speaker 2: was going to affect how the allies of the United 239 00:14:14,920 --> 00:14:18,280 Speaker 2: States government were going to be scrutinized in the future. 240 00:14:19,240 --> 00:14:21,000 Speaker 2: It really was a big deal. 241 00:14:21,280 --> 00:14:26,920 Speaker 1: Yeah, it set a massive precedent, you know, and there were 242 00:14:27,000 --> 00:14:31,960 Speaker 1: so many contradicting narratives in play, so much attempt 243 00:14:32,040 --> 00:14:36,960 Speaker 1: to spin. It's very difficult to watch 244 00:14:37,400 --> 00:14:42,000 Speaker 1: history being made, because history books make it seem so clean, 245 00:14:42,440 --> 00:14:46,240 Speaker 1: so point A to point B, but there's so many 246 00:14:46,280 --> 00:14:49,760 Speaker 1: things that get lost in the shuffle. And that's just 247 00:14:49,840 --> 00:14:53,040 Speaker 1: one of the controversies, right. We also know that later 248 00:14:53,280 --> 00:14:58,960 Speaker 1: in twenty ten WikiLeaks leaked more than two hundred 249 00:14:58,960 --> 00:15:04,840 Speaker 1: and fifty thousand classified diplomatic cables. A cable here, in 250 00:15:04,880 --> 00:15:07,440 Speaker 1: this sense, is not like an HDMI cable or something, 251 00:15:07,600 --> 00:15:12,920 Speaker 1: and it's not like Comcast.
It's a message, right? 252 00:15:13,000 --> 00:15:17,880 Speaker 1: It's a secret message that different embassies send to one another. 253 00:15:18,560 --> 00:15:24,200 Speaker 1: And these privileged communications might be intelligence on the 254 00:15:24,240 --> 00:15:27,720 Speaker 1: ground from an intelligence agency. They might be something as 255 00:15:27,760 --> 00:15:34,240 Speaker 1: small as, like, the ambassador to Syria said that he 256 00:15:34,400 --> 00:15:36,680 Speaker 1: was willing to work with us on this, but I 257 00:15:37,040 --> 00:15:39,440 Speaker 1: just think, I just think he was trying to get 258 00:15:39,440 --> 00:15:42,720 Speaker 1: me out of the room, because the man has notorious IBS. 259 00:15:43,080 --> 00:15:46,680 Speaker 1: Like, it's very frank, behind-the-scenes kind of looks 260 00:15:46,720 --> 00:15:49,640 Speaker 1: at stuff. There are things the governments of the world 261 00:15:49,960 --> 00:15:53,840 Speaker 1: do not want the international public, other countries, or their 262 00:15:53,880 --> 00:15:57,400 Speaker 1: domestic constituents to know. So it is not, by any 263 00:15:57,400 --> 00:16:01,360 Speaker 1: means, hyperbole to say that this fundamentally rocked the world 264 00:16:01,440 --> 00:16:05,880 Speaker 1: of diplomacy, and the repercussions are still reverberating today. This 265 00:16:05,960 --> 00:16:10,280 Speaker 1: did some irreparable damage to international relations.
266 00:16:10,600 --> 00:16:12,800 Speaker 3: Yeah, I mean, think about, like, if you were on 267 00:16:12,840 --> 00:16:16,440 Speaker 3: an email thread with, like, somebody, and then you thought 268 00:16:16,520 --> 00:16:19,640 Speaker 3: you had deleted that person temporarily from the thread and 269 00:16:19,680 --> 00:16:21,680 Speaker 3: were sort of talking a bunch of trash about them with 270 00:16:21,760 --> 00:16:24,840 Speaker 3: your, you know, cohorts, and then you realized that that 271 00:16:24,880 --> 00:16:27,880 Speaker 3: person actually was on the whole time. Not a good look. 272 00:16:28,080 --> 00:16:31,800 Speaker 3: This is sort of like a much larger-scale version 273 00:16:31,880 --> 00:16:33,800 Speaker 3: of that, because, as you said, Ben, I mean, these 274 00:16:33,840 --> 00:16:37,560 Speaker 3: could just be very candid, little snippy comments, you know, 275 00:16:37,680 --> 00:16:39,880 Speaker 3: and it's not like it would necessarily be stuff that 276 00:16:39,920 --> 00:16:43,720 Speaker 3: would blow open the doors of, like, you know, any 277 00:16:43,800 --> 00:16:47,920 Speaker 3: kind of conflict necessarily directly, or any kind of intelligence 278 00:16:47,960 --> 00:16:50,360 Speaker 3: that would, you know, be just earth-shattering. 279 00:16:50,440 --> 00:16:51,680 Speaker 4: It's just a bad look. 280 00:16:51,960 --> 00:16:54,200 Speaker 3: I mean, diplomacy in and of itself is about good 281 00:16:54,240 --> 00:16:59,280 Speaker 3: bedside manner and keeping those relationships healthy and stroking egos. 282 00:16:59,320 --> 00:17:01,960 Speaker 3: And this would do a whole lot of damage to 283 00:17:02,040 --> 00:17:04,159 Speaker 3: break down some of those relationships that a lot of 284 00:17:04,200 --> 00:17:07,320 Speaker 3: these diplomats had worked very hard to nurse and maintain.
285 00:17:07,960 --> 00:17:10,080 Speaker 2: And by the way, if you want to see either 286 00:17:10,119 --> 00:17:13,360 Speaker 2: of these two subjects that we just talked about, the 287 00:17:13,400 --> 00:17:16,680 Speaker 2: Iraq War Logs or the diplomatic cables, you can find all 288 00:17:16,720 --> 00:17:20,760 Speaker 2: of that. It's all searchable on the WikiLeaks dot org website. 289 00:17:21,119 --> 00:17:23,919 Speaker 2: Just if you're curious, or, you know, you're feeling a 290 00:17:23,920 --> 00:17:26,480 Speaker 2: little dangerous, you can go check those out. 291 00:17:27,680 --> 00:17:31,240 Speaker 1: Yeah, yeah, we should say one benefit of talking about 292 00:17:31,240 --> 00:17:34,280 Speaker 1: these past leaks is that they are all available, you 293 00:17:34,320 --> 00:17:39,320 Speaker 1: know what I mean. It reminds me of the Streisand effect. 294 00:17:39,400 --> 00:17:42,679 Speaker 1: You guys know what that is, right? No? So the 295 00:17:42,720 --> 00:17:48,840 Speaker 1: Streisand effect is essentially saying that once something is out there, 296 00:17:48,960 --> 00:17:53,640 Speaker 1: once something is published online especially, it cannot be removed, 297 00:17:53,640 --> 00:17:56,520 Speaker 1: no matter how much someone wants it to be removed. 298 00:17:56,560 --> 00:18:00,160 Speaker 1: It's named after a picture of Barbra Streisand, I believe, 299 00:18:00,240 --> 00:18:03,120 Speaker 1: that she asked to be removed from the internet. And, 300 00:18:03,240 --> 00:18:05,639 Speaker 1: you know, we all are familiar with the denizens of 301 00:18:05,680 --> 00:18:08,360 Speaker 1: the internet. That is the same thing as telling them, 302 00:18:08,440 --> 00:18:11,159 Speaker 1: please, pretty please, spread this everywhere. 303 00:18:12,000 --> 00:18:15,399 Speaker 2: Yeah, it was her Malibu home, as of two thousand 304 00:18:15,400 --> 00:18:18,200 Speaker 2: and three, I believe, that she was attempting to remove.
305 00:18:18,320 --> 00:18:20,960 Speaker 1: That's right. And you could see the argument there, because 306 00:18:21,000 --> 00:18:24,679 Speaker 1: it's like, hey, this is, you know, my personal dwelling place. 307 00:18:25,240 --> 00:18:28,399 Speaker 1: But this situation reminds me of something very funny. I 308 00:18:28,400 --> 00:18:30,680 Speaker 1: don't know if I mentioned it to you guys when 309 00:18:30,760 --> 00:18:34,679 Speaker 1: the WikiLeaks stuff really started hitting, but there were internal 310 00:18:34,800 --> 00:18:39,720 Speaker 1: emails throughout the intelligence industry, which is the right word, 311 00:18:40,280 --> 00:18:46,120 Speaker 1: and throughout the defense industry, where people were literally telling 312 00:18:46,560 --> 00:18:49,240 Speaker 1: government employees, hey, we know this is out there, it's 313 00:18:49,240 --> 00:18:52,560 Speaker 1: on the news, it's online, just so you know, you 314 00:18:52,600 --> 00:18:54,840 Speaker 1: will get in trouble if you read it unless you 315 00:18:54,880 --> 00:18:58,520 Speaker 1: have the correct classification. And that was such an attempt 316 00:18:58,520 --> 00:19:01,520 Speaker 1: to, like, close the barn door after the horses have 317 00:19:01,600 --> 00:19:05,439 Speaker 1: stampeded away. We just didn't have the legislation for this, 318 00:19:05,680 --> 00:19:09,240 Speaker 1: and that's part of why it rocked the world so fundamentally.
319 00:19:09,240 --> 00:19:12,720 Speaker 1: And that's not even... we're not even talking about twenty sixteen, right? 320 00:19:13,160 --> 00:19:18,320 Speaker 1: But the emails. Because later WikiLeaks released some twenty thousand 321 00:19:18,440 --> 00:19:22,920 Speaker 1: pages of emails from the Hillary Clinton campaign, as you'll 322 00:19:22,960 --> 00:19:26,000 Speaker 1: recall, former Secretary of State Clinton was running for US 323 00:19:26,080 --> 00:19:30,160 Speaker 1: president in twenty sixteen, and also emails from the Democratic 324 00:19:30,280 --> 00:19:33,919 Speaker 1: National Committee. It is, again, not hyperbole to say that 325 00:19:33,960 --> 00:19:37,240 Speaker 1: these altered the course of US politics in a big, 326 00:19:37,280 --> 00:19:37,760 Speaker 1: big way. 327 00:19:38,240 --> 00:19:38,800 Speaker 4: Yeah, it's true. 328 00:19:38,800 --> 00:19:40,679 Speaker 3: I mean, the emails, which later became part of the 329 00:19:40,800 --> 00:19:42,080 Speaker 3: internet catchphrase, 330 00:19:43,040 --> 00:19:43,199 Speaker 5: you know, 331 00:19:43,280 --> 00:19:47,600 Speaker 3: "but her emails," explored and expanded on multiple Clinton controversies 332 00:19:47,600 --> 00:19:51,160 Speaker 3: that had already been out there in the public imagination, 333 00:19:52,000 --> 00:19:55,480 Speaker 3: and we discussed some of those early on in the 334 00:19:55,560 --> 00:19:59,479 Speaker 3: campaign between Hillary Clinton and Donald Trump, where we did Trump, 335 00:19:59,640 --> 00:20:02,439 Speaker 3: you know, the most popular Trump and Clinton conspiracy theories, 336 00:20:02,440 --> 00:20:05,800 Speaker 3: and the Clinton conspiracy theories are massive, because they involve 337 00:20:05,880 --> 00:20:09,760 Speaker 3: both her and her husband, and the questionable relationship between 338 00:20:09,760 --> 00:20:12,320 Speaker 3: the Clinton Foundation and its donors.
339 00:20:12,800 --> 00:20:15,760 Speaker 4: Clinton's kind of you know. 340 00:20:16,680 --> 00:20:20,640 Speaker 3: Close knit relationships, shall we say, with some very powerful 341 00:20:20,680 --> 00:20:25,760 Speaker 3: Wall Street interests that became problematic, especially considering that she 342 00:20:26,119 --> 00:20:28,200 Speaker 3: kind of, you know, was trying to cast herself as 343 00:20:28,200 --> 00:20:32,719 Speaker 3: sort of the people's candidate, and her incredibly close ties 344 00:20:32,760 --> 00:20:37,239 Speaker 3: with very wealthy and powerful campaign contributors. This is just, 345 00:20:37,440 --> 00:20:40,240 Speaker 3: you know, beginning to scratch the surface. But it's fair 346 00:20:40,280 --> 00:20:43,040 Speaker 3: to say that the leaks played a huge role in 347 00:20:43,119 --> 00:20:46,879 Speaker 3: the outcome of that year's presidential election. And it also, 348 00:20:47,119 --> 00:20:49,480 Speaker 3: you know, I mean, I think a lot of people 349 00:20:49,520 --> 00:20:52,200 Speaker 3: looked at Assange early on as sort of this 350 00:20:52,880 --> 00:20:57,159 Speaker 3: great equalizer, you know, trying to take powerful people 351 00:20:57,200 --> 00:21:00,679 Speaker 3: to task. And that's all well and good if the 352 00:21:00,720 --> 00:21:03,520 Speaker 3: outcome is on your side, right? And so I think 353 00:21:03,560 --> 00:21:07,919 Speaker 3: he really became a hugely divisive figure on the left 354 00:21:08,080 --> 00:21:10,840 Speaker 3: when this happened, and it started to feel like he 355 00:21:11,000 --> 00:21:15,960 Speaker 3: was perhaps more interested in getting the other side elected, 356 00:21:16,000 --> 00:21:19,160 Speaker 3: and that became very confusing. That part of his character really 357 00:21:19,200 --> 00:21:23,800 Speaker 3: kind of complicated the relationship between the public and Julian Assange. 358 00:21:24,560 --> 00:21:28,240 Speaker 2: I would say two things here.
I recently listened to 359 00:21:28,520 --> 00:21:32,840 Speaker 2: Glenn Greenwald, now of the Intercept, that's how he's best 360 00:21:32,880 --> 00:21:38,399 Speaker 2: known currently, but a journalist, who was discussing how Julian 361 00:21:38,440 --> 00:21:44,240 Speaker 2: Assange really does represent a largely hated figure on both 362 00:21:44,480 --> 00:21:48,120 Speaker 2: the political left and right within the United States, because 363 00:21:48,160 --> 00:21:51,520 Speaker 2: of, you know, the people who would consider him anti 364 00:21:51,640 --> 00:21:55,639 Speaker 2: war or anti, you know, military, because of the Iraq 365 00:21:55,680 --> 00:21:59,880 Speaker 2: War Logs and other things and the diplomatic cables. They 366 00:22:00,040 --> 00:22:02,359 Speaker 2: very much dislike him from that angle. And then, 367 00:22:02,359 --> 00:22:05,120 Speaker 2: from this, like you said, the people on the political 368 00:22:05,200 --> 00:22:08,040 Speaker 2: left very much dislike him because they essentially blame him 369 00:22:08,119 --> 00:22:12,320 Speaker 2: for Hillary Clinton losing the election. One commonality that we've 370 00:22:12,320 --> 00:22:15,720 Speaker 2: seen through these leaks is that it is attempting, at 371 00:22:15,840 --> 00:22:20,120 Speaker 2: least it feels that it's an attempt, to show that 372 00:22:20,240 --> 00:22:23,240 Speaker 2: behind-the-curtain scene of the powerful people in every 373 00:22:23,280 --> 00:22:25,320 Speaker 2: single one of these that we've already 374 00:22:25,359 --> 00:22:28,840 Speaker 2: kind of mentioned here. And I guess when you are 375 00:22:28,880 --> 00:22:31,679 Speaker 2: doing that, and showing a peek behind the curtain of 376 00:22:31,720 --> 00:22:35,400 Speaker 2: all powerful people, you're gonna kind of piss everybody off. 377 00:22:36,880 --> 00:22:40,520 Speaker 1: Yeah, yeah, agreed, very much so.
And you know, I 378 00:22:40,560 --> 00:22:45,879 Speaker 1: want to ask you guys here just an opinion. You know, 379 00:22:46,880 --> 00:22:51,119 Speaker 1: are you all of the opinion that 380 00:22:51,200 --> 00:22:55,520 Speaker 1: this did significantly impact the election? I mean, 381 00:22:55,560 --> 00:22:57,399 Speaker 1: how much do you think it swung the needle? 382 00:22:58,640 --> 00:23:03,639 Speaker 2: I would say it swung the needle. I mean, my 383 00:23:03,680 --> 00:23:07,200 Speaker 2: opinion is that, yes, for sure, it did, because it 384 00:23:07,320 --> 00:23:10,840 Speaker 2: changed the conversation that was happening on the news cycle 385 00:23:11,520 --> 00:23:14,840 Speaker 2: almost, like, right up until the election, right up until 386 00:23:14,880 --> 00:23:17,520 Speaker 2: the day of the election. It really did change what 387 00:23:17,600 --> 00:23:18,680 Speaker 2: people were talking about. 388 00:23:19,359 --> 00:23:21,399 Speaker 4: Oh no, I mean, I think there's no question 389 00:23:21,440 --> 00:23:22,040 Speaker 4: about it. 390 00:23:21,880 --> 00:23:27,360 Speaker 3: It absolutely hijacked the conversation in a way that Clinton 391 00:23:27,520 --> 00:23:31,320 Speaker 3: was just not able to put back in that Pandora's box, 392 00:23:31,560 --> 00:23:36,439 Speaker 3: you know. And it really completely took away her power 393 00:23:36,480 --> 00:23:39,840 Speaker 3: to kind of steer the narrative, which, you know, say 394 00:23:39,880 --> 00:23:43,680 Speaker 3: what you will about whichever side of politics you 395 00:23:44,040 --> 00:23:48,840 Speaker 3: fall on.
She definitely was trying very hard to cast 396 00:23:48,840 --> 00:23:52,800 Speaker 3: herself in this light of being kind of the people's candidate, 397 00:23:52,960 --> 00:23:56,280 Speaker 3: of being this sort of, you know, even-handed person 398 00:23:56,320 --> 00:23:59,720 Speaker 3: that wasn't connected to any kind of wrongdoing, and that 399 00:23:59,800 --> 00:24:03,439 Speaker 3: she really cared about the everyman, for lack of a 400 00:24:03,440 --> 00:24:06,440 Speaker 3: better term, and this really kind of shattered a lot 401 00:24:06,480 --> 00:24:08,840 Speaker 3: of that. And it made it, whether it 402 00:24:08,880 --> 00:24:11,960 Speaker 3: was true or not, at least the contents of the 403 00:24:12,000 --> 00:24:15,360 Speaker 3: emails and the nature of the whole private server situation, 404 00:24:15,480 --> 00:24:18,760 Speaker 3: and whether or not she completely flouted the rules 405 00:24:18,800 --> 00:24:21,760 Speaker 3: and just went her own way, it really robbed her, 406 00:24:21,800 --> 00:24:23,960 Speaker 3: and her campaign, of the ability to 407 00:24:24,000 --> 00:24:26,720 Speaker 3: steer that narrative, and it was just impossible to spin 408 00:24:26,800 --> 00:24:27,600 Speaker 3: it at that point. 409 00:24:27,920 --> 00:24:32,920 Speaker 1: Yeah, the old saying holds true. If you are 410 00:24:33,040 --> 00:24:37,280 Speaker 1: defending yourself in any kind of political sphere, then you 411 00:24:37,320 --> 00:24:42,119 Speaker 1: are automatically losing. The fascists were right about it. The 412 00:24:42,840 --> 00:24:45,720 Speaker 1: Republicans and the Democrats were right about it. It's just 413 00:24:45,800 --> 00:24:49,560 Speaker 1: sort of part of the house 414 00:24:49,640 --> 00:24:52,919 Speaker 1: rules of these terrible, terrible games. But there you have it. 415 00:24:53,359 --> 00:24:58,320 Speaker 1: Here's the takeaway.
A single man, flesh and blood, just 416 00:24:58,400 --> 00:25:00,680 Speaker 1: as many of the people listening to the show today 417 00:25:00,720 --> 00:25:07,560 Speaker 1: are, unelected himself, was able to shake the world. You 418 00:25:07,680 --> 00:25:13,800 Speaker 1: can understand why so many powerful people consider him dangerous. 419 00:25:14,280 --> 00:25:18,560 Speaker 1: When we last left Julian Assange, he had been removed 420 00:25:18,600 --> 00:25:23,240 Speaker 1: from the Ecuadorian embassy. All indications appeared that he would 421 00:25:23,320 --> 00:25:28,480 Speaker 1: continue a legal battle at a glacial pace, likely 422 00:25:28,880 --> 00:25:32,600 Speaker 1: from prison, as he sought to avoid extradition and, quite 423 00:25:32,640 --> 00:25:37,679 Speaker 1: possibly, death in the United States. So where 424 00:25:37,720 --> 00:25:40,919 Speaker 1: are we at now? We'll answer the question after a 425 00:25:40,920 --> 00:25:52,200 Speaker 1: word from our sponsors. Here's where it gets crazy. What's 426 00:25:52,240 --> 00:25:56,000 Speaker 1: happening now? Well, it turns out the conspiracy to expose 427 00:25:56,160 --> 00:26:00,680 Speaker 1: conspiracies has not worked out especially well for Julian Assange. 428 00:26:01,000 --> 00:26:03,600 Speaker 3: Yes, you'll recall how, as we said, Assange was sentenced 429 00:26:03,680 --> 00:26:06,360 Speaker 3: to fifty weeks in jail. Well, he was due 430 00:26:06,480 --> 00:26:09,560 Speaker 3: to be released in September of twenty nineteen, the twenty 431 00:26:09,600 --> 00:26:13,000 Speaker 3: second specifically, when his jail term for breaching that bail 432 00:26:14,040 --> 00:26:19,400 Speaker 3: ended, and they just didn't let him out.
The Westminster 433 00:26:19,560 --> 00:26:24,240 Speaker 3: Magistrates Court noted that there were substantial grounds for believing 434 00:26:24,320 --> 00:26:27,720 Speaker 3: that, if free, Assange would make another run, attempting to 435 00:26:27,720 --> 00:26:31,840 Speaker 3: gain safe status or, like Edward Snowden, make it to 436 00:26:31,880 --> 00:26:35,480 Speaker 3: a country that would not extradite him to the United States. 437 00:26:35,920 --> 00:26:37,760 Speaker 3: I mean, it's sort of, you know, it's just the 438 00:26:37,760 --> 00:26:41,840 Speaker 3: same way a court assesses anyone's risk of flight. You know, 439 00:26:41,880 --> 00:26:44,040 Speaker 3: they won't grant you bail, or they'll make your bail 440 00:26:44,080 --> 00:26:47,080 Speaker 3: excessively high, if they think, or if they even think, 441 00:26:47,119 --> 00:26:49,120 Speaker 3: that you're going to flee, let alone if you've got 442 00:26:49,119 --> 00:26:52,880 Speaker 3: an established pattern of doing that, which Assange absolutely 443 00:26:53,359 --> 00:26:57,879 Speaker 3: had. So Julian Assange remains in Belmarsh Prison in 444 00:26:57,920 --> 00:27:02,320 Speaker 3: the United Kingdom. His partner, lawyer Stella Moris, who he 445 00:27:02,400 --> 00:27:06,119 Speaker 3: has two children with, recently said, quote, the life of 446 00:27:06,119 --> 00:27:09,600 Speaker 3: my partner Julian Assange is severely at risk. He is 447 00:27:09,640 --> 00:27:14,440 Speaker 3: on remand at HMP Belmarsh, and COVID nineteen is spreading 448 00:27:14,480 --> 00:27:18,159 Speaker 3: within its walls. So this is absolutely a topical update 449 00:27:18,200 --> 00:27:18,440 Speaker 4: here.
450 00:27:19,040 --> 00:27:24,560 Speaker 3: Moris sees more than just Assange's personal health in 451 00:27:24,640 --> 00:27:28,720 Speaker 3: jeopardy here, saying that, quote, Julian needs to be released now, 452 00:27:28,800 --> 00:27:31,600 Speaker 3: for him, for our family, and for the society we 453 00:27:31,640 --> 00:27:36,520 Speaker 3: all want our children to grow up in. Yeah. Okay, yeah, 454 00:27:36,560 --> 00:27:39,920 Speaker 3: I mean, you know, it depends on what your take 455 00:27:40,080 --> 00:27:44,760 Speaker 3: is on that particular vision of society that Assange espouses, 456 00:27:45,040 --> 00:27:46,439 Speaker 3: you know. And again, I mean, a big part of 457 00:27:46,480 --> 00:27:48,960 Speaker 3: his character is the idea of, like, free speech and 458 00:27:49,000 --> 00:27:51,800 Speaker 3: about, you know, taking powerful people to task. But he 459 00:27:51,920 --> 00:27:55,359 Speaker 3: is absolutely, has been, and remains a super divisive figure. 460 00:27:56,200 --> 00:27:59,400 Speaker 3: So let's talk about a bigger picture issue here that 461 00:27:59,440 --> 00:28:03,000 Speaker 3: we've touched on a little bit in some of our COVID nineteen updates: 462 00:28:03,040 --> 00:28:05,520 Speaker 3: the risk of coronavirus in prisons.
463 00:28:05,760 --> 00:28:09,240 Speaker 2: Well, yeah, it's certainly a hairy situation for anyone who 464 00:28:09,320 --> 00:28:13,920 Speaker 2: is being kept anywhere within fairly close quarters to other 465 00:28:14,080 --> 00:28:17,679 Speaker 2: human beings, especially if there are any kind of sanitary 466 00:28:17,760 --> 00:28:23,720 Speaker 2: issues within, let's say, a prison like that. And in 467 00:28:23,720 --> 00:28:28,080 Speaker 2: prisons across the world, people are being infected with coronavirus, 468 00:28:28,160 --> 00:28:31,080 Speaker 2: because it only takes one person to enter that closed 469 00:28:31,080 --> 00:28:34,320 Speaker 2: system at some point to then infect one other and 470 00:28:34,359 --> 00:28:39,040 Speaker 2: then exponentially grow that outbreak. But in this particular prison, 471 00:28:39,480 --> 00:28:43,840 Speaker 2: HMP Belmarsh, where Julian Assange is, at least two people 472 00:28:44,000 --> 00:28:47,520 Speaker 2: within the prison have contracted COVID nineteen, or rather, they've 473 00:28:47,560 --> 00:28:50,920 Speaker 2: contracted the coronavirus and are dealing with symptoms of COVID nineteen. 474 00:28:51,760 --> 00:28:55,600 Speaker 2: And prisons just in general are pretty rough places to 475 00:28:55,680 --> 00:29:01,640 Speaker 2: be, with individuals who have, possibly, charges pending 476 00:29:01,680 --> 00:29:05,280 Speaker 2: against them for varying things of varying severities. And in, 477 00:29:05,480 --> 00:29:09,479 Speaker 2: you know, these uncertain times, there are higher than average 478 00:29:09,560 --> 00:29:13,480 Speaker 2: odds of things like violence occurring within prisons, where there's 479 00:29:13,600 --> 00:29:17,640 Speaker 2: that kind of fear of something like coronavirus, or uprisings, 480 00:29:17,760 --> 00:29:21,320 Speaker 2: or, you know, maybe even riots, very violent riots, occurring 481 00:29:21,320 --> 00:29:22,240 Speaker 2: within these prisons.
482 00:29:22,640 --> 00:29:25,560 Speaker 1: Yeah, yeah, that's absolutely correct. In fact, there are numerous 483 00:29:25,600 --> 00:29:32,560 Speaker 1: reports already of riots about prison conditions in the US 484 00:29:32,560 --> 00:29:36,920 Speaker 1: and abroad, and in some countries prisoners have even successfully 485 00:29:36,960 --> 00:29:40,960 Speaker 1: taken guards hostage, not even necessarily to say, hey, we're 486 00:29:41,000 --> 00:29:44,800 Speaker 1: throwing down the prison walls, let my people go. But 487 00:29:44,840 --> 00:29:48,000 Speaker 1: more to say, we need to bring attention to the 488 00:29:48,000 --> 00:29:52,160 Speaker 1: pandemic bloodbath that may occur when we are locked in 489 00:29:52,240 --> 00:29:57,800 Speaker 1: here like animals for whatever reason. So this, this prison 490 00:29:57,840 --> 00:30:04,000 Speaker 1: angle has a lot of fairly plausible conspiratorial aspects to it, 491 00:30:04,040 --> 00:30:08,520 Speaker 1: because you know, we've seen the kind of manufactured concerns 492 00:30:08,560 --> 00:30:12,640 Speaker 1: that pop up in other celebrity prison stories. Someone has 493 00:30:12,680 --> 00:30:16,080 Speaker 1: a high paid lawyer, like someone working for Weinstein or something, 494 00:30:16,160 --> 00:30:21,040 Speaker 1: who says, you know, he's very fragile. Therefore, even though 495 00:30:21,080 --> 00:30:23,719 Speaker 1: he's been convicted of these crimes, he should get some 496 00:30:23,720 --> 00:30:27,000 Speaker 1: sort of special treatment because it's the human thing to do. 497 00:30:27,800 --> 00:30:30,280 Speaker 1: By the way, for R and B fans, R Kelly 498 00:30:30,720 --> 00:30:33,960 Speaker 1: is looking for the same thing, the argument being that 499 00:30:34,000 --> 00:30:38,320 Speaker 1: he has diabetes and COVID is going to blow through 500 00:30:38,360 --> 00:30:42,520 Speaker 1: prison systems. 
But in Assange's case, there's solid evidence that 501 00:30:42,680 --> 00:30:47,360 Speaker 1: his life may be at risk, more so than a Weinstein, 502 00:30:47,520 --> 00:30:48,600 Speaker 1: more so than an R. 503 00:30:48,760 --> 00:30:55,160 Speaker 2: Kelly. Though, not as much as a free Epstein, 504 00:30:55,520 --> 00:30:56,760 Speaker 2: not as much as an Epstein. 505 00:30:56,920 --> 00:30:59,959 Speaker 1: Right, we should make the Epstein scale. 506 00:31:00,720 --> 00:31:04,600 Speaker 1: You know, fellow conspiracy realists, if you have 507 00:31:04,800 --> 00:31:06,920 Speaker 1: the time and the inclination, please feel free to make 508 00:31:06,920 --> 00:31:10,640 Speaker 1: an infographic of the Epstein scale, the likelihood of dying 509 00:31:10,760 --> 00:31:11,240 Speaker 1: in prison. 510 00:31:12,360 --> 00:31:14,320 Speaker 2: You know, I would, just really quickly, Ben. This 511 00:31:14,400 --> 00:31:17,680 Speaker 2: is a very interesting thing to think about, because the 512 00:31:17,840 --> 00:31:21,560 Speaker 2: reason Julian Assange is in jail, you know, is officially 513 00:31:22,160 --> 00:31:25,920 Speaker 2: because of jumping bail, but really, when you put all 514 00:31:25,960 --> 00:31:31,040 Speaker 2: of the math together, it's because he had damning evidence 515 00:31:31,240 --> 00:31:34,480 Speaker 2: against very powerful people, right? Or he was able to 516 00:31:34,880 --> 00:31:38,080 Speaker 2: release information about powerful people and the actions they take. 517 00:31:38,640 --> 00:31:38,680 Speaker 4: That. 518 00:31:38,800 --> 00:31:43,280 Speaker 2: Then you think about somebody like Jeffrey Epstein, that very 519 00:31:43,520 --> 00:31:48,800 Speaker 2: likely, at least allegedly, had the worst kind of evidence 520 00:31:49,000 --> 00:31:53,400 Speaker 2: against extremely powerful people, and we saw what happened to him.
521 00:31:53,720 --> 00:31:56,920 Speaker 2: You know, who knows if it was... Look, we can't 522 00:31:57,000 --> 00:31:59,000 Speaker 2: prove to you if it was self-inflicted or not 523 00:31:59,080 --> 00:32:02,160 Speaker 2: self-inflicted in his case, but he was only in 524 00:32:02,240 --> 00:32:04,440 Speaker 2: prison for a short time. That's all I would 525 00:32:04,200 --> 00:32:07,960 Speaker 1: say. Right. Yeah, it's very much worth pointing out that 526 00:32:08,160 --> 00:32:14,840 Speaker 1: plenty of powerful people want this guy dead, even if 527 00:32:14,880 --> 00:32:18,120 Speaker 1: they can't kill WikiLeaks, which is also on the wish list. 528 00:32:19,400 --> 00:32:23,720 Speaker 1: We should note that, you know, the Trump administration is 529 00:32:23,960 --> 00:32:30,680 Speaker 1: currently attempting to extradite Assange to the US. However, remember, 530 00:32:31,240 --> 00:32:35,000 Speaker 1: we're very quickly going into the no-real-good-guys, 531 00:32:35,120 --> 00:32:38,920 Speaker 1: morally gray territory, because the desire to gain possession of 532 00:32:38,960 --> 00:32:46,120 Speaker 1: Assange transcends the growing, stark ideological divide in domestic US politics. 533 00:32:46,720 --> 00:32:50,880 Speaker 1: Establishment Democrats want the guy under the jail or under 534 00:32:50,920 --> 00:32:55,520 Speaker 1: the daisies as much as establishment Republicans. He has very 535 00:32:55,600 --> 00:32:58,080 Speaker 1: few friends in Congress, you know what I mean? And 536 00:32:58,200 --> 00:33:01,080 Speaker 1: it's kind of misleading. That's why I appreciate the point 537 00:33:01,160 --> 00:33:05,440 Speaker 1: you made, Noel, earlier about people liking a hero, or 538 00:33:05,520 --> 00:33:10,200 Speaker 1: calling someone a hero, until their principles, whatever they may be, 539 00:33:10,960 --> 00:33:13,040 Speaker 1: run counter to our own principles.
540 00:33:13,360 --> 00:33:16,240 Speaker 3: It's just the idea of, like, one person's freedom fighter 541 00:33:16,360 --> 00:33:19,520 Speaker 3: is another person's terrorist. It all depends on what side 542 00:33:19,560 --> 00:33:21,400 Speaker 3: of the issue you fall on a lot of the time, 543 00:33:21,480 --> 00:33:25,120 Speaker 3: you know. I mean, certainly both sides can be guilty 544 00:33:25,200 --> 00:33:29,840 Speaker 3: of what would be considered negative acts or even atrocities. 545 00:33:30,160 --> 00:33:33,600 Speaker 3: But still, in terms of the way you view the 546 00:33:33,760 --> 00:33:37,320 Speaker 3: means justifying the ends, you could probably say, well, they 547 00:33:37,400 --> 00:33:40,560 Speaker 3: might have done some bad things, but they were ultimately 548 00:33:40,800 --> 00:33:42,800 Speaker 3: supporting what I think is a just cause. 549 00:33:44,240 --> 00:33:49,280 Speaker 1: And, oh, we should point out, in a 550 00:33:49,600 --> 00:33:52,239 Speaker 1: "no way, baby, let me tell 551 00:33:52,280 --> 00:33:54,840 Speaker 1: you what really happened" kind of moment: the US is still kind of 552 00:33:54,880 --> 00:33:57,800 Speaker 1: in damage control over this, and has been since two 553 00:33:57,880 --> 00:34:00,680 Speaker 1: thousand and six, so for more than a decade they've 554 00:34:00,680 --> 00:34:04,560 Speaker 1: been in damage control here in the States. Uncle Sam 555 00:34:04,720 --> 00:34:10,719 Speaker 1: says Assange is not wanted because he caught the US 556 00:34:10,840 --> 00:34:13,880 Speaker 1: doing some bad things red-handed, or because he damaged 557 00:34:13,920 --> 00:34:19,400 Speaker 1: the soft power of its reputation abroad, or because he 558 00:34:19,520 --> 00:34:23,319 Speaker 1: embarrassed the US.
They say he's wanted because he illegally 559 00:34:23,760 --> 00:34:29,640 Speaker 1: endangered the lives of informants, aka spies or assets; dissidents, 560 00:34:30,640 --> 00:34:34,000 Speaker 1: aka also possibly spies or assets, but maybe, you know, 561 00:34:34,120 --> 00:34:39,040 Speaker 1: maybe sincere dissidents; as well as activists, aka spies or assets, 562 00:34:39,080 --> 00:34:43,600 Speaker 1: but hey, maybe, maybe activists; in multiple countries, including, of 563 00:34:43,680 --> 00:34:50,040 Speaker 1: course, Iraq, Iran, and Afghanistan. So what, what, what gives? 564 00:34:50,200 --> 00:34:50,239 Speaker 4: What? 565 00:34:50,600 --> 00:34:53,960 Speaker 1: What is he actually going to be charged with or 566 00:34:54,239 --> 00:34:54,880 Speaker 1: indicted for? 567 00:34:55,120 --> 00:34:59,720 Speaker 2: Do we know? Yeah, there is an indictment, pending charges 568 00:35:00,040 --> 00:35:05,319 Speaker 2: standing against Julian Assange from the United States, and I'm 569 00:35:05,360 --> 00:35:07,520 Speaker 2: gonna boil it down for you in the way that 570 00:35:07,760 --> 00:35:12,440 Speaker 2: Jennifer Robinson explained it in a recent conversation with Glenn Greenwald. 571 00:35:12,960 --> 00:35:16,680 Speaker 2: She's a human rights attorney and she's representing Assange, and 572 00:35:16,840 --> 00:35:22,560 Speaker 2: she explains that the charges essentially state that Assange 573 00:35:22,880 --> 00:35:26,440 Speaker 2: was communicating, had communications of some sort, with Chelsea Manning, 574 00:35:26,719 --> 00:35:30,120 Speaker 2: at the time known as Bradley Manning, and that Assange 575 00:35:30,200 --> 00:35:33,960 Speaker 2: discussed with Chelsea ways in which she could access highly 576 00:35:34,040 --> 00:35:37,800 Speaker 2: classified materials in a way that would protect her identity. 577 00:35:38,120 --> 00:35:41,480 Speaker 2: That's really what it's saying. That's the whole thing.
Assange 578 00:35:41,680 --> 00:35:46,120 Speaker 2: was in contact with Chelsea Manning, and Chelsea Manning got 579 00:35:46,160 --> 00:35:50,000 Speaker 2: the materials to him, and then he released them. And also, 580 00:35:50,120 --> 00:35:54,239 Speaker 2: according to Jennifer, there's a widely held misconception that's just 581 00:35:54,360 --> 00:35:57,360 Speaker 2: kind of been floating around, and it started from this 582 00:35:57,560 --> 00:36:01,239 Speaker 2: Department of Justice press release that came out not 583 00:36:01,480 --> 00:36:05,520 Speaker 2: that long ago, and within this release it falsely stated, 584 00:36:05,560 --> 00:36:08,359 Speaker 2: at least according to Robinson, that Assange was being 585 00:36:08,480 --> 00:36:13,800 Speaker 2: charged with, quote, hacking, so in some way using, you know, 586 00:36:13,960 --> 00:36:18,800 Speaker 2: a computer or a system to access classified government materials 587 00:36:18,960 --> 00:36:23,080 Speaker 2: and then get those for Chelsea, or with Chelsea Manning, 588 00:36:23,520 --> 00:36:27,160 Speaker 2: because, you know, as we discussed, Chelsea Manning is the person 589 00:36:27,360 --> 00:36:30,640 Speaker 2: who ended up getting a lot of that documentation that 590 00:36:30,800 --> 00:36:35,920 Speaker 2: became the Iraq War Logs and the tapes. So just the 591 00:36:36,000 --> 00:36:39,799 Speaker 2: last thing here is that, according to Jennifer Robinson, within 592 00:36:39,880 --> 00:36:43,759 Speaker 2: that indictment there is no allegation that Assange attempted to 593 00:36:43,840 --> 00:36:46,920 Speaker 2: hack the US government in any way to 594 00:36:47,000 --> 00:36:50,759 Speaker 2: gain material. There's also no allegation within that indictment that 595 00:36:50,880 --> 00:36:56,399 Speaker 2: he unlawfully accessed any government computers or systems whatsoever. That's 596 00:36:56,520 --> 00:37:01,080 Speaker 2: just to put it out there.
In conclusion, the 597 00:37:02,120 --> 00:37:05,160 Speaker 2: charges are that he had a discussion with Chelsea Manning 598 00:37:05,560 --> 00:37:10,400 Speaker 2: about how to hide her identity while accessing secret documents. 599 00:37:11,160 --> 00:37:14,960 Speaker 1: Okay, and then once that kind of stuff proceeds, if 600 00:37:15,000 --> 00:37:19,360 Speaker 1: it does indeed proceed, then we'll see those kinds of 601 00:37:19,520 --> 00:37:28,719 Speaker 1: charges become increasingly specific as prosecutors explore their options. So 602 00:37:29,040 --> 00:37:32,600 Speaker 1: this naturally leads us to the next part of the update, 603 00:37:33,280 --> 00:37:38,279 Speaker 1: which is this: what's on the horizon? We'll explore that 604 00:37:38,920 --> 00:37:51,080 Speaker 1: after a word from our sponsors. We've returned. So just 605 00:37:51,160 --> 00:37:54,640 Speaker 1: a few days ago, as we record this... peek behind 606 00:37:54,680 --> 00:37:58,920 Speaker 1: the curtain, in the interest of transparency: it is May sixth, a Wednesday, 607 00:38:00,000 --> 00:38:03,480 Speaker 1: a lovely day outside in Atlanta, as far as I can tell. 608 00:38:03,400 --> 00:38:06,279 Speaker 4: Blustery, I would say. Yes, I love it. 609 00:38:06,360 --> 00:38:07,919 Speaker 1: I was writing on the porch. I wish I could 610 00:38:07,920 --> 00:38:13,960 Speaker 1: record out there. Just a few days ago, Julian Assange received 611 00:38:14,040 --> 00:38:17,040 Speaker 1: word that he will have to wait until at least 612 00:38:17,239 --> 00:38:21,200 Speaker 1: September of twenty twenty before a British judge will hear 613 00:38:21,360 --> 00:38:25,080 Speaker 1: the US request for his extradition. This comes to us 614 00:38:25,560 --> 00:38:29,000 Speaker 1: from a possibly biased source, to be fair: Kristinn Hrafnsson, 615 00:38:29,480 --> 00:38:33,440 Speaker 1: the editor in chief of WikiLeaks.
Earlier they said, in a 616 00:38:33,680 --> 00:38:38,440 Speaker 1: video posted on social media, that it's unacceptable, and confirmed 617 00:38:38,480 --> 00:38:41,839 Speaker 1: that Assange likely has to spend another four months at 618 00:38:41,960 --> 00:38:44,759 Speaker 1: least in prison. And if a hearing does come to 619 00:38:44,880 --> 00:38:47,680 Speaker 1: pass in September, that means that he will have spent 620 00:38:48,120 --> 00:38:50,920 Speaker 1: one year in prison after being dragged out of that 621 00:38:51,040 --> 00:38:54,279 Speaker 1: embassy on his, what was that, his charge of 622 00:38:54,400 --> 00:38:56,000 Speaker 1: fifty weeks for jumping bail. 623 00:38:56,400 --> 00:38:59,480 Speaker 2: Yeah, that he was supposed to be let out of. What'd 624 00:38:59,520 --> 00:39:00,960 Speaker 2: you say, last year? 625 00:39:01,680 --> 00:39:07,279 Speaker 1: Yes, yeah, that's correct. And additionally, the editor there at 626 00:39:07,320 --> 00:39:10,920 Speaker 1: WikiLeaks said he, that is, Assange, 627 00:39:11,080 --> 00:39:14,759 Speaker 1: was not able to attend some earlier hearings via 628 00:39:14,920 --> 00:39:21,279 Speaker 1: video link because he was unwell. So that gives us, 629 00:39:21,680 --> 00:39:24,720 Speaker 1: unless that's spin, that gives us a pretty solid argument 630 00:39:24,800 --> 00:39:27,680 Speaker 1: that, at least in his case, the health concerns are real. 631 00:39:28,680 --> 00:39:32,279 Speaker 1: Reporters said he was deteriorating mentally already when he was 632 00:39:32,320 --> 00:39:33,720 Speaker 1: in the Ecuadorian embassy. 633 00:39:34,120 --> 00:39:37,400 Speaker 3: So the fate of Assange may actually set a 634 00:39:37,600 --> 00:39:40,840 Speaker 3: legal precedent in the United States, regardless of where you 635 00:39:40,960 --> 00:39:44,440 Speaker 3: stand with these leaks.
In particular, it can't really be 636 00:39:44,520 --> 00:39:48,480 Speaker 3: denied that they pushed the public in multiple countries to 637 00:39:48,600 --> 00:39:51,759 Speaker 3: hold politicians accountable, or, well, attempt to do so, 638 00:39:51,800 --> 00:39:52,120 Speaker 4: anyway. 639 00:39:52,520 --> 00:39:56,719 Speaker 3: Imagine a world where any disclosure, even if it's incredibly 640 00:39:56,960 --> 00:39:59,680 Speaker 3: vital to the public interest, becomes a crime. 641 00:40:00,160 --> 00:40:03,000 Speaker 4: And that's, like, dystopian kind of stuff right there. 642 00:40:03,239 --> 00:40:07,040 Speaker 3: Whistleblower protections, which have historically been better on paper than 643 00:40:07,280 --> 00:40:11,520 Speaker 3: in actual practice, have truly eroded in the time since 644 00:40:12,320 --> 00:40:15,520 Speaker 3: Assange was kind of at his peak. Consider that 645 00:40:15,640 --> 00:40:18,839 Speaker 3: other countries like China and Russia have already started intimidating 646 00:40:18,920 --> 00:40:23,680 Speaker 3: civilians for, quote, rumor mongering, end quote. As we've discussed 647 00:40:23,719 --> 00:40:28,719 Speaker 3: in the COVID nineteen episodes, the scientist that reported the 648 00:40:28,880 --> 00:40:32,600 Speaker 3: early signs of that virus was accused of that very 649 00:40:32,719 --> 00:40:38,040 Speaker 3: thing and essentially blackballed and treated like a criminal. And 650 00:40:38,200 --> 00:40:39,920 Speaker 3: that was a big part of what we can now 651 00:40:40,239 --> 00:40:42,160 Speaker 3: call some form of cover up. 652 00:40:42,280 --> 00:40:45,920 Speaker 1: Yeah. Yeah. And Russian doctors keep falling out of windows.
653 00:40:47,239 --> 00:40:50,000 Speaker 1: One fell out of a window after he made a 654 00:40:50,360 --> 00:40:54,200 Speaker 1: video update where he was protesting being forced to work despite 655 00:40:54,320 --> 00:40:58,680 Speaker 1: not having PPE, personal protective equipment, and despite having tested 656 00:40:58,800 --> 00:41:02,000 Speaker 1: positive for COVID; they still wanted him to work. And 657 00:41:02,080 --> 00:41:04,959 Speaker 1: then just a few days later he released a video 658 00:41:05,160 --> 00:41:07,920 Speaker 1: where he said that was all crazy. He denied any 659 00:41:08,000 --> 00:41:10,839 Speaker 1: of those claims, everything was fine. And then he fell 660 00:41:10,920 --> 00:41:13,600 Speaker 1: out of a window. I don't know if you guys 661 00:41:13,600 --> 00:41:15,640 Speaker 1: were keeping track of that. But yeah, make no mistake, 662 00:41:15,800 --> 00:41:19,759 Speaker 1: there is a war for information. That war is very old, 663 00:41:20,120 --> 00:41:22,640 Speaker 1: but now there is a war on information. 664 00:41:23,400 --> 00:41:23,680 Speaker 4: Hmm. 665 00:41:24,480 --> 00:41:28,600 Speaker 2: This is very true. And you can see that, depending 666 00:41:28,640 --> 00:41:33,200 Speaker 2: on the outcome of Julian Assange's situation, this could have 667 00:41:33,520 --> 00:41:38,360 Speaker 2: a supreme impact on the future of potential whistleblowers, like 668 00:41:38,480 --> 00:41:40,520 Speaker 2: you were saying, Noel. And it has a lot to 669 00:41:40,640 --> 00:41:45,439 Speaker 2: do with the fact that if you decide to leak 670 00:41:45,560 --> 00:41:50,440 Speaker 2: information somewhere, you could be hunted down, essentially, by 671 00:41:50,800 --> 00:41:54,520 Speaker 2: the government that you were blowing the whistle on. I mean, 672 00:41:54,600 --> 00:41:57,320 Speaker 2: it would really set a precedent for that.
And the 673 00:41:57,560 --> 00:42:02,320 Speaker 2: big problem here spans from the individual whistleblower to 674 00:42:02,800 --> 00:42:06,920 Speaker 2: the outlets, the major news outlets that cover stories about 675 00:42:07,239 --> 00:42:11,640 Speaker 2: leaked information. If you recall, during the Iraq War Logs 676 00:42:11,880 --> 00:42:15,600 Speaker 2: saga of twenty ten, as well as the diplomatic cables, 677 00:42:16,040 --> 00:42:21,320 Speaker 2: the DNC emails, these were major releases done in conjunction 678 00:42:21,560 --> 00:42:26,840 Speaker 2: with newspapers. It wasn't just WikiLeaks putting out information. 679 00:42:27,160 --> 00:42:29,880 Speaker 2: With the Iraq War Logs, that was the Guardian, that was 680 00:42:29,960 --> 00:42:34,719 Speaker 2: the New York Times. I think Der Spiegel released part of 681 00:42:34,800 --> 00:42:37,600 Speaker 2: that information. They all kind of segmented it out. It 682 00:42:37,760 --> 00:42:42,840 Speaker 2: was all, you know, major news outlets releasing leaked material 683 00:42:43,040 --> 00:42:47,120 Speaker 2: via WikiLeaks. And this is something we have to remember. 684 00:42:48,320 --> 00:42:51,800 Speaker 2: Major news outlets, let's say, like the Guardian or the 685 00:42:51,880 --> 00:42:55,520 Speaker 2: New York Times, these news outlets have on their sites 686 00:42:56,560 --> 00:43:01,720 Speaker 2: easily accessible methods for anyone to anonymously provide newsworthy information 687 00:43:01,960 --> 00:43:06,440 Speaker 2: to those outlets. Essentially, if we're thinking about it in 688 00:43:06,520 --> 00:43:11,520 Speaker 2: this framework, aiding and abetting potential whistleblowers. And we have 689 00:43:11,640 --> 00:43:14,400 Speaker 2: two examples of this.
You can go to the Guardian 690 00:43:14,520 --> 00:43:18,520 Speaker 2: dot com slash secure drop right now to check out 691 00:43:19,000 --> 00:43:21,280 Speaker 2: the way that they want you to give them leaked 692 00:43:21,320 --> 00:43:25,400 Speaker 2: information, or to leak them information securely and anonymously. You 693 00:43:25,440 --> 00:43:29,120 Speaker 2: can also go to ny times dot com slash tips 694 00:43:29,560 --> 00:43:31,239 Speaker 2: to see how The New York Times wants you to 695 00:43:31,320 --> 00:43:35,160 Speaker 2: do it. They're suggesting that you use WhatsApp and Signal 696 00:43:35,280 --> 00:43:37,920 Speaker 2: and SecureDrop to make contact with them and then 697 00:43:38,000 --> 00:43:43,399 Speaker 2: to send them materials. And you know, if the individual 698 00:43:43,440 --> 00:43:48,040 Speaker 2: whistleblower falls because of Julian Assange's situation, it would also 699 00:43:48,160 --> 00:43:53,000 Speaker 2: make sense that perhaps, you know, journalism as 700 00:43:53,040 --> 00:43:57,520 Speaker 2: we understand it, leaking important information, would become in 701 00:43:57,640 --> 00:43:58,640 Speaker 2: some ways illegal. 702 00:43:59,719 --> 00:44:05,759 Speaker 1: Yeah, and then maybe your past actions could become prosecutable offenses. 703 00:44:06,400 --> 00:44:08,520 Speaker 1: There's one thing. I did this on a different show. 704 00:44:08,600 --> 00:44:10,920 Speaker 1: I wanted to mention this to you guys because I 705 00:44:10,960 --> 00:44:13,520 Speaker 1: thought you would find it interesting, and hopefully you will too, 706 00:44:13,640 --> 00:44:19,759 Speaker 1: fellow listeners. Speaking of censorship, a completely different person who 707 00:44:19,920 --> 00:44:24,920 Speaker 1: perhaps sees themselves as a whistleblower has run into what 708 00:44:25,040 --> 00:44:29,040 Speaker 1: they describe as a conspiracy of censorship.
David Icke just 709 00:44:29,200 --> 00:44:33,400 Speaker 1: had his YouTube channel deleted. It will not be going back up. 710 00:44:33,880 --> 00:44:39,960 Speaker 1: I think around nine hundred thousand subscribers. His Facebook page 711 00:44:40,080 --> 00:44:45,000 Speaker 1: was also deleted, because the tech companies are instituting a 712 00:44:45,120 --> 00:44:51,880 Speaker 1: ban against anybody spreading misinformation about the coronavirus. So, as 713 00:44:51,960 --> 00:44:53,759 Speaker 1: you can imagine, there are a lot of people who 714 00:44:53,760 --> 00:44:56,120 Speaker 1: are saying, yes, you have to do this because he's 715 00:44:56,239 --> 00:44:59,560 Speaker 1: endangering lives, which is similar to the argument made by 716 00:44:59,600 --> 00:45:02,239 Speaker 1: the US government about Assange. And then you have people 717 00:45:02,320 --> 00:45:05,560 Speaker 1: who are saying, you know, I might not agree with 718 00:45:05,760 --> 00:45:10,880 Speaker 1: this guy at all, but he is exercising 719 00:45:10,960 --> 00:45:13,600 Speaker 1: free speech. It's a little sticky there, you know, when 720 00:45:13,640 --> 00:45:17,879 Speaker 1: we start to navigate the ideas of public safety, censorship, 721 00:45:18,560 --> 00:45:23,359 Speaker 1: and free speech, because with great speech comes great responsibility, does 722 00:45:23,400 --> 00:45:25,920 Speaker 1: it not? And you know, you have to ask yourself 723 00:45:25,960 --> 00:45:29,759 Speaker 1: about the platform there. But tech companies are a 724 00:45:29,760 --> 00:45:32,920 Speaker 1: little different, of course, because it's their sandbox. They make 725 00:45:32,960 --> 00:45:36,200 Speaker 1: the rules, they can do whatever they want. I just 726 00:45:36,280 --> 00:45:38,920 Speaker 1: think it's interesting that we're seeing more and more of 727 00:45:39,000 --> 00:45:45,080 Speaker 1: these information conflicts rising to the fore.
And this 728 00:45:45,280 --> 00:45:47,000 Speaker 1: is in no way an endorsement of David Icke. 729 00:45:48,320 --> 00:45:49,960 Speaker 2: Yeah, assuredly. 730 00:45:50,840 --> 00:45:51,040 Speaker 1: Yes. 731 00:45:52,000 --> 00:45:54,440 Speaker 2: I just wanted to bring this up while we're talking 732 00:45:54,520 --> 00:46:01,000 Speaker 2: about kind of the control of speech on outlets like that, 733 00:46:01,520 --> 00:46:05,200 Speaker 2: on platforms such as YouTube. I just wanted to mention 734 00:46:05,280 --> 00:46:06,600 Speaker 2: to you guys, I don't know if you saw the 735 00:46:06,640 --> 00:46:10,840 Speaker 2: email come through, we got contacted by our old pals 736 00:46:10,880 --> 00:46:14,520 Speaker 2: at All Time Conspiracies. Some of you older conspiracy realists 737 00:46:14,560 --> 00:46:17,200 Speaker 2: may remember a time on YouTube when we made some 738 00:46:17,360 --> 00:46:21,319 Speaker 2: videos with the guys on this channel called All Time Conspiracies, 739 00:46:21,920 --> 00:46:26,040 Speaker 2: and I did a little catching up with them, and 740 00:46:26,400 --> 00:46:29,440 Speaker 2: they've written back to us. They went through the exact 741 00:46:29,520 --> 00:46:34,480 Speaker 2: same issues we did with the YouTube algorithm, and their 742 00:46:34,600 --> 00:46:39,560 Speaker 2: content being suppressed essentially because they were talking about political issues, 743 00:46:40,200 --> 00:46:45,200 Speaker 2: conspiratorial issues, and they ended up having to shutter their 744 00:46:45,320 --> 00:46:48,840 Speaker 2: channel and they've moved on. So if anyone is interested, 745 00:46:49,880 --> 00:46:52,520 Speaker 2: they want to do an episode with us. They've got 746 00:46:52,560 --> 00:46:55,320 Speaker 2: a podcast now similar to ours.
So I thought that 747 00:46:55,440 --> 00:46:58,680 Speaker 2: might be a fun matchup, a mash up in the podcast 748 00:46:58,719 --> 00:47:02,080 Speaker 2: world now instead of on video, just to talk about 749 00:47:02,560 --> 00:47:04,120 Speaker 2: what happened to our YouTube channels. 750 00:47:04,600 --> 00:47:06,680 Speaker 1: I would be interested in that for sure. What do 751 00:47:06,719 --> 00:47:07,239 Speaker 1: you think, Noel? 752 00:47:07,960 --> 00:47:11,000 Speaker 3: Absolutely. Yeah, no, I mean, always down for a good 753 00:47:11,200 --> 00:47:12,680 Speaker 3: collab with like minded folks. 754 00:47:13,640 --> 00:47:18,160 Speaker 1: And this leads us to one more potential development. So 755 00:47:18,320 --> 00:47:21,239 Speaker 1: we are at a branch. There 756 00:47:21,360 --> 00:47:26,120 Speaker 1: are possible forks in the road, and as Yogi Berra 757 00:47:26,280 --> 00:47:29,040 Speaker 1: famously said, when you come to a fork 758 00:47:29,080 --> 00:47:33,520 Speaker 1: in the road, take it. Sorry, that joke's for you, Dad. 759 00:47:33,960 --> 00:47:40,080 Speaker 1: But we do see some very important, mutually exclusive 760 00:47:40,200 --> 00:47:43,919 Speaker 1: things on the horizon. One of these things is going 761 00:47:44,040 --> 00:47:49,880 Speaker 1: to happen to Julian Assange. One: he may die in 762 00:47:50,080 --> 00:47:53,719 Speaker 1: prison before that extradition hearing occurs or before it is 763 00:47:53,800 --> 00:47:59,080 Speaker 1: carried through, and then, for some very powerful people, a 764 00:47:59,160 --> 00:48:02,320 Speaker 1: big part of the problem would have solved itself. However, 765 00:48:02,400 --> 00:48:07,600 Speaker 1: we have to remember, you know, like Che Guevara famously 766 00:48:07,680 --> 00:48:11,320 Speaker 1: said in his last words, shoot, fool, you're only killing 767 00:48:11,400 --> 00:48:16,320 Speaker 1: a man.
WikiLeaks wouldn't die if Assange did; it would continue. 768 00:48:16,680 --> 00:48:19,920 Speaker 1: It's not a perfect system, and there are allegations, as 769 00:48:19,960 --> 00:48:22,040 Speaker 1: I believe one of us mentioned earlier, that it has 770 00:48:22,120 --> 00:48:25,280 Speaker 1: a shifting agenda of its own, you know, including pretty 771 00:48:25,320 --> 00:48:28,640 Speaker 1: serious allegations that it has partnered with Russia to wage some 772 00:48:28,800 --> 00:48:32,640 Speaker 1: sort of asymmetrical information warfare against the US. So one 773 00:48:32,680 --> 00:48:36,000 Speaker 1: possible occurrence is that he dies in prison. End of story. 774 00:48:36,400 --> 00:48:40,160 Speaker 2: And we're talking there, you know, about one person dying 775 00:48:40,200 --> 00:48:44,080 Speaker 2: and the movement continuing. We were also talking earlier about maybe 776 00:48:44,080 --> 00:48:47,120 Speaker 2: it would prevent other whistleblowers from coming forward. But if 777 00:48:47,160 --> 00:48:49,759 Speaker 2: he did die in prison, the other scenario is that 778 00:48:50,000 --> 00:48:54,520 Speaker 2: maybe more whistleblowers come forward, because they see it almost 779 00:48:54,600 --> 00:48:59,240 Speaker 2: as a martyrdom situation, where the way he was treated 780 00:48:59,400 --> 00:49:03,200 Speaker 2: and how it all went down, they want to stick 781 00:49:03,280 --> 00:49:06,960 Speaker 2: it to the man essentially and continue in that legacy 782 00:49:07,680 --> 00:49:08,920 Speaker 2: outside of what he leaked. 783 00:49:09,880 --> 00:49:14,480 Speaker 1: Mm hmm. And now we have to ask ourselves, what 784 00:49:14,600 --> 00:49:18,080 Speaker 1: about the precedents? If he goes to trial, it's going 785 00:49:18,160 --> 00:49:20,239 Speaker 1: to be even more crucial, you know what I mean? 786 00:49:20,480 --> 00:49:20,600 Speaker 2: Uh?
787 00:49:21,600 --> 00:49:24,680 Speaker 1: Then, I mean here, let's be a little bit idealistic, 788 00:49:25,239 --> 00:49:27,560 Speaker 1: at least from his perspective. What if Assange is 789 00:49:27,600 --> 00:49:30,239 Speaker 1: able to stay in the United Kingdom the same way, 790 00:49:30,360 --> 00:49:34,160 Speaker 1: for instance, that Edward Snowden is currently staying in Russia? Uh, 791 00:49:34,760 --> 00:49:38,160 Speaker 1: it probably won't happen. What if he's able to escape 792 00:49:38,280 --> 00:49:43,080 Speaker 1: and live on the run, you know, like that Beatles, 793 00:49:43,560 --> 00:49:45,680 Speaker 1: is it Beatles, or is it just Paul McCartney? Band on 794 00:49:45,719 --> 00:49:51,719 Speaker 1: the Run? That's it? Thank you. Yeah. So what if 795 00:49:51,760 --> 00:49:53,839 Speaker 1: he pulls that off for the rest of his life? 796 00:49:54,320 --> 00:49:56,080 Speaker 5: Assange on the run? 797 00:49:56,239 --> 00:49:59,960 Speaker 1: There we go. Yeah, we'll figure out who Sailor Sam 798 00:50:00,480 --> 00:50:02,879 Speaker 1: is in that regard. I guess it's Uncle Sam. 799 00:50:04,120 --> 00:50:07,600 Speaker 1: But here's the question. What if Assange goes 800 00:50:07,640 --> 00:50:10,000 Speaker 1: to a US court, gets extradited, goes to 801 00:50:10,120 --> 00:50:13,160 Speaker 1: trial, and gets found somehow not guilty, gets off 802 00:50:13,200 --> 00:50:15,840 Speaker 1: scot-free? Don't worry about that too much, because that 803 00:50:16,080 --> 00:50:18,520 Speaker 1: is almost definitely not going to happen.
804 00:50:19,640 --> 00:50:22,880 Speaker 2: Yeah, he would end up going to jail again, right? 805 00:50:23,160 --> 00:50:25,160 Speaker 2: And I could see a scenario like that, Ben, 806 00:50:25,239 --> 00:50:30,360 Speaker 2: where he does end up going to jail in the 807 00:50:30,480 --> 00:50:33,120 Speaker 2: United States and then just goes away, and then every 808 00:50:33,160 --> 00:50:36,080 Speaker 2: once in a while, one journalist from one outlet will 809 00:50:36,080 --> 00:50:38,440 Speaker 2: write a piece on it once a year. 810 00:50:39,600 --> 00:50:39,799 Speaker 5: Yeah. 811 00:50:40,120 --> 00:50:43,600 Speaker 1: Well, you know, I think one thing we can all 812 00:50:43,680 --> 00:50:52,240 Speaker 1: agree on, fellas, is that Wings has an enormous fan base, 813 00:50:52,400 --> 00:50:55,560 Speaker 1: you know, an army across the world, and they probably 814 00:50:55,640 --> 00:50:58,880 Speaker 1: won't let this story die, because, you know, now that 815 00:50:58,880 --> 00:51:01,360 Speaker 1: I'm thinking back on it, I'm pretty sure that the 816 00:51:01,600 --> 00:51:05,960 Speaker 1: entire catalog of Wings is terribly prescient to the career 817 00:51:06,400 --> 00:51:09,239 Speaker 1: and the controversy that is Julian Assange. I've got 818 00:51:09,280 --> 00:51:11,680 Speaker 1: to look at the lyrics. This goes deep. 819 00:51:11,800 --> 00:51:14,839 Speaker 1: We need to get Paul McCartney on the phone. Who 820 00:51:14,880 --> 00:51:17,160 Speaker 1: can call Paul McCartney? Matt? Noel, can you do it? 821 00:51:17,239 --> 00:51:19,160 Speaker 3: You know, it's funny you should say that. I'm actually 822 00:51:19,200 --> 00:51:23,040 Speaker 3: working on a podcast with Paul McCartney's guitarist, so I may 823 00:51:23,160 --> 00:51:27,600 Speaker 3: well have an in to Macca. Nice, perfect. They call 824 00:51:27,680 --> 00:51:30,839 Speaker 3: him Macca.
They call him Macca, like the guitarist calls 825 00:51:30,920 --> 00:51:33,520 Speaker 3: him Macca. That's what big Paul fans 826 00:51:33,560 --> 00:51:38,359 Speaker 3: call him, McCartney, Macca. Oh okay, wow wow, it's 827 00:51:38,440 --> 00:51:39,280 Speaker 3: like a pet name. 828 00:51:39,680 --> 00:51:43,600 Speaker 1: Or like slang for McDonald's in Australia, Macca's. 829 00:51:45,320 --> 00:51:46,320 Speaker 4: I just call him Paul. 830 00:51:47,280 --> 00:51:50,520 Speaker 1: So that's where we are now. And to you 831 00:51:50,800 --> 00:51:54,120 Speaker 1: from failing hands we throw the torch, be yours to 832 00:51:54,239 --> 00:51:57,520 Speaker 1: hold it high. If ye break faith with us who die, 833 00:51:57,640 --> 00:52:00,319 Speaker 1: we shall not sleep, though poppies grow in Flanders fields. 834 00:52:00,400 --> 00:52:03,360 Speaker 1: Went on a little longer than it needed to, but 835 00:52:04,120 --> 00:52:06,920 Speaker 1: you get the picture. We want to know two things 836 00:52:06,960 --> 00:52:10,239 Speaker 1: from you. First, what do you think should happen to 837 00:52:10,400 --> 00:52:13,239 Speaker 1: Julian Assange and others like him, or those who come 838 00:52:13,320 --> 00:52:17,200 Speaker 1: after him? Secondly, what do you think will happen to 839 00:52:17,680 --> 00:52:22,600 Speaker 1: this notorious whistleblowing mastermind? Let us know. You can find 840 00:52:22,680 --> 00:52:24,480 Speaker 1: us on Facebook, you can find us on Instagram, you 841 00:52:24,520 --> 00:52:29,560 Speaker 1: can find us on Twitter, as a show and as individuals. 842 00:52:29,719 --> 00:52:32,319 Speaker 2: Yes you can. And just one last question I want 843 00:52:32,320 --> 00:52:34,760 Speaker 2: to pose to you: what do you think the next 844 00:52:35,000 --> 00:52:38,200 Speaker 2: big leak will be about? Because there will be another 845 00:52:38,239 --> 00:52:38,719 Speaker 2: big leak.
846 00:52:38,800 --> 00:52:40,640 Speaker 1: I'm gonna blow this Wings thing wide open. 847 00:52:40,719 --> 00:52:44,160 Speaker 2: I'm telling you. All right, well, we've got a nomination 848 00:52:44,360 --> 00:52:47,480 Speaker 2: from this side. Let us know what you think. Like 849 00:52:47,600 --> 00:52:51,000 Speaker 2: Ben said, you can find us usually at Conspiracy Stuff, 850 00:52:51,239 --> 00:52:55,960 Speaker 2: sometimes at Conspiracy Stuff Show. Check out our Facebook group, 851 00:52:56,280 --> 00:53:00,120 Speaker 2: Here's Where It Gets Crazy. It's fantastic stuff written by 852 00:53:00,200 --> 00:53:03,279 Speaker 2: fantastic people like you. So go check it out. Talk 853 00:53:03,320 --> 00:53:06,680 Speaker 2: about the shows. Let's discuss, you know, some of the minutiae. 854 00:53:07,120 --> 00:53:11,080 Speaker 2: Let's talk about future episodes. Let's hang out there together 855 00:53:11,239 --> 00:53:14,040 Speaker 2: on Facebook, if you're into that kind of thing. If 856 00:53:14,040 --> 00:53:15,759 Speaker 2: you're not, you can always give us a call. 857 00:53:16,200 --> 00:53:18,680 Speaker 3: Our number is one eight three three S T D W 858 00:53:19,080 --> 00:53:22,840 Speaker 3: Y T K. Leave a message at the sound of 859 00:53:22,960 --> 00:53:28,160 Speaker 3: Ben's tone, his sultry tones, and it'll go to us. 860 00:53:28,600 --> 00:53:30,359 Speaker 3: All three of us have access to it, though Matt 861 00:53:30,440 --> 00:53:34,040 Speaker 3: is still kind of the gatekeeper. I think Ben is 862 00:53:34,120 --> 00:53:37,200 Speaker 3: more the Keymaster, but Matt is 863 00:53:37,239 --> 00:53:37,840 Speaker 3: the Gatekeeper. 864 00:53:37,960 --> 00:53:40,959 Speaker 2: Wait, does that mean we have to copulate? I forget 865 00:53:41,000 --> 00:53:44,239 Speaker 2: how it went in Ghostbusters, man. Whatever, whatever you need, 866 00:53:44,760 --> 00:53:47,520 Speaker 2: dude, whatever you need. So.
867 00:53:49,200 --> 00:53:52,360 Speaker 1: You know, I do wonder which Ghostbusters we would be. 868 00:53:52,600 --> 00:53:56,839 Speaker 1: That's funny. I got some spoilers about Dan 869 00:53:56,920 --> 00:53:59,640 Speaker 1: Aykroyd possibly in the future, but we'll catch up 870 00:53:59,640 --> 00:54:03,239 Speaker 1: on that down the road. Yes, as Noel said, 871 00:54:04,200 --> 00:54:07,359 Speaker 1: give us a call. Matt is our 872 00:54:07,480 --> 00:54:10,400 Speaker 1: phone guru, but we do all have access and we 873 00:54:10,480 --> 00:54:13,480 Speaker 1: are all listening in. It makes our day to hear 874 00:54:13,600 --> 00:54:16,440 Speaker 1: from you. Just let us know if you're comfortable with 875 00:54:16,560 --> 00:54:20,880 Speaker 1: your story, your name, or the specifics of your account 876 00:54:20,920 --> 00:54:23,480 Speaker 1: being shared on air, because we don't want to compromise you. 877 00:54:24,000 --> 00:54:27,320 Speaker 2: Or if you're cool with us, you know, intentionally or 878 00:54:27,440 --> 00:54:28,879 Speaker 2: unintentionally calling you back. 879 00:54:29,160 --> 00:54:31,600 Speaker 1: Yes, yes, if you're cool with us accidentally 880 00:54:31,640 --> 00:54:35,879 Speaker 1: butt dialing you, full steam ahead. And if you hate all 881 00:54:35,960 --> 00:54:39,280 Speaker 1: of that stuff for one reason or another, we totally 882 00:54:39,320 --> 00:54:43,279 Speaker 1: get it, and we as always have a backup plan. 883 00:54:43,719 --> 00:54:45,320 Speaker 1: You can email us directly. 884 00:54:45,600 --> 00:54:48,200 Speaker 4: We are conspiracy at iHeartRadio dot com. 885 00:55:06,920 --> 00:55:08,960 Speaker 2: Stuff They Don't Want You to Know is a production 886 00:55:09,120 --> 00:55:13,600 Speaker 2: of iHeartRadio.
For more podcasts from iHeartRadio, visit the iHeartRadio app, 887 00:55:13,719 --> 00:55:16,560 Speaker 2: Apple Podcasts, or wherever you listen to your favorite shows.