Speaker 1: I'm John Sipher and I'm Jerry O'Shea.
Speaker 2: I was a CIA officer stationed around the world in high-threat posts in Europe, Russia, and in Asia.
Speaker 1: And I served in Africa, Asia, Europe, the Middle East, and in war zones. We sometimes created conspiracies to deceive our adversaries.
Speaker 2: Now we're going to use our expertise to deconstruct conspiracy theories large and small.
Speaker 1: Could they be true, or are we being manipulated?
Speaker 2: This is Mission Implausible.
Speaker 3: We now continue with part two of our conversation with Robyn Curnow, former journalist for CNN and all-around South African.
Speaker 2: Actually, South Africans are having quite an impact on us here politically in the United States now, right? So there's the tech bros: Elon Musk is the best known, David Sacks, Peter Thiel, who have strong views about what the US government should be, what should happen in the future, where we are going with technology. And they're playing a very large role in our politics. Is there something, based on your experience with South Africa, that is South African about that? As we look at them and try to figure out what impact it's having on us, is there something that we should know, based on their background in South Africa, that will help us put it into context?
Speaker 4: A lot of people have asked me this, and I've tried to listen to them as well. I'll listen to Sacks on the All-In podcast; there's also David Friedberg, who's also South African. Elon, you know, I've listened to a lot of his conversations. And Chris Cuomo asked me this, and then got Elon's dad on his show and asked him, you know, were you a racist, an apartheid apologist? And Elon's father dealt with it; he said no. The other question with that, which sometimes I think is also dangerous, is: is anybody who isn't black who grew up in white South Africa racist, and therefore a crazy person who believes in things that aren't cool?
And there's also that you've got to be careful that you're not painting an entire generation, one that just happened, unfortunately, to be born under a totalitarian race government, so that you all get tarred with some sort of race brush.
Speaker 2: Yeah, but the things that they're saying, this sort of viewpoint about what the role of government is and what the future is... I don't necessarily think of it as a racist thing, but there's something. Is there something cultural that's behind that sort of mentality? I don't know the answer, maybe no, but...
Speaker 4: The only way I can... I could probably bring some sort of light into it. So if you think about conspiracy theories, and I say "we" because I'm almost the same age as all of those men, and we grew up in the Johannesburg-Pretoria area, so, you know, I had almost the same childhood. And news was censored. There was the SABC; it was the only news. The apartheid government was extremely good at propaganda and censorship. We lived in isolation. It was deliberate, by the outside world, of course; like the Soviet Union, the South Africans were cut off. The apartheid media machine was vicious. It was highly proficient at what it did, and it worked hand in hand with the apartheid government and the apartheid intelligence authorities. So when you grow up in an environment where you don't know anything, and this is, I think, true of any totalitarian state, because South Africa and apartheid at its basis was a Christian nationalist totalitarian state, when you grow up in that with no information, your instinct is to just not trust anything, if you're not a believer in the state, which, you know, none of us were, I don't think. You know, the Sackses, they were a Jewish South African family who moved to the States. So did the Friedbergs. I think Elon Musk hated it, and they got out of there as quickly as possible. I mean, he was out of there before we could.
And, you know, my family as well, liberal white English-speaking South Africans: you were stuck in this place and you couldn't leave. And so your instinct was, I don't believe any of this stuff, so I'm just not going to believe anything, or you take everything with a pinch of salt. Or, in that environment, you then say, okay, this seems to be vaguely true; maybe I saw this, the pass books or some violence on the street. But then there are a bunch of gaps: why is this happening, who gave the orders, what are the implications? So you start filling in the gaps, and that's where the conspiracy theories come in. And so you grow up either in a society in which there's no information, or you start believing maybe this is true. I think an American diplomat once described the old South Africa as the republic of rumors. And so, you know, information became power, because if you knew something, you could protect yourself, you could use it, or you could just be an ostrich and bury your head in the sand. So South Africans, black and white, tend to have a very instinctive reaction to information sometimes, because we're like, well, I don't know if that's true. You know, is it an official source? Who's telling me this?
Speaker 1: Why?
Speaker 4: Why are they telling me this? Why are they telling me in this way? Is this something I need to read between the lines of, or is it not? So I think that that kind of anarchist, maybe kind of deconstructionist, Peter Thiel version of the world might have come from that. I don't know. I think maybe Elon's obsession with free speech, to the point that it's just overwhelming, maybe comes from growing up in an environment where speech was so limited and, no, you couldn't say anything. I still call home, and my grandmother will say, well, just be careful what you say on the telephone. Don't say that, Robyn. You never know who's listening.
Now, nobody's going to be listening to a hundred-year-old lady and her granddaughter sitting in the northern suburbs of Johannesburg. But that instinct that somebody's listening, and that you've got to fight the system, can maybe create the Elons and the Peter Thiels and their sort of nihilistic views of the world sometimes. And that would be my only explanation. I think the interesting thing, when we talk about information and conspiracy: in those days, in a totalitarian state with no information, you used conspiracy theories, rumors, half-truths, myths, to kind of fill in the blanks of what you didn't know. And then in a place like America now, or in the sort of social media world we're living in, there's too much information, so people are so overwhelmed by all the information coming at them that they then start to distrust it all, and then try and fill in the bits. So I think there's the sublime and the ridiculous here, and that is my kind of convoluted way of trying to understand the worldviews, perhaps, of the Thiels and the Sackses and the Musks. I don't know if you think that makes sense.
Speaker 1: So there's a word I use, and I use lots of words that I don't actually understand: it's algorithm, right? Algorithms give us the news that we want, right? Well, sort of. I think that, you know, if you're interested in something, it tends to come your way, to sell you more snow tires or beer, you know, whatever they're selling. It also goes back, again, to when I was living in the Third World, and I'm not sure that's the right term, but in Africa and the Middle East and Asia, people often use conspiracy theories and lack of information, and Hollywood, weirdly, to make themselves the heroes of their own movie.
And so I was shocked when I first went to Zimbabwe, where it was clear that everyone in the Zimbabwean government thought that the president would wake up and want to know what's going on in Zimbabwe. What is Mugabe up to now? What about the white farmers? What's his concern on the Shona and Ndebele split, right? What are his real views on it? Like, we obviously want to take over Zimbabwe and Botswana. And I found that in the Middle East as well, where people are like, yeah, you know, the president wants to know about the Ismaili angle on it. It's like, you know, look, you'll be lucky if they can figure out Shia, you know. And also with the US, though, I find it through family members who are into conspiracy theories: like, I get it, I'm the hero, I figured this out, I have a narrative, and often it's me against the world, right? It's like they're almost superheroes. I was wondering, since you've talked to a lot of people around the world, do you get a sense that there's also this sense of ego as well?
Speaker 4: Yeah, but isn't that it? It's sort of like...
Speaker 1: If you... and with Jesus, right, he's there for you.
Speaker 4: Yeah, and I think, isn't that the whole point of it? You know, in times of change, a conspiracy theory will offer you a false sense of security or an explanation for the unknown. A conspiracy theory, to your point, also creates a social identity. So whether it's the bridge ladies discussing something in the northern suburbs, or whether it's a group of Shona having a conversation over warm beer, at the end of the day there's a social "oh, did you hear this?", and we feel a sense of belonging. Superiority, I think, comes into it, but victimhood too. Sometimes I think they're, you know, different sides of the coin.
The sense of superiority, or shared victimhood, that you'd get from sharing a conspiracy theory, and that confirmation bias that comes from it: we all agree that we're either being done in by somebody together, the victimhood one, or we know something that those other folks down the road don't know, and we have evidence, whatever it is, you know. That also creates that confirmation bias. So yes. And I think in Africa, what's so funny, and you probably saw it too, I spent a lot of time also being fascinated by witchcraft. You mentioned it, and I think the World Health Organization said ninety-five percent of sub-Saharan Africans visit a witch doctor, a sangoma or inyanga, first, before they go to a conventional doctor. Now this cuts across class, and this is generally black Africans, tribal Africans, but it also is infused within sort of white society too, I would argue, just with a different type of inyanga: you're going to trust your, you know, your physiotherapist before you trust... or, whatever, I think that's the wrong example; you're going to trust your acupuncturist before you trust the doctor. But for me, I always found it so interesting how that sense of superiority played into the conspiracy theories around health and power and social status in Africa. So the witch doctor would be the kind of dispenser of the conspiracy theory, whether it's sleeping with a virgin to cure your AIDS, or whether it's to chase away a cheating wife, or whether it's to get good luck so you can be employed next year. And various tasks are given out, whether you slaughter a goat or a cow, or whether you go and you walk into the, you know, whatever the different tribal rituals are. That then creates this sort of narrative that the inyanga, the sangoma, has power, and that if you do this, you too will get power.
And often, I found it fascinating that the people who rejected the conspiracy theories, or said, no, this is BS, or I disagree with this, or no, you shouldn't rape children, or your wife wasn't cheating on you... often those people would then be punished. It wouldn't be the sangoma who would be punished. And so these witchcraft killings that would often turn up, these sort of tribalized killings, were often that sense of superiority by the group silencing somebody who was questioning their confirmation bias around whatever that theory was. And that sense of superiority often came, I think, because people were threatened by somebody who was different or had a different point of view. And I think that group dynamic plays into everything. It plays into stuff we're seeing here in the States. And you don't have to be in rural Venda in the Soutpansberg, with someone dispensing information from a mud hut, to know that those instincts can be amplified.
Speaker 2: I saw it in our old world, right? Everybody's the hero of their own movie: I'm armed, I can fight off the bad guys when they come. In intelligence, you'd see the same thing. I remember especially when the Snowden thing happened and he left, there was this view: oh my god, you know, they're going after my privacy, they're coming after you. And if you work in intelligence, you're like, nobody cares about you. No one is looking at the Americans' stuff. There are millions and millions, hundreds of millions, of Americans. This is not something that we are interested in, have time for, have legal reason to do. And it goes further. Like, when you talk to people who are voting, it's, the government's not solving my problems. Everybody sees the world through their own lens, which is not surprising, but they take it incredibly seriously.
Speaker 4: But why do you think America is such a bogeyman?
Because there is, even now, the sense of... and that's one of the things I've realized with this podcast: the sort of anti-Americanism clearly is a fear of what Trump is doing and the sort of global implications of his policies. But there's also a very easy anti-Americanism, even during the Obama days, when people loved him. You know: you guys are listening, or you're following, you know, are you following me? And, you know, nobody's listening. Americans have far more important things to deal with right now than, like you said, whether a cabinet meeting in some random country is going one way or the other.
Speaker 1: Let's pause for a second. We'll be right back.
Speaker 1: I remember a senior Iraqi official, who was running one of their versions of the FBI, said to me that he doesn't trust the United States because President Obama is anti-Shia. I said, what do you mean? His father was a Sunni; he's a Sunni. He had bought into the conspiracy theory that Obama was born in Kenya and that he's a Muslim, right? That was a given to him. And as a Sunni, he doesn't like that Shia-dominated Iraqi government. And I had to try to dissuade him: Obama knows the difference between Shia and Sunni a little bit, but, like, he doesn't really care about that.
Speaker 4: And that plays into a superiority thing, like, if that was... it's your grouping.
Speaker 1: But he was riffing off of American conspiracy theories, right, that he was picking up on the internet. Robyn, there is one thing. So we've got two CIA guys and we've got an award-winning journalist here. I just want to say, for the record...
Speaker 4: What record?
Speaker 1: Well, I don't know, whatever, anybody...
Speaker 4: ...all the people who are listening in and taking that down.
Speaker 2: Everybody's the hero of their own story. Someone's keeping a record of what we're saying.
Speaker 1: Really, truly, come on. No shit: CIA does not use journalistic cover.
It is like death to us. Now, I think, with the Church Committee, like, before the nineteen-seventies, you know, fifty years ago, we may have dabbled in that, but...
Speaker 4: But you can protest that until the cows come home, and nobody's going to believe you: that a journalist in a place was not... that I wasn't feeding back information to the president. I just think we know that.
Speaker 1: The way to tell the CIA guy at a diplomatic function, if you're a journalist, is the person who runs away from you, because we have to report it, and it's like, it's trouble.
Speaker 4: I would love to know now who... yeah. Because, also, I'm not particularly interested in your point of view as a journalist either. I'm trying to get my own stuff. My output is different from...
Speaker 2: Our outputs are very thin too. We're all interested in, like, very specific security-related issues, not wider cultural and bigger news-related things.
Speaker 4: And I mean, you know, the ambassadors... you can see, with Snowden, when the cables were leaked, I mean, the ambassadors are painting a picture of, you know, the power elite within a country. Often that's interesting anecdotal stuff; you know, some of those cables were fascinating. But it wasn't, I'm assuming, what you guys were doing. You're not particularly interested in who's on top and who isn't, unless it's a very specific angle that you're trying to get. I was more interested in painting a picture of what was important and what was happening, and again, probably narrow: how is Mandela? Is he dying? Will he die tomorrow?
I spent a whole year of my life having that conversation, and that was important, I think, for the US diplomatic corps, because they needed to know if the president, if Obama, would be able to come to Mandela's funeral, for example. That was a political, state-level question, nothing to do with what you guys were doing, obviously. But I think a lot of countries do use journalists as cover, and there are a lot of stories of that. But yes, not the Americans. We have put it on the record for all the Chinese who are listening.
Speaker 1: But I did a lot of counterterrorism in my time, and I remember Dick Cheney, being Vice President of the United States, being really upset, saying: Peter Bergen, a journalist, gets into the cave and sits with Osama bin Laden, and we, the CIA... we, the US government, you, the CIA, you can't do this. And the answer was: bin Laden actually believes that CIA doesn't use journalistic cover, because we don't. So we had convinced bin Laden that we don't do it. I don't know, can you tell me?
Speaker 2: I think...
Speaker 4: I think we've had this conversation, and I said our output was different. I think the method of gathering is the same. It's getting information from people, and more often than not, that comes from connecting with people, looking them in the eye, getting them to trust you, keeping them safe. The storytelling, in terms of how you get somebody to share something with you... I was just putting it out publicly; you were just putting it out internally. It's probably the same skill set on some level, but very different, no doubt.
Speaker 2: That's why we like talking to journalists. There are real similarities in what your work was and what our work was. But the one thing that's not similar, which I do want to ask you about, is that in your career you've interviewed lots of famous and interesting and powerful people.
Can you talk a little bit about who you found most interesting? I know you've talked to Oprah, for example; is she willing to accept blame for Dr. Oz? Who have you interviewed? What was interesting? Any anecdotes that those of us who live in a sort of humdrum world would find...
Speaker 1: ...especially Mandela, who is still a hero of mine.
Speaker 4: Yeah, and I think, again, I wanted to meet those people and interview those people because I was interested in what made them tick. And I'm particularly interested in people who aren't that nice. I'm sure you're saying some of the most fascinating...
Speaker 1: ...people, which is why you're here...
Speaker 4: Exactly, the sort of nefarious. I just like the way power works, and I like to watch it, and I also like to watch people when they leave power, because I think that's sometimes the most interesting place for someone to be. And with that in mind, funnily enough, one of the most interesting people I found, who surprised me, because I don't get surprised by much, was actually George W. Bush. He surprised me, a, because he was so likable, and I went into the interview kind of thinking... And it was under an acacia tree in Zambia. He was opening up a clinic to encourage HPV vaccines. So he had saved all these lives with PEPFAR, by giving HIV-positive Africans antiretrovirals, which is an amazing program that saved so many lives; it was a Bush administration thing. And then he realized a lot of people, a lot of women, were dying of cervical cancer because their immune systems were down, so he was saving them from HIV/AIDS and then they were getting cancer. And so just that... that's a George W. Bush conversation, you know what I mean. He just surprised me on so many levels. And this was during the Snowden time, so I actually said to him, I said, do you think Snowden is a traitor? And he was like, absolutely.
He was very angry about Snowden. He also seemed to be fascinated by you guys. I also asked him about waterboarding, and he seemed to be very indignant that it wasn't his fault, the whole waterboarding stuff, because Congress had given permission, or there'd been, like, a whole process. And he said, they knew, everybody knew what they were signing off on. So why am I catching the crap? Why am I getting in trouble for something? And he was wounded and hurt, as if he had followed procedure and now he was being slapped on the wrist. And so that was interesting. But then he just became very chatty, and I could see why he had made it as president, because he was so authentic. You didn't have to agree with his politics, but you saw what you got, and I understood then the motivations behind the Americans. Mandela was the complete opposite. I remember sitting with Mandela in one of his sort of last official interviews. He was ninety. He was at home in Qunu, which is in the rural Eastern Cape, and he was looking out of his window at his cattle. And cattle are very important wealth, particularly to a tribal leader in Southern Africa. And I was sitting there with a cameraman and a few other journalists, and he just wanted to talk about his cattle. He just wanted to talk about what he saw out of his window. He had grown up in those hills, walking barefoot as a sort of young boy. And he was just so difficult to make small talk with, you know. I think someone like Bush was very easy to sort of shoot the breeze with; we talked about all sorts of things off the record. Mandela was never really very easy to be off the record with.
You couldn't massage him before an interview, you know, and soften him up, or just make yourself seem a little less intimidating, which is what you would do as a journalist, and probably when you're meeting a source: you're self-deprecating, or you make a joke, you try and sort of make yourself not too intimidating, because it's CNN or whatever. Mandela was just very difficult to make small talk with. And then even when he spoke, he was very guarded. I mean, that's what happens when you spend twenty-seven years incarcerated by a racist regime: you learn to keep things very close to your chest. So Mandela was very difficult to get to know. And funnily enough, literally this conversation I'm having with you now is the one I had with Michelle Obama before I interviewed her. The one thing we softened each other up about was both of our inability to break the ice with Mandela.
Speaker 2: Interesting.
Speaker 4: He was just, even to Michelle Obama, the sitting First Lady at the time... he wasn't a natural raconteur, or he didn't let you in, I remember, at that time, or he just wasn't that kind of guy. So I think it's about trying to understand people's motivations and why they act in a certain way, and what you can get out of them in terms of information. And you have to be flexible, and you have to be organic, and you have to read somebody very well to be able to extract what you need from them. And yeah, Bush was an open book; Mandela, I mean, a closed book. I'm probably forgetting some of the others. Oprah was just on; she had her on switch on. She was fully one hundred and twenty percent, turned up to eleven, Oprah. So she was very huggy. She kind of touched me and hugged me a lot, which is, you know, an interesting device. But yeah, I mean, I'm fascinated by... I mean, I was there at Fidel's...
I didn't interview Fidel; you mentioned it at the beginning. I didn't ever interview Fidel, but he obviously was a huge supporter of the South African liberation movements, and so he was there a lot in South Africa, supporting the ANC. So I saw him a few times. Same with Gaddafi. I don't know about you guys, I don't know how much you guys saw Gaddafi in Africa. But seeing Gaddafi with his ladies in tow, and Gaddafi in full tinfoil regalia, like... yeah, he had these hot chicks as his sort of personal bodyguard.
Speaker 2: If I ever make it as a dictator, I'll probably go that way too.
Speaker 1: He used to make moves on female journalists as well, right? He would send them special clothes before they interviewed him, and it was really... what did you get? I'm just saying, I don't think you got anything, did you?
Speaker 4: Look, you must look so pretty in one of Gaddafi's dresses. I don't know. You tell me: is that what you miss about it? It's not about being in the know; it's more about watching people and wondering what makes them tick. I think that's what I missed from being in the field, and I missed it when I moved to Atlanta and was anchoring a show, that sort of disconnect between being in the field and being in a studio. It's probably like being in the field and then going to Langley: you feel like a little bit of you is left behind in the field.
Speaker 1: We'll be right back in a moment.
Speaker 1: Of all the people you've talked about, the famous ones, were there people who really impressed you? You've been involved in change in South Africa, the death of Mugabe, at least a change in Zimbabwe, Israel-Palestine issues, Russia-Ukraine issues. The biggest heroes often are people that are unsung, but you've talked to a lot of those people, people that make you feel good about being a human being. Real heroes. Were there people out there, Israelis or Palestinians or Russian dissidents?
And I'm just thinking of this, and I'm trying to remember her name, this Ukrainian journalist who was just tortured and murdered by the Russians, right? Victoria. Yeah. They returned her; she was detained for a year without charges, finally returned without her eyeballs, brain, or internal organs. It appears she had been strangled and tortured, electrocuted on her feet. This is a journalist who risked everything, right? There are real heroes out there who were journalists, and people that journalists work with. Same with CIA: some of our assets are, like, true heroes. I'm just wondering if there are one or two sort of people or stories that made you feel good about being a journalist or a human being.
Speaker 4: I was always in awe of grandmothers. There's something quite astounding about a grandmother in Africa in particular. They're like Atlas: they hold the world on their shoulders, particularly during the HIV/AIDS crisis, particularly during some of the political violence in the townships. The younger people were either killed, or died, or were exiled, and a lot of these grandmothers were raising their grandchildren. And for me, there are a number of images of grandmothers who I've encountered over my career, and I will always think to myself, you know, they're the backbone of society, and they're some of the strongest, bravest people I have ever come across. To go back to our original thing, like, what's happened with the world right now, and why is there a sense of people not believing the news, the stuff that is important? I think a lot of it is that these are moral conundrums, these are complicated issues, and there's this sort of binary kind of perspective, this very simplified version of the world, that is being put out. And that bothers me: the sort of black-and-white, binary lack of nuance, whether you want to couch it in oppressed versus oppressor, the colonizer versus the colonized.
I think the oversimplification of the effects after October the seventh, all the stuff we've seen in Southern Africa, you know, Syria and the terrorists...
Speaker 2: It goes back to that personalized stuff. Like, the sort of view is: the world's out there, there's right and wrong, something went bad, I would have chosen right, you people chose wrong. That's a very simple narrative, and I look good in the narrative. Whereas these things that we're talking about are situations where there wasn't a right; there were gradations of different things, whether you believe in truth or whether you believe in equality. I mean, those are hard issues, and you almost have to lay out a background to someone and say: okay, here's the decision you are stuck with. How do you make that decision? And if people want to get out of it, they're like, well, I would, you know, move. No, no, no, we're stuck. This is the decision now. Both choices are bad, both choices have consequences. That's where we are. Everybody wants to back out and make it easy, right and wrong. And if it was just right and wrong, we would almost always make the right decision.
Speaker 4: Yeah. And I don't think, you know... and that's the problem right now. I think about, like... we didn't even really talk about Zimbabwe, and I think about all the coverage, and the times I was detained in Zimbabwe, and how Mugabe just played footsie with us, the media, the whole time. In the end, I knew what was happening because I had access; the way you do things is to build sources within Mugabe's own inner circle. And so in the end, in those last few hours, I was getting messages from within the Blue Roof by the people who were with him.
And I think a lot of people see journalism now as, you know, these champions for truth, and then you, you know, put a thirty-second video on TikTok and this will explain why, you know, this is right. Whereas if you try and explain why you're building relationships with the bad guys, so that in the end you understand the impact in a way that kind of shows the nuance... do you know what I mean? I think that depth, that analog way of reporting, old school, I think that's what we're missing. And, you know, I sound like a "when we were young, things were better" person. But I do think some of the nuance and the complexity gets lost, particularly with the younger generation, and the need to see someone else's point of view.
Speaker 1: You've got to spend some of those twenty minutes with these things.
Speaker 4: And, or, you've got to go away into the bush for the week, you know, and you say: listen, I can't be doing live shots and updating my YouTube and tweeting and putting it on Instagram and doing a little short vignette reel. Do you know what I mean? You just need to go and immerse yourself, get the story, come back. And I think CNN is still doing that to some extent. But, you know, it's a double... you know, I think all these media organizations are trying to hold on to that analog way of storytelling; at the same time, they're also trying to commoditize the algorithm. And it's a very uncomfortable juggling act, and I think that's the kind of messiness of where we are now.
Speaker 2: You know, we could talk to you forever, and hopefully we can do this again.
Speaker 4: Absolutely.
Speaker 2: To the people out there: it is very much worth listening to her podcast, as she tries to explain our crazy country to the rest of the world. So thank you for what you're doing, and thanks for your time with us.
Speaker 4: Thanks, Jerry. Thanks, John. Appreciate it.
We also thank you for all the work you did for this great, crazy, beautiful country.
Speaker 3: Mission Implausible is produced by Adam Davidson, Jerry O'Shea, John Sipher, and Jonathan Stern. The associate producer is Rachel Harner. Mission Implausible is a production of Honorable Mention and Abominable Pictures for iHeart Podcasts.