1 00:00:00,120 --> 00:00:03,320 Speaker 1: We are here at the UBS Australasia Conference in Sydney 2 00:00:03,320 --> 00:00:06,160 Speaker 1: on day one and really excited about this conversation coming up. 3 00:00:06,200 --> 00:00:08,639 Speaker 1: Joining me now is Michael Rogers, who's a senior advisor 4 00:00:08,640 --> 00:00:12,080 Speaker 1: at Bondi Partners. He's also the former commander of 5 00:00:12,240 --> 00:00:15,720 Speaker 1: US Cyber Command and the former National Security Agency Director. And 6 00:00:16,480 --> 00:00:18,239 Speaker 1: as we were just chatting about in the break, nothing 7 00:00:18,320 --> 00:00:21,080 Speaker 1: going on, nothing pertinent to your wealth of experience at 8 00:00:21,079 --> 00:00:24,799 Speaker 1: all, we jest. But could we get your 9 00:00:24,880 --> 00:00:27,920 Speaker 1: reaction to the election results? And what do you think, 10 00:00:28,480 --> 00:00:32,120 Speaker 1: having worked under both Presidents Obama and Trump, what 11 00:00:32,200 --> 00:00:34,839 Speaker 1: do you think Trump two point zero is going to 12 00:00:34,880 --> 00:00:38,240 Speaker 1: look like in terms of his approach to security and cybersecurity? 13 00:00:38,560 --> 00:00:42,800 Speaker 2: So clearly President Trump won by a clear majority, not 14 00:00:42,880 --> 00:00:45,320 Speaker 2: a landslide, but still a clear majority, fifty-one percent 15 00:00:45,440 --> 00:00:49,360 Speaker 2: or so of the popular vote. I think what the 16 00:00:49,400 --> 00:00:51,959 Speaker 2: election shows, particularly in his mind, is he now has 17 00:00:51,960 --> 00:00:54,880 Speaker 2: a mandate for change. That his election was a result 18 00:00:54,920 --> 00:00:59,480 Speaker 2: of broad dissatisfaction, not universal, but broad dissatisfaction with the 19 00:00:59,480 --> 00:01:03,360 Speaker 2: policy of the Biden administration. And I suspect President Trump's 20 00:01:03,400 --> 00:01:05,039 Speaker 2: view is, I now have a mandate to change and 21 00:01:05,080 --> 00:01:07,880 Speaker 2: I will do that. He'll probably focus, as he's indicated 22 00:01:07,959 --> 00:01:11,160 Speaker 2: so far, along economic lines. He clearly has a different vision 23 00:01:11,480 --> 00:01:14,720 Speaker 2: on trade and tariff policy. His view is prices have gotten out 24 00:01:14,720 --> 00:01:17,000 Speaker 2: of control in the US, and he wants to change that. 25 00:01:17,480 --> 00:01:19,880 Speaker 2: He clearly articulated concerns about, hey, we've got to get 26 00:01:19,880 --> 00:01:21,720 Speaker 2: tougher on the border, we've got to control the flow 27 00:01:21,720 --> 00:01:24,760 Speaker 2: of illegal immigration. I suspect that's going to be a priority. 28 00:01:24,800 --> 00:01:26,960 Speaker 2: And then he has also said, hey, the world more 29 00:01:27,000 --> 00:01:30,000 Speaker 2: broadly outside the United States is in a worse situation 30 00:01:30,040 --> 00:01:31,360 Speaker 2: than it was when I left, and I'm going to 31 00:01:31,440 --> 00:01:33,520 Speaker 2: try to do what I can to address those issues, whether 32 00:01:33,560 --> 00:01:36,880 Speaker 2: it's Ukraine, China, Israel, and the Middle East. So no 33 00:01:37,000 --> 00:01:39,000 Speaker 2: small amount of challenges at all.
34 00:01:39,360 --> 00:01:43,080 Speaker 1: And he's been quite hawkish in terms of indicating 35 00:01:43,080 --> 00:01:45,520 Speaker 1: that he would take a hard line on Iran and 36 00:01:45,800 --> 00:01:49,640 Speaker 1: China, for example. And the security risks are so complex. 37 00:01:49,880 --> 00:01:53,440 Speaker 1: In his last term, there were worries about President Trump 38 00:01:53,440 --> 00:01:55,520 Speaker 1: at that point wanting to put in place his supporters 39 00:01:56,000 --> 00:01:57,920 Speaker 1: at the top of the CIA, for example, and other 40 00:01:57,960 --> 00:02:03,160 Speaker 1: intelligence agencies. Do you worry about the politicization of 41 00:02:03,320 --> 00:02:04,960 Speaker 1: components, perhaps, of these agencies? 42 00:02:05,000 --> 00:02:07,640 Speaker 2: Look, I was one of those individuals. I'm in the 43 00:02:07,720 --> 00:02:10,160 Speaker 2: Mueller report because twice he asked me to do something 44 00:02:10,160 --> 00:02:11,600 Speaker 2: and I told him I wasn't going to do that 45 00:02:11,639 --> 00:02:13,640 Speaker 2: because I didn't think it was appropriate. I told him why, 46 00:02:13,720 --> 00:02:16,040 Speaker 2: and to his credit, he said, okay, Mike, I understand. 47 00:02:16,280 --> 00:02:18,519 Speaker 2: So my view is if everyone does their job, we're 48 00:02:18,520 --> 00:02:21,160 Speaker 2: going to be fine. We have a series of protections, 49 00:02:21,440 --> 00:02:24,440 Speaker 2: checks and balances in the US system, and trying to 50 00:02:24,440 --> 00:02:27,400 Speaker 2: make fundamental change is not impossible, but it's really difficult. 51 00:02:27,720 --> 00:02:27,880 Speaker 2: Now, 52 00:02:27,919 --> 00:02:30,959 Speaker 2: clearly, I think when President Trump 53 00:02:31,040 --> 00:02:33,960 Speaker 2: starts this term on the twentieth of January twenty 54 00:02:34,000 --> 00:02:36,800 Speaker 2: twenty-five, number one, he'll have already been president 55 00:02:36,840 --> 00:02:39,079 Speaker 2: for four years, so he's got a steep learning curve 56 00:02:39,120 --> 00:02:42,000 Speaker 2: behind him. He'll be very focused on, I've got experience here, 57 00:02:42,000 --> 00:02:44,480 Speaker 2: I know exactly what I want to do. Secondly, he 58 00:02:44,560 --> 00:02:49,280 Speaker 2: truly believes, in his mind, this idea that parts 59 00:02:49,280 --> 00:02:52,160 Speaker 2: of the government are resisting him, that they don't want to 60 00:02:52,919 --> 00:02:55,600 Speaker 2: follow his direction. I always tried to tell him, sir, 61 00:02:55,720 --> 00:02:58,960 Speaker 2: you know, we're all professionals. We take an 62 00:02:59,000 --> 00:03:02,359 Speaker 2: oath to the Constitution, and we serve professionally whoever is 63 00:03:02,400 --> 00:03:05,960 Speaker 2: the duly elected leader of the United States. But I 64 00:03:05,960 --> 00:03:08,399 Speaker 2: think he'll be inclined to make sure that his supporters 65 00:03:08,480 --> 00:03:11,960 Speaker 2: are spread across a much greater spectrum of leadership within 66 00:03:12,000 --> 00:03:14,600 Speaker 2: the government, not just at the cabinet level, but below 67 00:03:14,639 --> 00:03:16,679 Speaker 2: that as well. That will be a difference, I think, 68 00:03:16,720 --> 00:03:18,360 Speaker 2: from the first time, where he just didn't have the 69 00:03:18,440 --> 00:03:20,799 Speaker 2: time really to do that as much.
I think he'll 70 00:03:20,840 --> 00:03:23,520 Speaker 2: start out that way; that'll be important to him this time around. 71 00:03:25,040 --> 00:03:27,240 Speaker 1: What was it like working for him? Because you've spoken 72 00:03:27,240 --> 00:03:29,679 Speaker 1: in the past about this idea of access and that 73 00:03:29,840 --> 00:03:32,800 Speaker 1: being a bit of a risk to President Trump in 74 00:03:32,800 --> 00:03:35,560 Speaker 1: the sense that, you know, he wasn't a man for 75 00:03:35,680 --> 00:03:38,240 Speaker 1: consensus or necessarily sort of research. But he liked to 76 00:03:38,240 --> 00:03:41,760 Speaker 1: have these conversations, I know, and it often depended on 77 00:03:41,800 --> 00:03:43,440 Speaker 1: the last person that had access to him. 78 00:03:43,560 --> 00:03:46,040 Speaker 2: So to his credit, he would often say to you, look, 79 00:03:46,040 --> 00:03:48,560 Speaker 2: I'm a business individual, I'm not a politician, so I'm 80 00:03:48,560 --> 00:03:50,640 Speaker 2: not interested in the twenty-five years of history on 81 00:03:50,680 --> 00:03:53,560 Speaker 2: this issue. What I want to understand is, what's the problem, 82 00:03:53,640 --> 00:03:55,400 Speaker 2: why are we here today, and what do you want 83 00:03:55,400 --> 00:03:59,320 Speaker 2: from me? What is the recommendation you're making? Now, part 84 00:03:59,320 --> 00:04:01,840 Speaker 2: of the challenge with him was he tended to be 85 00:04:01,880 --> 00:04:04,040 Speaker 2: a person who set great store by the last 86 00:04:04,040 --> 00:04:05,960 Speaker 2: person he spoke to. So it was interesting to watch the 87 00:04:06,000 --> 00:04:08,240 Speaker 2: people around him try to make sure they were the 88 00:04:08,320 --> 00:04:10,800 Speaker 2: last person that he spoke to. Others have commented on this; 89 00:04:10,880 --> 00:04:13,760 Speaker 2: it's not unique to Mike Rogers. That was my impression having 90 00:04:13,960 --> 00:04:17,120 Speaker 2: worked with him for two and a half years. On 91 00:04:17,160 --> 00:04:19,960 Speaker 2: the other hand, I do believe that, you know, there'll 92 00:04:20,000 --> 00:04:21,960 Speaker 2: be a system in place around him so that he'll be 93 00:04:22,000 --> 00:04:24,080 Speaker 2: able to hear a broad range of views. Now, which 94 00:04:24,080 --> 00:04:26,000 Speaker 2: he decides he wants to pay the most attention to, 95 00:04:26,400 --> 00:04:28,520 Speaker 2: look, as with any president, that's going to be up 96 00:04:28,960 --> 00:04:31,520 Speaker 2: to him. So, you know, the other thing I think 97 00:04:31,520 --> 00:04:35,200 Speaker 2: we need to remember is, number one, he thought being 98 00:04:35,279 --> 00:04:38,120 Speaker 2: unpredictable was a great strength, was an advantage. He actually 99 00:04:38,240 --> 00:04:40,960 Speaker 2: liked being perceived as unpredictable, so we've got to remember that. 100 00:04:41,360 --> 00:04:44,359 Speaker 2: And secondly, he's an individual who liked generating a response. 101 00:04:44,440 --> 00:04:47,200 Speaker 2: Many politicians, in my experience, I'm not a political individual, 102 00:04:47,200 --> 00:04:50,080 Speaker 2: but having worked with many of the leaders in my government, 103 00:04:50,839 --> 00:04:53,080 Speaker 2: their view is, hey, look, let's not do things or 104 00:04:53,400 --> 00:04:55,760 Speaker 2: say things that are going to spark turbulence, that are going 105 00:04:55,839 --> 00:04:58,800 Speaker 2: to generate a lot of angst.
He's very different in 106 00:04:58,800 --> 00:05:01,720 Speaker 2: that regard. His view is, hey, I like generating a response. 107 00:05:02,160 --> 00:05:03,560 Speaker 2: He almost enjoyed it, 108 00:05:03,640 --> 00:05:04,280 Speaker 2: I thought; that was my 109 00:05:04,320 --> 00:05:06,960 Speaker 2: perception. And so I tell people, look, you're gonna 110 00:05:07,040 --> 00:05:08,719 Speaker 2: have to get used to the idea that what he 111 00:05:09,000 --> 00:05:12,400 Speaker 2: says and what he ultimately chooses to do may not 112 00:05:12,480 --> 00:05:13,520 Speaker 2: be the same thing. 113 00:05:14,120 --> 00:05:18,159 Speaker 1: Which would be comforting to some people, right? Do you 114 00:05:18,200 --> 00:05:21,120 Speaker 1: see the risks to national security as being the same, 115 00:05:21,520 --> 00:05:24,520 Speaker 1: or perhaps exacerbated, compared with four years ago? You've spoken, 116 00:05:24,520 --> 00:05:27,880 Speaker 1: of course, about misinformation and what's happening online. What are 117 00:05:27,920 --> 00:05:29,200 Speaker 1: the biggest sort of risks for you? 118 00:05:29,440 --> 00:05:32,200 Speaker 2: The security environment clearly has gotten worse, or tougher, on 119 00:05:32,279 --> 00:05:34,960 Speaker 2: a global basis. I mean, we're now dealing with conflict 120 00:05:34,960 --> 00:05:38,640 Speaker 2: in the Middle East in multiple dimensions, from activity directed 121 00:05:38,680 --> 00:05:41,880 Speaker 2: against economic things like shipping in the Red Sea, to 122 00:05:41,920 --> 00:05:45,880 Speaker 2: Iran and Israel directly exchanging attacks against each other, 123 00:05:45,880 --> 00:05:49,720 Speaker 2: as well as the ongoing conflicts in Gaza and Lebanon, 124 00:05:50,040 --> 00:05:51,760 Speaker 2: none of which to me appear as if they're going to 125 00:05:51,920 --> 00:05:54,360 Speaker 2: end any time in the immediate future, like the next few 126 00:05:54,360 --> 00:05:57,960 Speaker 2: weeks or months. So the world's got a lot more contentious, 127 00:05:57,960 --> 00:06:01,440 Speaker 2: a lot more complicated, with technology and information playing an ever 128 00:06:01,520 --> 00:06:03,960 Speaker 2: greater role. And you're watching how more and more nations, 129 00:06:04,080 --> 00:06:09,800 Speaker 2: China, Russia, Iran, are using the power of disinformation, of misinformation, 130 00:06:10,240 --> 00:06:13,440 Speaker 2: deepfakes, the role of technology to make us believe 131 00:06:13,480 --> 00:06:15,520 Speaker 2: that what we are seeing, what we are reading, is 132 00:06:15,600 --> 00:06:18,480 Speaker 2: real or accurate, when in fact it's not. It is 133 00:06:18,680 --> 00:06:22,120 Speaker 2: created out of, you know, it's not real. It's some 134 00:06:22,160 --> 00:06:24,800 Speaker 2: person's attempt to technically try to convince us it's real. 135 00:06:25,320 --> 00:06:27,400 Speaker 2: That information dynamic, I think, is one of our 136 00:06:27,400 --> 00:06:31,800 Speaker 2: biggest challenges, particularly in free democratic societies, where the ability 137 00:06:31,800 --> 00:06:34,159 Speaker 2: to express our opinion is so foundational, and the idea 138 00:06:34,200 --> 00:06:37,800 Speaker 2: that, well, the government doesn't censor; we allow people to read, view, 139 00:06:38,080 --> 00:06:41,480 Speaker 2: consider, and they will make their choice.
That becomes increasingly 140 00:06:41,560 --> 00:06:45,920 Speaker 2: difficult when we've got nation-states and individuals, you know, 141 00:06:45,960 --> 00:06:48,800 Speaker 2: trying to use those technologies and those freedoms against us. 142 00:06:48,839 --> 00:06:52,600 Speaker 1: And is deregulation, which is what we're looking 143 00:06:52,640 --> 00:06:55,600 Speaker 1: at with a second Trump administration, going to potentially make 144 00:06:55,640 --> 00:06:56,720 Speaker 1: that worse? I don't know. 145 00:06:56,760 --> 00:06:59,720 Speaker 2: I think it varies by the specific situation we're dealing with. 146 00:07:00,000 --> 00:07:03,240 Speaker 2: And also, before we start making blanket assumptions about 147 00:07:03,240 --> 00:07:04,480 Speaker 2: he's going to do this, he's going to do that, 148 00:07:04,560 --> 00:07:06,640 Speaker 2: let's wait and see. I'm the first to admit 149 00:07:06,640 --> 00:07:09,279 Speaker 2: I don't know what the answer is, but I just caution 150 00:07:09,440 --> 00:07:12,320 Speaker 2: us against making a lot of blanket assumptions this early. 151 00:07:13,080 --> 00:07:15,800 Speaker 1: You're here in Sydney, and I do wonder, to kind 152 00:07:15,800 --> 00:07:17,800 Speaker 1: of end this conversation, what do you see as the risks 153 00:07:17,880 --> 00:07:21,560 Speaker 1: and the challenges for middle powers like Australia? 154 00:07:22,360 --> 00:07:24,960 Speaker 2: Well, first of all, we all, regardless of size, we 155 00:07:25,000 --> 00:07:27,800 Speaker 2: all live in this hyper-connected world. Our economies are 156 00:07:27,920 --> 00:07:32,560 Speaker 2: all intertwined, our relationships, our friendships. You look at AUKUS, 157 00:07:32,640 --> 00:07:34,800 Speaker 2: you look at the great history our two nations, the 158 00:07:34,880 --> 00:07:36,480 Speaker 2: United States and Australia, have together. 159 00:07:36,600 --> 00:07:37,840 Speaker 1: Yeah. 160 00:07:37,880 --> 00:07:40,200 Speaker 2: So the thing I always would say to my Australian 161 00:07:40,240 --> 00:07:42,800 Speaker 2: teammates is, look, you're a great nation with a great history. 162 00:07:42,880 --> 00:07:45,200 Speaker 2: One of the things that makes you great is you're a 163 00:07:45,240 --> 00:07:48,080 Speaker 2: democratic society that is willing to do hard things. I 164 00:07:48,160 --> 00:07:50,400 Speaker 2: just have great respect for countries like that. You know, 165 00:07:50,440 --> 00:07:52,000 Speaker 2: the self-image of Americans is 166 00:07:52,040 --> 00:07:52,480 Speaker 2: we're like that, 167 00:07:52,600 --> 00:07:55,280 Speaker 2: we're willing to do hard things. We're willing to do 168 00:07:55,320 --> 00:07:56,520 Speaker 2: the things that we think are right.