Speaker 1: As you know, and we spoke about this yesterday morning on the program, it is a rise in our terror alert level the Prime Minister announced yesterday, from possible to probable. And to have a look at some of that, in terms of, I suppose, primarily the way terror messages are delivered to people susceptible to becoming indoctrinated and then carrying out an attack, Doctor Levi West is a research fellow at the Australian National University and he joins me now. Levi, good morning.

Speaker 2: Good morning, mate, thank you for your time.

Speaker 1: So we've seen in the past, particularly with Islamic State, the indoctrination that occurred through social media and things like Twitter. I don't know if Twitter is still in favor amongst the terrorist community, but no doubt social media is.

Speaker 2: Yeah, social media and the broader online environment play a reasonably substantial role in the radicalization process, and have sort of changed what's broadly understood as the accessibility of extremist content and propaganda. That's changed the dynamics of the relationship between an audience and someone who's producing extremist propaganda.
Speaker 1: All right, so what do we need to know and do? And I see people like Julie Inman Grant, the eSafety Commissioner, calling on social media giants to change algorithms so as not to have all the information come up if somebody was to look up or stop at a post on, for instance, terrorism, and then have their inbox flooded with that sort of material.

Speaker 2: Yeah. So, I mean, I guess there are multiple parts to that, and one is that some tech platforms do a better job than others of trying to regulate the kind of content that is accessible on their platforms. For instance, when Islamic State was exploiting Twitter incredibly effectively, Twitter, then under its previous leadership, was relatively cooperative with government and made substantial efforts to take measures to at least reduce the kind of extremist Islamic State content that was available on Twitter. And that was reasonably effective, and collectively, across a bundle of platforms and across Five Eyes governments, we made a pretty solid dent in the accessibility of that content.
Speaker 2: The challenge is that not all platforms are cooperative, or rather platforms vary in their levels of cooperation, and terrorist organizations and terrorist movements tend to switch and adapt and evolve in response to the measures that are taken. So Twitter isn't really the predominant platform for accessing material anymore, in part because of measures that Twitter took, and it's evolved to move on to more alternative platforms like Telegram, which people might be familiar with. So it's a constant game of cat and mouse.

Speaker 1: Yeah, indeed, and some of them around, like, I'd never heard of Gab until today, but that's a platform, a messaging platform, and there are others as well. And perhaps many people haven't heard of Telegram, because that's relatively new in the scheme of things.

Speaker 2: Yeah, reasonably, unless you're highly online. Gab was established partly in response to efforts by both Facebook and Twitter to address the extreme right-wing content that was happening, particularly in the United States.
Speaker 2: Gab looks a lot like Twitter and interacts a lot like Twitter, but has an incredibly liberal approach to free speech, so it doesn't really police its content in any way, shape or form, and predominantly wound up as a home for extreme right-wing content, like neo-Nazi material and white supremacist material and that sort of stuff. Telegram is popular amongst a range of different types of terrorist organizations and terrorist movements. It's not as effective as Twitter was; Twitter's public-facing, open accessibility was particularly valuable for reaching broader audiences. You know, the propaganda exercise of a terrorist organization is always sort of multifaceted, in that it's both seeking to reach an audience that it seeks to radicalize, but it's also seeking to communicate with a broader audience, like the broader public, not directly, but to get content into the media cycle and have media outlets discuss terrorist movements and what they believe in and what they're after, as a form of promotion and advertising, essentially. And it puts media organizations in a bind, whereby things happen that are genuinely newsworthy, but in the process they wind up discussing and promoting and advertising terrorist organizations. Which is a perpetual dilemma that dates back to the late eighteen hundreds, when terrorists were exploiting the emerging print media. So that's a perpetual problem.

Speaker 1: It is. I mean, do you talk about it or do you not? And what's the right thing to do in that situation, I suppose, is what it comes down to.

Speaker 2: Yeah, I mean, I think one of the probably most important messages that came out of the press conference yesterday morning with the Prime Minister and the Director-General of Security was about the importance of language and moderation. And I think they identified, for not just politicians but media and commentators more generally, the value of moderating language. And that doesn't mean not having views. It means not talking about the other side of politics in the same way that you would talk about an enemy in a war context.
Speaker 2: And that when reporting on these kinds of matters, terrorist types of matters, keeping things reasonably calm and sensible, it's not the tendency of media outlets, it's not the tendency of politicians, to do that. But I think if we accept the heightened and sensitive security environment as a significant component of the dynamics of politics, then it becomes sort of incumbent on those of us who seek to be responsible to make sure that we're talking about these things in a proportionate and sensible way.

Speaker 1: Well, the Prime Minister made a point yesterday, and it's on the front page of The Australian as a headline today: "Greens' divisive rhetoric fueling domestic terror threat." And the first paragraph of that article reads: Anthony Albanese has accused the Greens of fueling community divisions that have prompted the government to raise the terrorism threat level from possible to probable, saying the party's support for the long-running protests outside MPs' offices is undermining social cohesion. Do you think he's right?

Speaker 2: I think that's a take on the comments the Prime Minister actually made in the press conference that I watched yesterday.
Speaker 2: That's probably guilty of the things that we were just talking about. That's not a particularly calm or moderate or sensible interpretation of the point that the Prime Minister was making, which was that yes, there are politicians, and I would argue that they sit on both sides of the political divide, who are happy to sort of score political points on issues that can potentially be particularly divisive or sensitive within politics. I think suggesting that the Prime Minister was accusing the Greens of inciting terrorism is a grossly irresponsible way to report on the comments that he made and, as I just said, is literally guilty of the things that both the Prime Minister and the Director-General of Security pointed out yesterday, which was that being hyperbolic about these things doesn't help anyone. I mean, it might get a few extra reads of the story and sell a few more hard copies of the newspaper, but it's not a particularly socially responsible thing to do.
Speaker 2: The difficult part for any media enterprise, and I have a degree of sympathy with this, is that they are first and foremost business enterprises. And I think everyone understands that, and everyone understands that they first and foremost exist as profit-generating enterprises. But for media outlets to maintain the kind of social license that they have to do what they do, there is a sort of a degree of responsibility that they're expected to exercise in the way that they report on things, particularly security matters. And a headline like that and a paragraph like that is not a particularly sensitive or accurate way to report what was actually said in the press conference.

Speaker 1: Probable now, from possible. What led to it falling? Because back in the Islamic State days, twenty fourteen, ten years ago of course, it went up to probable. And to me, nothing really has changed. I mean, if anything, the only calmness that came into the world was as a result of COVID and the lockdowns all over the place. So what caused it to fall? Because social media is still here, isn't it?
Speaker 1: It was around then, it's around now, so what made the difference?

Speaker 2: So the lifting of the threat level initially in twenty fourteen was almost in direct response to what was happening in relation to Islamic State, and by the time it was dropped again, the vast majority of the capacity and capability of Islamic State to inspire people to commit attacks in the Western world, which they were achieving quite effectively through twenty fourteen to sixteen, had largely been dealt with and eradicated. The Caliphate was on the back foot, if not entirely collapsed. Their media unit that was producing the kind of propaganda that was inspiring Westerners to commit lone-actor attacks in the West had largely been destroyed, along with a bundle of the key actors that were driving that. And so, as a result, that period that everyone will remember, where there were frequent lone-actor jihadist operations happening in the Western world on a regular basis, stabbings and vehicle rammings.
Speaker 2: The capacity of Islamic State to continue to do what it was doing, to make that happen, had largely been dealt with. And so, as a result, that particular problem set, while not eradicated, they still happened intermittently, but the tempo at which it was occurring had largely diminished. And so, as a result, it was possible to sort of look at the environment and say that the overall likelihood of an attack had diminished. The increase in the threat level, and I think this is the important thing too that's sort of gotten skipped in some of the commentary since yesterday's press conference, is that that threat level change is exclusively in relation to terrorism. So this isn't something to do with the war in Ukraine; in some ways it's less directly affected by that.
Speaker 2: Even yesterday they made a strong point of saying that they didn't lift the threat level after October seventh, after the Hamas attacks, and that this was not a threat level change in response to the Israeli response to October seven either; that that wasn't the driver of this, that it was happening across the board, across all of the various ideological motivations that impact the environment. They were increasing the threat level because things had changed across the board. So, you know, dropping it down back then was an assessment that the likelihood of a terrorist attack had diminished, and that was a reasonable assessment, certainly in relation to Islamic State. So, you know, we find ourselves in a different environment now.

Speaker 1: Absolutely. And just sort of to wind back to where we started, about social media: the attacks this year, all the ones that ASIO have stopped, and indeed the one that occurred at the Assyrian church, all teenage boys, all between around fourteen and nineteen, give or take. And I suppose it is this group that authorities fear being radicalized the most.
Speaker 2: Yeah, I mean, in the data that was in the detail that was provided yesterday, that seems to be the case, sort of fourteen to twenty-one, I think, was the range. You know, the reality is that terrorist organizations, in the broadest of senses, their pitch is targeted at, broadly speaking, fourteen- or sixteen- to twenty-five-year-old men. If you were pitching to recruit to the military or any other kind of enterprise that engages in violence, then that's the primary age cohort that you'd be going to pitch at. In addition to that, there's a whole bundle of completely conventional vulnerabilities that are part of being a teenager that are exploitable for an extremist organization. Teenage boys, and teenagers generally, are navigating all of the developmental processes that are about coming to realize or understand where you think you fit in the world, who you consider to be authority figures, what you believe to be true. All these are normal things.
Speaker 2: But if you're struggling to get those answers, as some kids do, then what an extremist organization can offer, or rather what people perceive it can offer, is a sense of belonging and a sense of identity and a sense of significance, and a whole bunch of these kinds of things which everyone is looking for. It's the kid who can't find that in a sporting team, cultural organization, church, whatever it might be, but still needs to satisfy that desire, that an extremist organization that's effective shapes its pitch and its propaganda and its claims to satisfy all those needs.

Speaker 1: Unreal. Yeah. Now, Doctor West, appreciate your time, thank you. Doctor Levi West there, great insights, ANU research fellow, on the raising of the terror alert from possible to probable yesterday, and social media one of the driving forces behind that. Julie Inman Grant, the eSafety Commissioner, calling on social media platforms to regulate their algorithms to make it more difficult. But of course, while some may have gone down that path, others aren't quite so active in doing that, and so the risk is there.
Speaker 1: The old mantra of "be alert, not alarmed" still stands, and that's certainly what the pitch is overall.