Chloe: Alright, and this da... This is the Daily OS. Oh, now it makes sense.

Zara: Good morning and welcome to the Daily OS. It's Wednesday, the ninth of October. I'm Zara.

Chloe: And I'm Chloe.

Zara: A new digital player is helping to shape public discourse online. How an audience feels about a brand, a narrative, a personality or a scandal is now being molded by the rise of bots.

Chloe: PR bots, to be specific. These bots can be programmed to target your social media algorithms and the content you do or don't get served. In today's deep dive, we'll explore how these digital armies can be mobilized to influence public opinion and even elections. But first, here's what's making headlines.

Zara: Federal Opposition Leader Peter Dutton has refused to support a government motion marking one year since Hamas's attack on Israel. In an address from Canberra yesterday, PM Anthony Albanese said that, quote, Parliament comes together again to unequivocally condemn Hamas's actions in Israel on October seven, which he said will always be a day of pain. The PM also acknowledged the quote devastating humanitarian crisis in Gaza and said that every innocent Israeli, Palestinian and Lebanese life, quote, matters. He called for an immediate ceasefire and the release of all hostages. Dutton refused to support that motion, which he said was quote supposed to be about October seven, about the loss of human life. Dutton accused Albanese of lacking quote respect for the Jewish community and of trying to please all people in this debate.

Chloe: Residents in the US state of Florida are on high alert as they brace for Hurricane Milton. Authorities have described the category five storm as extremely dangerous. Milton is expected to make landfall in the southeastern state over the next day, bringing life-threatening storm surge and damaging winds with it. The Federal Emergency Management Agency, or FEMA, has issued evacuation orders across parts of Florida, while heavy rain has already drenched parts of the state.
Authorities are also concerned about the risk of coastal flooding along the neighboring South Carolina and Georgia coasts. Emergency services are urging residents to protect themselves and take care of others.

Zara: Qantas has been ordered to pay one hundred million dollars in penalties following landmark legal action brought by the consumer watchdog in the Federal Court. The ACCC accused Qantas of misleading consumers by selling tickets on thousands of cancelled flights and failing to inform customers. In some cases, the ACCC found evidence of flights being offered for up to two months after Qantas had decided it would cancel the service. Earlier this year, Qantas agreed to pay around twenty million dollars to reimburse impacted customers. It's now been fined an additional one hundred million dollars after the court sided with the consumer watchdog.

Chloe: And for today's good news: ninety-nine-year-old veteran Glenn Dobby has been honored at a ceremony in South Australia, where he received his World War II service medal. Dobby was shot during service on the Solomon Islands as a nineteen-year-old at the end of World War II. The overdue medal was presented to the veteran in front of his son and a crowd of veterans in Adelaide last week.

Zara: Now, Chloe, you've become something of a bot expert in the last couple of weeks. But for anyone that hasn't spent as much time in the weeds of botmania as you have, can you just explain: what exactly is a bot?

Chloe: I have been in the weeds of the botmania. It's quite a place to be. And it can sound really techy when we talk about this stuff, but it's pretty much every time you jump on social media and you see some sort of spammy comment on a popular Instagram account. Just look at the Daily OS's account, you'll see plenty of them.

Zara: Excellent plug, excellent plug.

Chloe: You'll see them most often in the comment section within seconds of making a post on Instagram.
Zara: Yeah, and I feel like, you know, even if you just look at our Instagram, you're seeing things like "how coach Sarah changed my life". They've changed the name of Sarah to multiple different women, but there you go. There's some, like, semi-nude women that have come up in GIFs in our comment section.

Chloe: "Click on my bio, you'll get rich fast."

Zara: So those sorts of comments, you're saying, those are coming from bot accounts, right?

Chloe: Yep, so that is a bot account. But when we look at bots more broadly, bots are programmed to perform automated, repetitive tasks over a network. Okay, so they're deliberately designed to mimic human behavior. They look like humans, they sound like real users. Only bots can generate content at a speed and a scale that we humans simply couldn't. So that's thousands upon thousands of comments a day. There are some helpful bots, like ChatGPT is a chatbot, Siri, Alexa, but others, as we know, are less helpful.

Zara: They are less helpful, but they are absolutely everywhere. Like, it takes, you know, one second of looking, as you said, at a popular account and you just get flooded with this stuff. How significant, I mean, I'm just using anecdotal evidence here, so how significant is the presence of these bots in the online world?

Chloe: Yeah, it's wild. And to give you a sense of the scale of things, nearly half of all online traffic in twenty twenty three came from fake users.

Zara: That is crazy.

Chloe: So bots, yeah. Now, that's according to a new report by IT security firm Imperva, which also found that bad bots, programmed to defraud and scam users, accounted for nearly one third of all of that traffic.

Zara: Wow. I mean, I knew they were everywhere, but that's really they're everywhere.

Chloe: They're everywhere.
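To make that "speed and scale" point concrete, here is a minimal, purely illustrative sketch: a toy model of how even a small network of automated accounts outpaces human posters, with the report's headline shares applied to an arbitrary day of traffic. The account counts, posting rates and traffic total are assumptions invented for the example, not figures from Imperva.

```python
# Toy model only: invented numbers, no real platform API involved.

BOT_ACCOUNTS = 500          # assumed size of a small bot network
BOT_POSTS_PER_HOUR = 60     # one templated comment a minute, per account
HUMAN_POSTS_PER_DAY = 5     # a generous estimate for an active human

bot_daily_volume = BOT_ACCOUNTS * BOT_POSTS_PER_HOUR * 24
print(f"Bot network: {bot_daily_volume:,} comments/day")   # 720,000
print(f"One human:   {HUMAN_POSTS_PER_DAY} comments/day")

# The shares quoted in the episode ("nearly half" automated, "nearly one
# third" bad bots), applied to a made-up day of one million requests.
TOTAL_REQUESTS = 1_000_000
bot_share, bad_bot_share = 0.496, 0.32
print(f"Bot traffic:     {int(TOTAL_REQUESTS * bot_share):,} requests")
print(f"Bad-bot traffic: {int(TOTAL_REQUESTS * bad_bot_share):,} requests")
```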
Zara: And so when we're thinking about bots, you know, I think that anyone that has spent time on the Internet knows not to click on, you know, "come earn money with me with your crypto dollars" in our comment section. But what are some of the other ways the bots show up, or the other ways that they can, I guess, influence behaviors or conversations?

Chloe: So that's a whole different podcast, about how people are scammed and defrauded into clicking on things that they shouldn't. But what I was really interested in was the way bots can be used to make an opinion seem like a fact, or to make it feel and seem as though it has either widespread support or widespread opposition. What does that do for real users online? Because if you can gear thousands upon thousands of accounts to push a certain narrative, that can be really dangerous if we think about elections or democracy or just public discourse at large.

Zara: It's a really interesting topic, but it does still seem quite up there, and I would love to bring it down to down here. Is there an example that you can just provide, I guess, to give some orientation as to what you're actually talking about here?

Chloe: So let's just say you're scrolling on X and you read a bunch of different tweets, retweets, shares and comments of those tweets saying apples are really, really bad for you and instead you should be buying oranges. This is really basic, but just to wrap our heads around it: you're probably going to start feeling a little bit suss about apples, even though you might really like them. You just sort of have a few questions about that. You might even consider buying more oranges. And bots can create these spaces that feel like communities are sharing ideas, that it's just normal people talking about how we all don't really like apples, but there is an agenda at play. Now consider the influence that, as I said, this could have on politics or elections.

Zara: Not just apples and oranges.

Chloe: Not just apples and oranges. It can be really scary.
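As a rough illustration of how a coordinated minority can manufacture apparent consensus, here is a toy simulation, with all numbers invented for the example: genuine users are split roughly fifty-fifty on apples, but a bot network repeating one anti-apple line flips what a casual scroller sees.

```python
import random

random.seed(42)

# Invented numbers for illustration only.
GENUINE_USERS = 1_000   # each posts once, opinion split ~50/50
BOTS = 300              # each posts the same line 20 times
POSTS_PER_BOT = 20

feed = []
for _ in range(GENUINE_USERS):
    feed.append("anti-apple" if random.random() < 0.5 else "pro-apple")
for _ in range(BOTS * POSTS_PER_BOT):
    feed.append("anti-apple")  # the coordinated narrative

anti = feed.count("anti-apple") / len(feed)
print(f"Share of anti-apple posts a scroller sees: {anti:.0%}")
# ~93%: a 50/50 split among real people reads as near-unanimous opposition.
```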
Zara: What do the experts say about bots? Because, you know, they are a fairly new phenomenon, but there must be a body of research out there.

Chloe: I reached out to a bot doc, of course. A bot doc has a doc of bots. She does have a real name. Her name is Dr Sophia Melanson Ricciardone, from Canada's McMaster University. Now, she did a PhD on botaganda: when bot armies generate mass content to saturate social media feeds and then manipulate audiences. So Dr Melanson Ricciardone told me about something called hashtag flooding, which is essentially a tweet containing nothing but popularized keywords and catchphrases in the form of hashtags.

Zara: Hashtag after hashtag after hashtag. So annoying.

Chloe: So annoying. Now, this tactic, as well as the rapid re-sharing of human posts, so twenty-five thousand in ten minutes, can create, like we were saying before, this illusion of widespread support or widespread opposition for specific viewpoints. She says that this happens as though the idea embedded in the tweet came from grassroots popularization. So when we're talking about mimicking human behavior, it's this feeling of "oh, everyone thinks this."

Zara: And you want to be a part of the everyone. Like, that's the human condition, right?

Chloe: Exactly. It's the concept of herd mentality, which is the idea that individuals naturally want to conform to the dominant view of the community.
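For a sense of how these two signals could be spotted in practice, here is a minimal sketch of the heuristics suggested by the description above: a hashtag-density test and a burst-reshare test. The thresholds are assumptions chosen for illustration, not anyone's published detection method.

```python
def looks_like_hashtag_flood(text: str, max_ratio: float = 0.8) -> bool:
    """Flag posts that are essentially nothing but hashtags.

    max_ratio is an invented threshold: if 80%+ of the tokens are
    hashtags, treat the post as a likely flood.
    """
    tokens = text.split()
    if not tokens:
        return False
    hashtags = sum(1 for t in tokens if t.startswith("#"))
    return hashtags / len(tokens) >= max_ratio


def looks_like_burst(reshares: int, window_minutes: int) -> bool:
    """Flag re-sharing far faster than organic spread.

    The episode's example, 25,000 reshares in 10 minutes, works out
    to 2,500 a minute; flag anything near that order of magnitude.
    """
    return reshares / window_minutes >= 1_000  # invented cut-off


print(looks_like_hashtag_flood("#JusticeServed #Truth #WakeUp #Finally"))  # True
print(looks_like_hashtag_flood("Interesting thread on the trial today"))   # False
print(looks_like_burst(reshares=25_000, window_minutes=10))                # True
```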
176 00:09:11,120 --> 00:09:13,880 Speaker 1: I don't know if many people will remember what the 177 00:09:13,880 --> 00:09:16,800 Speaker 1: internet looked like in twenty twenty two during the Johnny 178 00:09:16,800 --> 00:09:19,880 Speaker 1: Depp Amber Herd defamation trial, but I recently listened to 179 00:09:19,960 --> 00:09:23,040 Speaker 1: a six part podcast that brought me right back there. 180 00:09:23,520 --> 00:09:26,559 Speaker 1: So for people who need a refresher on the story, Essentially, 181 00:09:26,640 --> 00:09:29,920 Speaker 1: Johnny Depp filed a lawsuit against his ex wife amber 182 00:09:29,920 --> 00:09:33,320 Speaker 1: Heard over an opinion article Herd wrote for The Washington Post. 183 00:09:34,080 --> 00:09:38,040 Speaker 1: She alleged she had experienced domestic abuse. Her didn't name Depp, 184 00:09:38,120 --> 00:09:41,640 Speaker 1: but he launched defamation proceedings against her, arguing he was 185 00:09:41,720 --> 00:09:45,240 Speaker 1: identifiable from the article and depth denies claims that he 186 00:09:45,280 --> 00:09:49,720 Speaker 1: physically abused her. Now, a jury in Fairfax County, Virginia 187 00:09:49,880 --> 00:09:53,439 Speaker 1: ultimately sided with Depp, and Herd was sued for defaming him. 188 00:09:53,480 --> 00:09:55,719 Speaker 1: But I think you might remember that the court proceedings 189 00:09:55,720 --> 00:09:57,040 Speaker 1: were streamed online. 190 00:09:57,240 --> 00:09:59,720 Speaker 2: Yeah, so I was going to say, can you, in 191 00:09:59,760 --> 00:10:02,800 Speaker 2: the of our listeners connect what we've just been talking about, 192 00:10:02,800 --> 00:10:04,959 Speaker 2: which is, you know, all of this stuff about bots 193 00:10:05,160 --> 00:10:07,400 Speaker 2: and this case, what's the connection here? 194 00:10:07,840 --> 00:10:10,840 Speaker 1: So, as the case played out in the courtroom, the 195 00:10:10,920 --> 00:10:15,600 Speaker 1: internet mounted its own unofficial trial of Amber Heard. This 196 00:10:15,679 --> 00:10:17,920 Speaker 1: is the premise of the podcast I was talking about earlier, 197 00:10:17,960 --> 00:10:22,200 Speaker 1: Who Trolled Amber? It's by UK media organization Tortoise. The 198 00:10:22,280 --> 00:10:25,280 Speaker 1: podcast found that a large part of the online hate 199 00:10:25,320 --> 00:10:31,000 Speaker 1: campaign against Herd was actually manufactured and executed by bot accounts. 200 00:10:31,240 --> 00:10:34,000 Speaker 1: Now I spoke to Xavier Greenwood. He produced the Who 201 00:10:34,040 --> 00:10:35,439 Speaker 1: Trolled Amber podcast. 202 00:10:35,679 --> 00:10:38,800 Speaker 3: Some of the main themes were that Amberherd deserves prison, 203 00:10:38,920 --> 00:10:41,880 Speaker 3: Amber Heard as a gold digger, Amberhard as a fraud, 204 00:10:41,880 --> 00:10:44,120 Speaker 3: Amber Heard as a liar. To give you a sense 205 00:10:44,160 --> 00:10:47,160 Speaker 3: of the scale of things, there was a hashtag justice 206 00:10:47,160 --> 00:10:51,160 Speaker 3: for Johnny Depp that was viewed fifteen point seven billion times. 207 00:10:51,360 --> 00:10:54,120 Speaker 3: Hashtag just as for Amberherd viewed a fraction of that. 208 00:10:54,559 --> 00:10:57,839 Speaker 3: And my friends were suddenly saying things that didn't really 209 00:10:57,880 --> 00:11:00,480 Speaker 3: sound like them. They were saying, well, yeah, what if 210 00:11:00,559 --> 00:11:02,959 Speaker 3: this time she was the abuser? 
And these were people who, typically, in the Me Too movement, would be a bit more cynical or maybe a bit more reserved in making a judgment like that. So, yeah, from the beginning we sort of saw that this was quite suspicious.

Zara: What else did this investigation uncover?

Chloe: So the team at Tortoise brought together a database of over one million anti-Amber Heard tweets, and they found that more than half of them were inauthentic. So what that means is either they were posted from spam accounts with three followers that were built in the last two months, or they were amplified in an inauthentic way, so reshared and reshared thousands upon thousands of times.
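As a loose sketch of what a check like that could look like in code, and not Tortoise's actual methodology, here is a classifier built directly from the two criteria just described. The sixty-day window and the reshare cut-off are assumptions standing in for "the last two months" and "thousands upon thousands".

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Tweet:
    author_followers: int
    author_created: date   # when the account was made
    posted: date
    reshares: int

def is_inauthentic(t: Tweet,
                   max_followers: int = 3,
                   max_account_age_days: int = 60,      # "last two months"
                   reshare_threshold: int = 2_000) -> bool:  # "thousands"
    """Mirror the two signals described in the episode:
    a throwaway spam account, or implausibly heavy amplification."""
    spammy_account = (
        t.author_followers <= max_followers
        and (t.posted - t.author_created) <= timedelta(days=max_account_age_days)
    )
    mass_amplified = t.reshares >= reshare_threshold
    return spammy_account or mass_amplified

# A three-follower account created weeks before posting: flagged.
t = Tweet(author_followers=3,
          author_created=date(2022, 4, 1),
          posted=date(2022, 5, 10),
          reshares=12)
print(is_inauthentic(t))  # True
```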
Zara: Okay. I think the thing that I wonder when I hear about this story is, like, who is behind this trolling campaign? It's very clear who benefits from it, but who's behind it?

Chloe: We don't know. According to the investigation, the team could hypothesize multiple different scenarios, and there likely were multiple different agendas at play.

Zara: It's crazy.

Chloe: So this could be genuine Johnny Depp fans, who can acquire bots for pretty cheap online, pushing support for their favorite actor. In other scenarios, Xavier floats what he calls a more abstract theory.

Xavier Greenwood: I think it's sort of fairly widely known now that authoritarian regimes who want to sow discord or want to cause confusion in the West, that they seek out wedge issues. So they seek out issues which divide Britons, divide Americans, divide Australians. They try to sort of drive a deeper wedge.

Chloe: But something that really stuck with me from my conversation with Xavier was a point he made about the future of political discourse on social media.

Xavier Greenwood: If you can attack a celebrity who has an enormous amount of resources, as Amber Heard did, what's to stop someone doing the same thing when it comes to attacking a politician, or when it comes to trying to sway an election?

Zara: I mean, we're talking there about elections, and we of course know that there is one coming up just around the corner in the US. Are we expecting to see this sort of interference from bots in the US election this year?

Chloe: We already have. Last month, the Biden administration charged Russian media executives over an alleged targeted online campaign to influence voters in the US and push hidden Russian government messaging. Now, it comes after US officials seized thirty-two internet domain names that were covertly targeting specific demographics on social media, promoting AI-generated false narratives and pushing them to those groups.

Zara: Okay, so there are already allegations that we've seen this at play when it comes to the US election. I mean, I think that a natural endpoint here is that if we're talking about the fact that even the most powerful institutions, people, whatever, in the world can be susceptible to this kind of botaganda, as we'll call it, what is, like, an average Joe like you or I meant to do when it comes to protecting yourself against something like this?

Chloe: Well, that was what came back in so many of the conversations I was having. For the average Joe like you and me, a lot of it comes down to awareness: staying vigilant, staying curious, questioning what you're getting served and who's pushing it. And that's generally for anything that you see online, but particularly when you're thinking something looks a bit spammy, it likely might be. It's important to be critical about the content you're seeing before you form an opinion, especially ahead of an election. Now, as for what comes next, Xavier Greenwood says we're largely in uncharted waters.
Xavier Greenwood: What we saw with the Amber Heard trial we may well continue to see, in an even more intense way, in the future. To some extent, the genie is out of the bottle.

Zara: Well, Chloe, I am absolutely terrified about the state of the internet. So thank you for enlightening me on that, and thanks for jumping on today's pod. And thank you for listening to this episode of The Daily OS. If you're listening on Spotify or Apple, we would love it if you could hit follow, so it sends a signal to the platforms that you love what we're doing. And if you're watching us on YouTube, you can hit subscribe there and the same thing happens. We'll be back again tomorrow, but until then, have a fabulous day.

Lily Madden: My name is Lily Madden and I'm a proud Arrernte Bundjalung Kalkadoon woman from Gadigal Country. The Daily OS acknowledges that this podcast is recorded on the lands of the Gadigal people and pays respect to all Aboriginal and Torres Strait Islander peoples and nations. We pay our respects to the first peoples of these countries, both past and present.