1 00:00:00,520 --> 00:00:04,080 Speaker 1: Alrighty, and this is the Daily... This is The Daily 2 00:00:04,160 --> 00:00:06,840 Speaker 1: Aus. Oh, now it makes sense. 3 00:00:14,840 --> 00:00:16,920 Speaker 2: Good morning and welcome to The Daily Aus. 4 00:00:17,200 --> 00:00:19,920 Speaker 3: I'm Zara and over the next little while, we're going 5 00:00:19,960 --> 00:00:23,079 Speaker 3: to be bringing you a bonus series featuring our favorite 6 00:00:23,120 --> 00:00:26,079 Speaker 3: deep dives from twenty twenty four. We've put together the 7 00:00:26,120 --> 00:00:28,800 Speaker 3: best deep dives to listen to on the beach, road 8 00:00:28,800 --> 00:00:30,600 Speaker 3: tripping when you don't want to talk to the person 9 00:00:30,680 --> 00:00:33,280 Speaker 3: next to you, or just reflecting on the year that was. 10 00:00:33,720 --> 00:00:40,559 Speaker 3: Welcome to TDA's Summer Series. Earlier this year, we did 11 00:00:40,560 --> 00:00:43,519 Speaker 3: a deep dive into the new digital player helping to 12 00:00:43,560 --> 00:00:48,000 Speaker 3: shape public discourse online. Now the TLDR is the bots 13 00:00:48,040 --> 00:00:51,760 Speaker 3: are targeting your social media algorithms and the content that 14 00:00:51,800 --> 00:00:55,520 Speaker 3: you get served online. In today's deep dive, Chloe explores 15 00:00:55,560 --> 00:00:59,560 Speaker 3: how these digital armies are being mobilized to influence public opinion. 16 00:01:00,080 --> 00:01:03,680 Speaker 3: It's a fascinating element of our online world. So without 17 00:01:03,680 --> 00:01:07,240 Speaker 3: further ado, let's get into the episode. Now, Chloe, you've 18 00:01:07,319 --> 00:01:08,920 Speaker 3: become something of a bot 19 00:01:08,720 --> 00:01:11,160 Speaker 2: expert in the last couple of weeks. 20 00:01:11,280 --> 00:01:13,680 Speaker 3: But for anyone that hasn't spent as much time in 21 00:01:13,720 --> 00:01:16,479 Speaker 3: the weeds of bot mania as you have, yes, can 22 00:01:16,520 --> 00:01:19,319 Speaker 3: you just explain what exactly is a bot? 23 00:01:19,600 --> 00:01:22,119 Speaker 1: I have been in the weeds of the bot mania. It's 24 00:01:22,280 --> 00:01:25,080 Speaker 1: quite a place to be, and it can sound really 25 00:01:25,120 --> 00:01:28,360 Speaker 1: techy when we talk about this stuff, but it's pretty 26 00:01:28,440 --> 00:01:30,560 Speaker 1: much every time you jump on social media and you 27 00:01:30,600 --> 00:01:34,400 Speaker 1: see somewhat of a spammy comment on a popular Instagram account. 28 00:01:34,760 --> 00:01:36,800 Speaker 1: Just look at The Daily Aus's account, you'll see plenty 29 00:01:36,800 --> 00:01:37,120 Speaker 1: of them. 30 00:01:37,280 --> 00:01:39,240 Speaker 2: Excellent plug, excellent plug. 31 00:01:39,840 --> 00:01:42,520 Speaker 1: You'll see them most often in the comment section within 32 00:01:42,560 --> 00:01:44,319 Speaker 1: seconds of making a post on Instagram. 33 00:01:44,560 --> 00:01:46,600 Speaker 3: Yeah, and I feel like, you know, even if you 34 00:01:46,640 --> 00:01:49,120 Speaker 3: just look at our Instagram, you're seeing things like how 35 00:01:49,160 --> 00:01:51,320 Speaker 3: coach Sarah changed my life. They changed the name of 36 00:01:51,360 --> 00:01:53,160 Speaker 3: Sarah to multiple different women. 37 00:01:53,240 --> 00:01:54,000 Speaker 2: But there you go.
38 00:01:54,440 --> 00:01:57,320 Speaker 3: There's some like semi-nude women that have come up 39 00:01:57,360 --> 00:01:59,120 Speaker 3: in GIFs in our comment section. 40 00:01:59,360 --> 00:02:01,800 Speaker 1: Click on my bio, you'll get rich fast. 41 00:02:02,160 --> 00:02:04,559 Speaker 3: So those sorts of comments you're saying, those are coming 42 00:02:04,560 --> 00:02:05,800 Speaker 3: from bot accounts, right? 43 00:02:05,800 --> 00:02:08,000 Speaker 1: Yep, so that is a bot account. But when we 44 00:02:08,040 --> 00:02:12,519 Speaker 1: look at bots more broadly, bots are programmed to perform automated, 45 00:02:12,800 --> 00:02:16,880 Speaker 1: repetitive tasks over a network. Okay, so they're deliberately designed 46 00:02:16,919 --> 00:02:20,000 Speaker 1: to mimic human behavior. They look like humans, they sound 47 00:02:20,080 --> 00:02:22,960 Speaker 1: like real users. Only, bots can generate content at a 48 00:02:23,040 --> 00:02:26,640 Speaker 1: speed and a scale that we humans simply couldn't. So 49 00:02:26,680 --> 00:02:30,440 Speaker 1: that's thousands upon thousands of comments a day. There are 50 00:02:30,440 --> 00:02:34,840 Speaker 1: some helpful bots, like ChatGPT is a chatbot, Siri, Alexa, 51 00:02:35,040 --> 00:02:37,160 Speaker 1: but others, as we know, are less helpful. 52 00:02:37,800 --> 00:02:41,560 Speaker 3: They are less helpful, but they are absolutely everywhere. Like 53 00:02:41,639 --> 00:02:44,919 Speaker 3: it takes, you know, one second of looking, as you said, 54 00:02:44,919 --> 00:02:47,200 Speaker 3: at a popular account and you just get flooded with 55 00:02:47,280 --> 00:02:50,560 Speaker 3: this stuff. How significant... I mean, I'm just using anecdotal 56 00:02:50,639 --> 00:02:53,560 Speaker 3: evidence here. So how significant is the presence of these 57 00:02:53,600 --> 00:02:54,840 Speaker 3: bots in the online world? 58 00:02:55,280 --> 00:02:57,280 Speaker 1: Yeah, it's wild. And to give you a sense of 59 00:02:57,320 --> 00:03:01,400 Speaker 1: the scale of things, nearly half of all online traffic 60 00:03:01,480 --> 00:03:04,760 Speaker 1: in twenty twenty three came from fake users. 61 00:03:05,000 --> 00:03:06,040 Speaker 2: That is crazy. 62 00:03:06,120 --> 00:03:08,640 Speaker 1: So bots, yeah. Now, that's according to a new report 63 00:03:08,680 --> 00:03:12,680 Speaker 1: by IT security firm Imperva, which also found that bad 64 00:03:12,760 --> 00:03:16,680 Speaker 1: bots programmed to defraud and scam users accounted for nearly 65 00:03:16,720 --> 00:03:18,640 Speaker 1: one third of all of that traffic. 66 00:03:18,919 --> 00:03:19,239 Speaker 4: Wow. 67 00:03:19,600 --> 00:03:21,760 Speaker 3: I mean, I knew they were everywhere, but that's really... 68 00:03:22,000 --> 00:03:22,760 Speaker 3: they're everywhere. 69 00:03:22,800 --> 00:03:23,720 Speaker 1: They're everywhere. 70 00:03:24,000 --> 00:03:26,320 Speaker 3: And so when we're thinking about bots, you know, I 71 00:03:26,320 --> 00:03:28,440 Speaker 3: think that anyone that has spent time on the internet 72 00:03:28,520 --> 00:03:31,919 Speaker 3: knows not to click on, you know, come earn money 73 00:03:31,919 --> 00:03:34,280 Speaker 3: with me with your crypto dollars, you know, in our 74 00:03:34,280 --> 00:03:37,600 Speaker 3: comment section.
But what are some of the other ways 75 00:03:37,640 --> 00:03:39,520 Speaker 3: the bots show up, or the other ways that they 76 00:03:39,520 --> 00:03:43,000 Speaker 3: can, I guess, influence behaviors or conversations? 77 00:03:43,280 --> 00:03:46,120 Speaker 1: Well, that's a whole different podcast about how people are 78 00:03:46,120 --> 00:03:49,080 Speaker 1: scammed and defrauded into clicking on things that they shouldn't. 79 00:03:49,520 --> 00:03:52,000 Speaker 1: But what I was really interested in was the way 80 00:03:52,120 --> 00:03:55,120 Speaker 1: bots can be used to make an opinion seem like 81 00:03:55,160 --> 00:03:57,440 Speaker 1: a fact, or to make it feel and seem as 82 00:03:57,440 --> 00:04:01,080 Speaker 1: though it has either widespread support or widespread opposition for something. 83 00:04:01,120 --> 00:04:04,400 Speaker 1: What does that do for real users online? Because if 84 00:04:04,400 --> 00:04:07,320 Speaker 1: you can gear thousands upon thousands of accounts to push 85 00:04:07,360 --> 00:04:10,120 Speaker 1: a certain narrative, that can be really dangerous if we 86 00:04:10,160 --> 00:04:14,680 Speaker 1: think about elections or democracy or just public discourse at large. 87 00:04:14,960 --> 00:04:18,279 Speaker 3: It's a really interesting topic, but it does still seem 88 00:04:18,480 --> 00:04:20,159 Speaker 3: quite up there, and I would love to bring it 89 00:04:20,200 --> 00:04:22,960 Speaker 3: down to down here. Is there an example that you 90 00:04:23,000 --> 00:04:25,560 Speaker 3: can just provide, I guess, to give some orientation as 91 00:04:25,560 --> 00:04:26,920 Speaker 3: to what you're actually talking about here? 92 00:04:27,240 --> 00:04:30,120 Speaker 1: So let's just say you're scrolling on X and you 93 00:04:30,200 --> 00:04:33,960 Speaker 1: read a bunch of different tweets, retweets, shares, comments of 94 00:04:33,960 --> 00:04:37,440 Speaker 1: those tweets saying apples are really, really bad for you 95 00:04:37,920 --> 00:04:41,559 Speaker 1: and instead you should be buying oranges. This is really basic, 96 00:04:41,600 --> 00:04:44,159 Speaker 1: but we're just still wrapping our heads around it. You're probably going 97 00:04:44,240 --> 00:04:46,840 Speaker 1: to start feeling a little bit suss about apples, even 98 00:04:46,880 --> 00:04:49,159 Speaker 1: though you might really like them. You just sort of 99 00:04:49,160 --> 00:04:51,279 Speaker 1: have a few questions about that. You might even 100 00:04:51,360 --> 00:04:55,560 Speaker 1: consider buying more oranges. And bots can create these spaces 101 00:04:55,560 --> 00:04:58,359 Speaker 1: that feel like communities sharing ideas, like it's just 102 00:04:58,440 --> 00:05:02,080 Speaker 1: normal people talking about how we all don't really like apples, 103 00:05:02,240 --> 00:05:04,719 Speaker 1: but there is an agenda at play. Now when you 104 00:05:04,760 --> 00:05:06,920 Speaker 1: consider the influence that, as I said, this could have 105 00:05:07,000 --> 00:05:09,480 Speaker 1: on politics or elections. 106 00:05:08,839 --> 00:05:10,560 Speaker 2: Not just apples and oranges. Not just 107 00:05:10,600 --> 00:05:12,919 Speaker 1: apples and oranges, it can be really scary. 108 00:05:13,520 --> 00:05:15,880 Speaker 2: What do the experts say about bots? 109 00:05:15,920 --> 00:05:18,640 Speaker 3: Because, you know, they are a fairly new phenomenon, but 110 00:05:18,880 --> 00:05:20,880 Speaker 3: there must be a body of research out there.
111 00:05:21,279 --> 00:05:23,119 Speaker 1: I reached out to a bot doc. 112 00:05:23,520 --> 00:05:26,239 Speaker 2: Of course, a bot doc has a doc of bots. 113 00:05:26,400 --> 00:05:28,239 Speaker 1: She does have a real name. Her name is doctor 114 00:05:28,279 --> 00:05:32,880 Speaker 1: Sophia Melanson Ricciardone from Canada's McMaster University. Now, she 115 00:05:33,040 --> 00:05:37,880 Speaker 1: did a PhD on botaganda: when bot armies generate 116 00:05:38,120 --> 00:05:42,320 Speaker 1: mass content to saturate social media feeds and then manipulate audiences. 117 00:05:42,920 --> 00:05:46,720 Speaker 1: So doctor Ricciardone told me about something called hashtag flooding, 118 00:05:47,080 --> 00:05:51,520 Speaker 1: which is essentially a tweet containing nothing but popularized keywords 119 00:05:51,560 --> 00:05:54,360 Speaker 1: and catchphrases in the form of hashtags. 120 00:05:53,839 --> 00:05:56,880 Speaker 2: Hashtag after hashtag after hashtag, so annoying, so annoying. 121 00:05:57,200 --> 00:06:01,039 Speaker 1: Now, this tactic, as well as the rapid resharing of 122 00:06:01,120 --> 00:06:05,520 Speaker 1: human posts, so twenty five thousand in ten minutes, can create, 123 00:06:05,680 --> 00:06:08,359 Speaker 1: like we were saying before, this illusion of widespread support 124 00:06:08,839 --> 00:06:13,000 Speaker 1: or widespread opposition for specific viewpoints. She says that this 125 00:06:13,040 --> 00:06:16,080 Speaker 1: happens as though the idea embedded in the tweet came 126 00:06:16,120 --> 00:06:20,839 Speaker 1: from grassroots popularization. So when we're talking about mimicking human behavior, 127 00:06:20,880 --> 00:06:23,320 Speaker 1: it's this feeling of, oh, everyone thinks 128 00:06:23,080 --> 00:06:25,400 Speaker 3: this, and you want to be a part of the everyone. 129 00:06:25,520 --> 00:06:27,600 Speaker 3: Like that's the human condition, right? Exactly. 130 00:06:27,680 --> 00:06:30,240 Speaker 1: It's the concept of herd mentality, which is the idea 131 00:06:30,279 --> 00:06:34,280 Speaker 1: that individuals naturally want to conform to the dominant view 132 00:06:34,400 --> 00:06:35,120 Speaker 1: of the community. 133 00:06:35,480 --> 00:06:38,880 Speaker 3: Yeah, I mean, it's so fascinating, and the reason that 134 00:06:38,960 --> 00:06:41,920 Speaker 3: you came to, I don't want to say, become obsessed 135 00:06:41,920 --> 00:06:45,160 Speaker 3: with this topic, but I mean what started your fascination 136 00:06:45,640 --> 00:06:49,359 Speaker 3: was around one example that we've seen in the last 137 00:06:49,360 --> 00:06:50,159 Speaker 3: couple of years. 138 00:06:50,600 --> 00:06:53,000 Speaker 2: Talk me through the Johnny Depp Amber Heard story. 139 00:06:53,600 --> 00:06:56,799 Speaker 1: I don't know if many people remember what the internet 140 00:06:56,839 --> 00:06:59,560 Speaker 1: looked like in twenty twenty two during the Johnny Depp 141 00:06:59,600 --> 00:07:02,520 Speaker 1: Amber Heard defamation trial, but I recently listened to a 142 00:07:02,560 --> 00:07:06,200 Speaker 1: six-part podcast that brought me right back there. So 143 00:07:06,400 --> 00:07:09,080 Speaker 1: for people who need a refresher on the story: essentially, 144 00:07:09,160 --> 00:07:12,400 Speaker 1: Johnny Depp filed a lawsuit against his ex-wife Amber 145 00:07:12,440 --> 00:07:15,840 Speaker 1: Heard over an opinion article Heard wrote for The Washington Post.
146 00:07:16,600 --> 00:07:20,560 Speaker 1: She alleged she had experienced domestic abuse. Heard didn't name Depp, 147 00:07:20,640 --> 00:07:24,120 Speaker 1: but he launched defamation proceedings against her, arguing he was 148 00:07:24,240 --> 00:07:27,760 Speaker 1: identifiable from the article, and Depp denies claims that he 149 00:07:27,800 --> 00:07:32,239 Speaker 1: physically abused her. Now, a jury in Fairfax County, Virginia 150 00:07:32,400 --> 00:07:35,960 Speaker 1: ultimately sided with Depp, and Heard was found to have defamed him. 151 00:07:36,000 --> 00:07:38,200 Speaker 1: But I think you might remember that the court proceedings 152 00:07:38,240 --> 00:07:39,640 Speaker 1: were streamed online. 153 00:07:39,760 --> 00:07:42,280 Speaker 3: Yeah, so I was going to say, can you, in 154 00:07:42,320 --> 00:07:44,800 Speaker 3: the minds of our listeners, connect what we've just been 155 00:07:44,840 --> 00:07:46,800 Speaker 3: talking about, which is, you know, all of this stuff 156 00:07:46,840 --> 00:07:49,880 Speaker 3: about bots, and this case. What's the connection here? 157 00:07:50,360 --> 00:07:53,320 Speaker 1: So as the case played out in the courtroom, the 158 00:07:53,440 --> 00:07:58,120 Speaker 1: internet mounted its own unofficial trial of Amber Heard. This 159 00:07:58,200 --> 00:08:00,400 Speaker 1: is the premise of the podcast I was talking about earlier, 160 00:08:00,440 --> 00:08:04,720 Speaker 1: Who Trolled Amber? It's by UK media organization Tortoise. The 161 00:08:04,800 --> 00:08:07,800 Speaker 1: podcast found that a large part of the online hate 162 00:08:07,840 --> 00:08:13,520 Speaker 1: campaign against her was actually manufactured and executed by bot accounts. 163 00:08:13,720 --> 00:08:16,520 Speaker 1: Now I spoke to Xavier Greenwood; he produced the Who 164 00:08:16,560 --> 00:08:17,920 Speaker 1: Trolled Amber? podcast. 165 00:08:18,080 --> 00:08:21,200 Speaker 4: Some of the main themes were that Amber Heard deserves prison, 166 00:08:21,280 --> 00:08:24,240 Speaker 4: Amber Heard is a gold digger, Amber Heard is a fraud, 167 00:08:24,320 --> 00:08:26,520 Speaker 4: Amber Heard is a liar. To give you a sense 168 00:08:26,520 --> 00:08:29,560 Speaker 4: of the scale of things, there was a hashtag Justice 169 00:08:29,560 --> 00:08:33,080 Speaker 4: for Johnny Depp that was viewed fifteen point seven billion 170 00:08:33,160 --> 00:08:36,520 Speaker 4: times. Hashtag Justice for Amber Heard was viewed a fraction of that, 171 00:08:36,960 --> 00:08:40,240 Speaker 4: and my friends were suddenly saying things that didn't really 172 00:08:40,280 --> 00:08:42,640 Speaker 4: sound like them. They were saying, well, you know, what 173 00:08:42,760 --> 00:08:45,040 Speaker 4: if this time she was the abuser? And these were 174 00:08:45,040 --> 00:08:47,520 Speaker 4: people who typically, in the Me Too movement, would be 175 00:08:47,520 --> 00:08:50,120 Speaker 4: a bit more cynical or maybe a bit more reserved 176 00:08:50,120 --> 00:08:53,680 Speaker 4: in making a judgment like that. So yeah, from the beginning, 177 00:08:54,120 --> 00:08:56,720 Speaker 4: we sort of saw that this was quite suspicious. 178 00:08:57,000 --> 00:08:59,200 Speaker 3: What else did this investigation uncover?
179 00:08:59,840 --> 00:09:04,000 Speaker 1: The team at Tortoise brought together a database of over one 180 00:09:04,120 --> 00:09:07,439 Speaker 1: million anti-Amber Heard tweets, 181 00:09:07,520 --> 00:09:11,200 Speaker 1: and they found that more than half of them were inauthentic. 182 00:09:11,760 --> 00:09:14,000 Speaker 1: So what that means is either they were posted from 183 00:09:14,080 --> 00:09:17,200 Speaker 1: spam accounts with three followers and they were built in 184 00:09:17,240 --> 00:09:20,640 Speaker 1: the last two months, or they were amplified in an 185 00:09:20,679 --> 00:09:25,200 Speaker 1: inauthentic way, so reshared and reshared thousands upon thousands of times. 186 00:09:25,840 --> 00:09:29,320 Speaker 3: Okay, I think the thing that I wonder when I 187 00:09:29,360 --> 00:09:32,400 Speaker 3: hear about this story is, like, who is behind this 188 00:09:32,480 --> 00:09:35,200 Speaker 3: trolling campaign? It's very clear who benefits from it, but 189 00:09:35,280 --> 00:09:36,520 Speaker 3: who's behind it? 190 00:09:36,760 --> 00:09:39,800 Speaker 1: We don't know. According to the investigation, the team could 191 00:09:39,920 --> 00:09:44,120 Speaker 1: hypothesize multiple different scenarios, and there likely were multiple different 192 00:09:44,280 --> 00:09:45,120 Speaker 1: agendas at play. 193 00:09:45,200 --> 00:09:47,359 Speaker 2: Yeah, it's crazy. So this could 194 00:09:47,080 --> 00:09:50,840 Speaker 1: be genuine Johnny Depp fans, who can acquire bots for 195 00:09:50,840 --> 00:09:54,360 Speaker 1: pretty cheap online, and push support for their favorite actor. 196 00:09:54,760 --> 00:09:57,720 Speaker 1: In other scenarios, Xavier floats what he calls a more 197 00:09:57,760 --> 00:09:58,680 Speaker 1: abstract theory. 198 00:09:59,040 --> 00:10:01,760 Speaker 4: I think it's sort of fairly widely known now that 199 00:10:01,960 --> 00:10:06,440 Speaker 4: authoritarian regimes who want to sow discord or want to 200 00:10:06,440 --> 00:10:10,480 Speaker 4: cause confusion in the West, that they seek out wedge issues. 201 00:10:10,520 --> 00:10:14,720 Speaker 4: So they seek out issues which divide Britons, divide Americans, 202 00:10:14,760 --> 00:10:18,160 Speaker 4: divide Australians. They try to sort of drive a deeper wedge. 203 00:10:18,440 --> 00:10:21,160 Speaker 1: But something that really stuck with me from my conversation 204 00:10:21,240 --> 00:10:23,959 Speaker 1: with Xavier was a point he made about the future 205 00:10:23,960 --> 00:10:26,600 Speaker 1: of political discourse on social media. 206 00:10:26,679 --> 00:10:30,880 Speaker 4: If you can attack a celebrity who has an 207 00:10:31,000 --> 00:10:34,400 Speaker 4: enormous amount of resources, as Amber Heard did, what's to stop 208 00:10:34,600 --> 00:10:37,480 Speaker 4: someone doing the same thing when it comes to attacking 209 00:10:37,520 --> 00:10:40,120 Speaker 4: a politician or when it comes to trying to sway 210 00:10:40,240 --> 00:10:40,800 Speaker 4: an election? 211 00:10:41,440 --> 00:10:44,720 Speaker 3: I mean, we're talking there about elections, and we of 212 00:10:44,760 --> 00:10:47,160 Speaker 3: course know that there is one coming up just around 213 00:10:47,160 --> 00:10:50,280 Speaker 3: the corner in the US. Are we expecting to see 214 00:10:50,320 --> 00:10:53,200 Speaker 3: this sort of interference from bots in the US election 215 00:10:53,679 --> 00:10:54,160 Speaker 3: this year?
216 00:10:54,600 --> 00:10:59,400 Speaker 1: We already have. Last month, the Biden administration charged Russian 217 00:10:59,600 --> 00:11:03,960 Speaker 1: media executives over an alleged targeted online campaign to influence 218 00:11:04,200 --> 00:11:08,240 Speaker 1: voters in the US and push hidden Russian government messaging. 219 00:11:08,559 --> 00:11:12,120 Speaker 1: Now it comes after US officials seized thirty-two internet 220 00:11:12,120 --> 00:11:16,760 Speaker 1: domain names that were covertly targeting specific demographics on social 221 00:11:16,760 --> 00:11:20,360 Speaker 1: media and promoting AI-generated false narratives and pushing that 222 00:11:20,559 --> 00:11:21,439 Speaker 1: to those groups. 223 00:11:21,800 --> 00:11:24,640 Speaker 3: Okay, so there are already allegations that we've seen this 224 00:11:24,880 --> 00:11:27,600 Speaker 3: at play when it comes to the US election. Yeah. 225 00:11:28,240 --> 00:11:31,280 Speaker 3: I mean, I think that a natural endpoint here is 226 00:11:31,320 --> 00:11:33,120 Speaker 3: that if we're talking about the fact that even the 227 00:11:33,120 --> 00:11:37,160 Speaker 3: most powerful institutions, people, whatever, in the world can be 228 00:11:37,200 --> 00:11:40,480 Speaker 3: susceptible to this kind of botaganda, as we'll call it, 229 00:11:41,240 --> 00:11:43,200 Speaker 3: what is, like, an average Joe like you or I 230 00:11:43,280 --> 00:11:46,360 Speaker 3: meant to do when it comes to protecting yourself against something 231 00:11:46,200 --> 00:11:48,160 Speaker 1: like this? Well, that was what came back in so 232 00:11:48,200 --> 00:11:50,640 Speaker 1: many of the conversations I was having. For the average 233 00:11:50,679 --> 00:11:52,520 Speaker 1: Joe like you and me, a lot of it comes 234 00:11:52,559 --> 00:11:56,839 Speaker 1: down to awareness: staying vigilant, staying curious, questioning what you're 235 00:11:56,840 --> 00:11:59,559 Speaker 1: getting served, who's pushing it. And that's generally for anything 236 00:11:59,600 --> 00:12:02,559 Speaker 1: that you've seen online, but particularly when you're thinking something 237 00:12:02,600 --> 00:12:06,440 Speaker 1: looks a bit spammy, it likely might be. It's important 238 00:12:06,480 --> 00:12:08,440 Speaker 1: to be critical about the content you're seeing before you 239 00:12:08,640 --> 00:12:12,440 Speaker 1: form an opinion, especially ahead of an election. Now, as 240 00:12:12,480 --> 00:12:15,640 Speaker 1: for what comes next, Xavier Greenwood says we're largely in 241 00:12:15,800 --> 00:12:16,840 Speaker 1: uncharted waters. 242 00:12:17,400 --> 00:12:19,800 Speaker 4: What we saw with the Amber Heard trial, we may well 243 00:12:20,040 --> 00:12:22,400 Speaker 4: continue to see in an even more intense way in 244 00:12:22,440 --> 00:12:24,640 Speaker 4: the future. To some extent, the genie is out of 245 00:12:24,640 --> 00:12:25,040 Speaker 4: the bottle. 246 00:12:25,920 --> 00:12:29,840 Speaker 3: Thanks for listening to today's episode of TDA's Summer Series. 247 00:12:30,120 --> 00:12:32,640 Speaker 3: We'll be back again tomorrow with another of our favorite 248 00:12:32,679 --> 00:12:39,680 Speaker 3: deep dives, but until then, have a wonderfully warm day. 249 00:12:39,960 --> 00:12:42,280 Speaker 4: My name is Lily Madden and I'm a proud Arrernte 250 00:12:42,520 --> 00:12:47,280 Speaker 4: Bundjalung Kalkadoon woman from Gadigal Country.
The Daily Aus acknowledges 251 00:12:47,360 --> 00:12:49,560 Speaker 4: that this podcast is recorded on the lands of the 252 00:12:49,559 --> 00:12:53,120 Speaker 4: Gadigal people and pays respect to all Aboriginal and Torres 253 00:12:53,120 --> 00:12:56,040 Speaker 4: Strait Islander nations. We pay our respects to the 254 00:12:56,080 --> 00:12:58,840 Speaker 4: First Peoples of these countries, both past and present.