1 00:00:00,520 --> 00:00:04,400 Speaker 1: Alrighty, and this is the Daily... This is the Daily OS. 2 00:00:05,120 --> 00:00:06,840 Speaker 1: Oh, now it makes sense. 3 00:00:14,640 --> 00:00:17,360 Speaker 2: Good morning and welcome to the Daily OS. It's Sunday, 4 00:00:17,400 --> 00:00:19,640 Speaker 2: the second of March. I'm Billy, I'm Zara. 5 00:00:19,920 --> 00:00:23,560 Speaker 1: We're back in your ears on a Sunday morning again. Again. 6 00:00:23,720 --> 00:00:27,120 Speaker 2: We love surprising you with a new episode on Sundays. 7 00:00:27,160 --> 00:00:29,080 Speaker 1: And Zara, I think today's topic 8 00:00:28,960 --> 00:00:33,800 Speaker 2: is very relevant and super interesting. So we 9 00:00:33,880 --> 00:00:37,120 Speaker 2: are nearly, at any moment now, about to enter election 10 00:00:37,240 --> 00:00:38,600 Speaker 2: season in Australia. 11 00:00:38,760 --> 00:00:41,280 Speaker 3: The number of times that we have said that on this podcast: 12 00:00:41,520 --> 00:00:45,760 Speaker 3: we're nearly there, we're just there, however many synonyms we can 13 00:00:45,840 --> 00:00:48,040 Speaker 3: say for we are close to an election. 14 00:00:48,320 --> 00:00:49,600 Speaker 1: We are close to an election. 15 00:00:49,880 --> 00:00:52,120 Speaker 2: So there is one that is due to be held 16 00:00:52,240 --> 00:00:55,400 Speaker 2: on the seventeenth of May, or before, if Prime Minister 17 00:00:55,480 --> 00:00:58,720 Speaker 2: Anthony Albanese decides for us to go to an election earlier, 18 00:00:59,080 --> 00:01:01,560 Speaker 2: and that could mean that we are about to enter 19 00:01:01,640 --> 00:01:05,880 Speaker 2: a period of dis- and misinformation in Australia. So today 20 00:01:06,040 --> 00:01:09,160 Speaker 2: we are talking about a very specific type of false 21 00:01:09,200 --> 00:01:10,720 Speaker 2: information, and that 22 00:01:10,760 --> 00:01:11,759 Speaker 1: is deep fakes. 23 00:01:12,040 --> 00:01:14,920 Speaker 2: More specifically, we want to talk about how deep fakes 24 00:01:15,000 --> 00:01:18,720 Speaker 2: can impact elections, and in this context we're talking about 25 00:01:18,760 --> 00:01:22,520 Speaker 2: deep fakes that are deceptive and harmful. A quick note 26 00:01:22,600 --> 00:01:26,360 Speaker 2: before we start. This podcast is produced with financial support 27 00:01:26,440 --> 00:01:31,280 Speaker 2: from Microsoft. However, the content does remain entirely independent. And 28 00:01:31,360 --> 00:01:34,560 Speaker 2: if you're wondering why Microsoft would be interested in sponsoring 29 00:01:34,640 --> 00:01:36,360 Speaker 2: a podcast on deep fakes. 30 00:01:36,080 --> 00:01:37,319 Speaker 4: I mean, I was quite interested. 31 00:01:37,480 --> 00:01:40,000 Speaker 2: Well, you... yeah, because you might not know that they 32 00:01:40,040 --> 00:01:43,360 Speaker 2: are very involved in this space and they have teams 33 00:01:43,360 --> 00:01:46,560 Speaker 2: that track the threat actors who are often the 34 00:01:46,600 --> 00:01:48,680 Speaker 2: ones creating deep fakes. 35 00:01:49,040 --> 00:01:52,160 Speaker 3: Very interesting. Now, Billy, you have used the word deep 36 00:01:52,160 --> 00:01:55,240 Speaker 3: fake a record number of times in the introduction of 37 00:01:55,320 --> 00:01:58,600 Speaker 3: a podcast. For anyone who's not familiar with the term, though, 38 00:01:58,600 --> 00:02:00,960 Speaker 3: what are you talking about when you say deep fake?
39 00:02:01,320 --> 00:02:05,160 Speaker 2: So they are digital photos or videos of someone that 40 00:02:05,400 --> 00:02:09,880 Speaker 2: look very real, but they have actually been created using 41 00:02:10,000 --> 00:02:14,880 Speaker 2: artificial intelligence and they are fake, although often they're not 42 00:02:15,240 --> 00:02:18,240 Speaker 2: entirely fake. So it could be a video where ninety 43 00:02:18,280 --> 00:02:21,440 Speaker 2: percent of that video is real, but then there's ten 44 00:02:21,520 --> 00:02:25,640 Speaker 2: percent that is generated by AI and is fake. But 45 00:02:25,680 --> 00:02:28,280 Speaker 2: it's that ten percent that is really deceptive. 46 00:02:28,520 --> 00:02:29,280 Speaker 4: Well, I guess yeah. 47 00:02:29,320 --> 00:02:31,680 Speaker 3: When you're looking at something and so much of it 48 00:02:31,720 --> 00:02:34,120 Speaker 3: looks real, your sense of what's real and what's fake 49 00:02:34,200 --> 00:02:34,480 Speaker 3: is so 50 00:02:34,480 --> 00:02:37,880 Speaker 2: distorted, exactly, and that is a key feature of deep fakes, 51 00:02:37,880 --> 00:02:42,639 Speaker 2: that they are highly, highly realistic. Now, one famous example 52 00:02:42,639 --> 00:02:44,880 Speaker 2: that comes to my mind when we talk about deep 53 00:02:44,880 --> 00:02:46,880 Speaker 2: fakes is, I don't know if you've seen this, Zara, 54 00:02:47,040 --> 00:02:49,359 Speaker 2: or if you remember this, but the image of Pope 55 00:02:49,360 --> 00:02:53,600 Speaker 2: Francis in a Balenciaga quilted jacket. He's kind of walking 56 00:02:53,720 --> 00:02:58,120 Speaker 2: the streets in this massive white jacket, and it went 57 00:02:58,240 --> 00:03:01,480 Speaker 2: completely viral at the time in March twenty twenty-three, 58 00:03:01,680 --> 00:03:04,200 Speaker 2: and a lot of people obviously didn't know it was 59 00:03:04,240 --> 00:03:07,480 Speaker 2: a fake photo because it looked so real, and two 60 00:03:07,600 --> 00:03:11,280 Speaker 2: years ago, there weren't really the conversations happening at the 61 00:03:11,280 --> 00:03:13,560 Speaker 2: time that there are now, when we are much more 62 00:03:13,600 --> 00:03:16,920 Speaker 2: aware that there are so many fake images, and obviously 63 00:03:17,240 --> 00:03:19,799 Speaker 2: we are talking about AI so much, we're even talking 64 00:03:19,800 --> 00:03:21,680 Speaker 2: about AI so much more today than we were a 65 00:03:21,680 --> 00:03:24,600 Speaker 2: month ago, let alone two years ago, and so that 66 00:03:24,760 --> 00:03:26,640 Speaker 2: is, I would say, one of the 67 00:03:26,639 --> 00:03:31,000 Speaker 2: more innocuous examples of deep fakes, where although it was fake, 68 00:03:31,080 --> 00:03:34,600 Speaker 2: it didn't necessarily do any damage by people believing that 69 00:03:34,720 --> 00:03:37,320 Speaker 2: it was real. But why we're talking about it today 70 00:03:37,440 --> 00:03:41,040 Speaker 2: is because deep fakes can be, and they have been, used 71 00:03:41,040 --> 00:03:44,520 Speaker 2: in a much more deceptive way, and one of those 72 00:03:44,560 --> 00:03:46,600 Speaker 2: ways is during elections. 73 00:03:46,920 --> 00:03:48,400 Speaker 4: I see how you're bringing this all together. 74 00:03:49,360 --> 00:03:50,240 Speaker 1: We're circling that.
75 00:03:50,960 --> 00:03:54,960 Speaker 2: So there have been examples around the world where politicians 76 00:03:55,080 --> 00:03:58,200 Speaker 2: or political candidates, or even just people who want a 77 00:03:58,200 --> 00:04:02,000 Speaker 2: certain party to win, have used deep fakes to mislead 78 00:04:02,080 --> 00:04:05,400 Speaker 2: voters and to benefit their campaign. Or it could not 79 00:04:05,520 --> 00:04:07,560 Speaker 2: even be for that reason, it could just be to 80 00:04:07,920 --> 00:04:09,760 Speaker 2: cause chaos or cause disruption. 81 00:04:10,360 --> 00:04:12,840 Speaker 3: Okay. So we're talking about it because we're in the 82 00:04:12,880 --> 00:04:16,160 Speaker 3: lead-up to an election, and we've seen, not just 83 00:04:16,400 --> 00:04:19,560 Speaker 3: here but across the world, examples of deep fakes being 84 00:04:19,720 --> 00:04:23,080 Speaker 3: used during an election cycle. Can you talk me through 85 00:04:23,080 --> 00:04:26,520 Speaker 3: some of the examples that we've seen of this and 86 00:04:26,560 --> 00:04:27,599 Speaker 3: how it's been used? 87 00:04:27,839 --> 00:04:28,000 Speaker 4: Yeah. 88 00:04:28,000 --> 00:04:31,520 Speaker 2: I thought we would start in Australia, because you might 89 00:04:31,560 --> 00:04:34,320 Speaker 2: not be aware that there have actually been examples in 90 00:04:34,440 --> 00:04:38,560 Speaker 2: Australian elections, or by Australian parties, of using deep fakes. 91 00:04:38,920 --> 00:04:42,839 Speaker 2: So, Queensland, I'm sure you've heard of it, you're familiar. They 92 00:04:43,000 --> 00:04:46,800 Speaker 2: had an election last year, and the Liberal National Party 93 00:04:47,000 --> 00:04:51,400 Speaker 2: posted a video of the then Premier, Steven Miles, dancing, 94 00:04:51,800 --> 00:04:55,000 Speaker 2: with text beneath it that said "POV: my rent is 95 00:04:55,080 --> 00:04:57,400 Speaker 2: up sixty dollars a week, my power bill is up 96 00:04:57,440 --> 00:05:00,720 Speaker 2: twenty percent, but the Premier made a sandwich on TikTok." 97 00:05:00,960 --> 00:05:04,000 Speaker 2: So basically they were saying that the premier at the 98 00:05:04,000 --> 00:05:06,720 Speaker 2: time was dancing and was happy despite the fact that 99 00:05:07,000 --> 00:05:09,799 Speaker 2: the rent in the state had gone up 100 00:05:09,800 --> 00:05:13,760 Speaker 2: for the average renter. Now, that video was actually labeled 101 00:05:13,920 --> 00:05:17,240 Speaker 2: as AI generated, although I would say that the label 102 00:05:17,440 --> 00:05:20,880 Speaker 2: was pretty easy to miss if you weren't necessarily looking 103 00:05:20,880 --> 00:05:22,880 Speaker 2: out for it, and so it was just this completely 104 00:05:22,920 --> 00:05:25,200 Speaker 2: fake video of Steven Miles dancing. 105 00:05:26,000 --> 00:05:28,839 Speaker 3: I remember seeing that at the time and it coming up 106 00:05:28,880 --> 00:05:30,880 Speaker 3: on my For You page, it did, and I remember 107 00:05:30,920 --> 00:05:35,159 Speaker 3: thinking that it was the first time I recall seeing 108 00:05:35,200 --> 00:05:36,880 Speaker 3: something like that here in Australia. 109 00:05:37,080 --> 00:05:40,280 Speaker 2: Yeah, and it generated a lot of media coverage. But 110 00:05:40,360 --> 00:05:42,920 Speaker 2: a lot of that coverage then pointed out that 111 00:05:43,320 --> 00:05:46,560 Speaker 2: this is not unique to one side of politics.
And 112 00:05:46,800 --> 00:05:50,600 Speaker 2: actually around that time, well, actually before that video was posted, 113 00:05:50,920 --> 00:05:54,240 Speaker 2: the Australian Labor Party had also posted a video on 114 00:05:54,279 --> 00:05:57,760 Speaker 2: their TikTok that had used AI to create a fake 115 00:05:57,839 --> 00:06:01,440 Speaker 2: video of federal Opposition Leader Peter Dutton dancing, with 116 00:06:01,520 --> 00:06:04,440 Speaker 2: the words "dance if you want to build nuclear power 117 00:06:04,480 --> 00:06:07,600 Speaker 2: plants in everyone's backyard." Zara, I'm going to send you 118 00:06:07,680 --> 00:06:08,920 Speaker 2: the video for you to look at 119 00:06:09,000 --> 00:06:09,520 Speaker 1: right now. 120 00:06:10,320 --> 00:06:12,680 Speaker 2: I realize this is an audio platform and so the 121 00:06:12,760 --> 00:06:15,200 Speaker 2: audience can't see it. But Zara, I want to get 122 00:06:15,200 --> 00:06:15,760 Speaker 2: your reaction. 123 00:06:16,120 --> 00:06:19,280 Speaker 3: My god. Oh my god. 124 00:06:19,520 --> 00:06:23,120 Speaker 2: I'm going to say, it's probably not the most realistic. 125 00:06:23,240 --> 00:06:25,200 Speaker 4: It looks nothing like Peter Dutton. 126 00:06:25,000 --> 00:06:27,640 Speaker 2: Okay, but you can see that it's trying to kind 127 00:06:27,680 --> 00:06:31,680 Speaker 2: of imitate him, vaguely. Sure. Look, this is one of 128 00:06:31,720 --> 00:06:35,400 Speaker 2: the examples where it is pretty clear to most people 129 00:06:35,200 --> 00:06:36,400 Speaker 1: that it was a fake video. 130 00:06:36,760 --> 00:06:39,640 Speaker 3: However, it must be said that, like, you and I 131 00:06:39,760 --> 00:06:42,440 Speaker 3: look at Peter Dutton day in and day out because we 132 00:06:42,560 --> 00:06:44,960 Speaker 3: work in news, we're very familiar with what he looks like. 133 00:06:45,000 --> 00:06:48,200 Speaker 3: I imagine for someone that's perhaps not paying the same amount 134 00:06:48,240 --> 00:06:50,159 Speaker 3: of attention, maybe they wouldn't be able to tell. 135 00:06:50,040 --> 00:06:53,160 Speaker 2: Yes, exactly. And also, I think the Steven Miles one, 136 00:06:53,200 --> 00:06:55,039 Speaker 2: if you look at that one, it was a lot 137 00:06:55,120 --> 00:06:59,520 Speaker 2: more realistic. Yeah, but again, these are videos of the politicians, 138 00:07:00,000 --> 00:07:03,600 Speaker 2: and so it is one of the more benign examples 139 00:07:03,640 --> 00:07:06,640 Speaker 2: of how deep fakes can be used in elections. 140 00:07:07,240 --> 00:07:10,080 Speaker 3: Okay. So if you're saying that those are benign examples, 141 00:07:10,160 --> 00:07:12,960 Speaker 3: or perhaps more benign examples, what are some of the 142 00:07:12,960 --> 00:07:16,120 Speaker 3: more troublesome, deceptive examples that we've got out there? 143 00:07:16,280 --> 00:07:19,560 Speaker 2: Yeah, so if we look overseas, there are quite a 144 00:07:19,600 --> 00:07:24,640 Speaker 2: few more troublesome examples.
So one example that the audience 145 00:07:24,720 --> 00:07:26,440 Speaker 2: might be familiar with, or might have seen in their 146 00:07:26,440 --> 00:07:29,720 Speaker 2: feeds, is that during the US election last year, in 147 00:07:29,760 --> 00:07:33,040 Speaker 2: the lead-up to that election, Elon Musk shared a 148 00:07:33,160 --> 00:07:36,960 Speaker 2: video that looked like an ad campaign for Kamala Harris, 149 00:07:37,240 --> 00:07:41,000 Speaker 2: who was the Democratic candidate for the US presidency last year. 150 00:07:41,360 --> 00:07:42,560 Speaker 2: Here's a bit of that clip. 151 00:07:43,040 --> 00:07:46,680 Speaker 5: I, Kamala Harris, am your Democrat candidate for president because 152 00:07:46,800 --> 00:07:50,400 Speaker 5: Joe Biden finally exposed his senility at the debate. Thanks, Joe. 153 00:07:50,760 --> 00:07:53,520 Speaker 5: I was selected because I am the ultimate diversity hire. 154 00:07:53,600 --> 00:07:56,080 Speaker 5: I'm both a woman and a person of color. So 155 00:07:56,160 --> 00:07:59,600 Speaker 5: if you criticize anything I say, you're both sexist and racist. 156 00:08:00,120 --> 00:08:03,880 Speaker 3: That one's crazy because that sounds exactly like her. 157 00:08:04,160 --> 00:08:07,119 Speaker 1: Yeah, it would be. Look at the video. 158 00:08:07,400 --> 00:08:10,160 Speaker 2: Yeah, if you were watching it, it would be really 159 00:08:10,200 --> 00:08:13,120 Speaker 2: hard to tell that that is a fake video. And 160 00:08:13,200 --> 00:08:16,960 Speaker 2: you probably couldn't immediately tell unless you were in tune 161 00:08:17,000 --> 00:08:19,880 Speaker 2: with every single thing Kamala Harris has ever said. You 162 00:08:19,880 --> 00:08:22,080 Speaker 2: would have to do some digging to figure out if 163 00:08:22,080 --> 00:08:25,160 Speaker 2: that was real or fake, especially when it's shared by 164 00:08:25,400 --> 00:08:30,280 Speaker 2: a very influential person. Now, that video received nearly one 165 00:08:30,360 --> 00:08:35,080 Speaker 2: hundred and forty million views according to X's metrics, and 166 00:08:35,360 --> 00:08:38,840 Speaker 2: it was liked by nearly one million accounts. Wow. So 167 00:08:38,880 --> 00:08:41,200 Speaker 2: that gives you an idea of how viral this kind 168 00:08:41,280 --> 00:08:42,520 Speaker 2: of content can go. 169 00:08:43,080 --> 00:08:46,120 Speaker 4: I'm just looking at the tweet responses. What's that called? 170 00:08:46,240 --> 00:08:47,360 Speaker 4: Yeah, X responses? 171 00:08:47,400 --> 00:08:51,079 Speaker 3: An X response. Someone who was retweeted one hundred and 172 00:08:51,120 --> 00:08:54,080 Speaker 3: ninety-three times and liked ten thousand times said: 173 00:08:54,000 --> 00:08:55,160 Speaker 4: Is this AI or real? 174 00:08:55,400 --> 00:08:55,640 Speaker 3: Yeah. 175 00:08:55,880 --> 00:08:57,280 Speaker 4: So clearly, genuinely no idea. 176 00:08:57,440 --> 00:09:00,920 Speaker 2: Yeah, it clearly wasn't that clear to many, many people. 177 00:09:01,120 --> 00:09:03,120 Speaker 2: As I'm sure you've guessed, the reason we're talking about 178 00:09:03,160 --> 00:09:06,720 Speaker 2: it is because it was completely fake. Kamala Harris never 179 00:09:06,800 --> 00:09:11,000 Speaker 2: said those words, so that voice was completely AI generated, 180 00:09:11,120 --> 00:09:16,360 Speaker 2: but it sounded and looked extremely realistic.
Now, it's hard 181 00:09:16,440 --> 00:09:19,959 Speaker 2: to measure what impact a video like that had on 182 00:09:20,040 --> 00:09:22,520 Speaker 2: the minds of voters in the US. It could have 183 00:09:22,640 --> 00:09:27,199 Speaker 2: had no impact at all, or it could have been quite significant, again 184 00:09:27,360 --> 00:09:30,880 Speaker 2: especially when shared by a very influential person such as 185 00:09:31,120 --> 00:09:35,000 Speaker 2: Musk. So that's one example, but there are also examples 186 00:09:35,080 --> 00:09:40,079 Speaker 2: of foreign countries using deep fakes to interfere with elections. 187 00:09:40,280 --> 00:09:41,600 Speaker 4: Can you unpack that more for me? 188 00:09:41,800 --> 00:09:44,760 Speaker 2: Yeah, it's quite a big concept to get your head around. 189 00:09:44,800 --> 00:09:50,120 Speaker 3: So foreign actors interfering with domestic elections in a different country. 190 00:09:49,960 --> 00:09:52,560 Speaker 2: Exactly. And it's not really something that many of us 191 00:09:52,559 --> 00:09:55,120 Speaker 2: would ever think about, but it's probably something that you've 192 00:09:55,240 --> 00:09:58,560 Speaker 2: heard murmurs about, this idea that there are countries who 193 00:09:58,600 --> 00:10:02,720 Speaker 2: try to interfere with other countries' elections with the aim 194 00:10:02,880 --> 00:10:06,559 Speaker 2: of electing a government they believe will be more closely 195 00:10:06,640 --> 00:10:09,840 Speaker 2: aligned with their own policies and agendas. Or, like I 196 00:10:09,880 --> 00:10:12,120 Speaker 2: said before, it could not be because of that; it 197 00:10:12,160 --> 00:10:16,920 Speaker 2: could be, again, just to create chaos and disruption to society. 198 00:10:17,240 --> 00:10:18,400 Speaker 4: What are some examples of that? 199 00:10:18,800 --> 00:10:20,840 Speaker 2: So if we stick with looking at the US election 200 00:10:20,960 --> 00:10:24,920 Speaker 2: last year, Microsoft actually did a report into how Russian 201 00:10:24,960 --> 00:10:29,800 Speaker 2: groups created deep fakes to try to influence the US election. Now, 202 00:10:29,840 --> 00:10:31,760 Speaker 2: before I go on, I do just want to quickly 203 00:10:31,800 --> 00:10:35,360 Speaker 2: mention that, as we discussed earlier, this podcast is sponsored 204 00:10:35,400 --> 00:10:38,280 Speaker 2: by Microsoft, but we're not required to talk about them 205 00:10:38,320 --> 00:10:41,080 Speaker 2: in this section; they just have done a lot of 206 00:10:41,120 --> 00:10:43,920 Speaker 2: research in this area that I thought was very relevant 207 00:10:43,920 --> 00:10:47,240 Speaker 2: to talk about. Okay, good, clear disclaimer. Just want to 208 00:10:47,240 --> 00:10:50,520 Speaker 2: be completely transparent there. So in a report last year, 209 00:10:50,760 --> 00:10:54,000 Speaker 2: Microsoft found that two groups that were aligned with the 210 00:10:54,080 --> 00:10:59,000 Speaker 2: Russian government, the Kremlin, had created and disseminated videos that 211 00:10:59,120 --> 00:11:03,040 Speaker 2: were, quote, designed to discredit Harris, again talking about Kamala 212 00:11:03,080 --> 00:11:07,920 Speaker 2: Harris, and stoke controversy around her campaign.
So, for example, 213 00:11:07,960 --> 00:11:11,599 Speaker 2: they found this one video that showed Harris supporters attacking 214 00:11:11,720 --> 00:11:14,880 Speaker 2: an attendee at a Trump rally, and that video was 215 00:11:15,000 --> 00:11:19,000 Speaker 2: completely false, but it was still seen by millions and 216 00:11:19,120 --> 00:11:22,760 Speaker 2: millions of people, and there was no labeling or anything 217 00:11:22,920 --> 00:11:24,400 Speaker 2: that indicated 218 00:11:23,800 --> 00:11:25,000 Speaker 1: this video was fake. 219 00:11:25,440 --> 00:11:28,680 Speaker 2: So to someone who was just on their feed scrolling through, 220 00:11:29,040 --> 00:11:32,840 Speaker 2: there weren't any clear signs that this was a fake video. 221 00:11:33,520 --> 00:11:35,880 Speaker 2: And so basically what they found is that these groups 222 00:11:36,000 --> 00:11:40,160 Speaker 2: intentionally created these videos and planted them in front of 223 00:11:40,320 --> 00:11:45,720 Speaker 2: US audiences to sway their opinion, in this case against Harris. 224 00:11:46,120 --> 00:11:48,600 Speaker 3: Yeah, it's really interesting, and again it needs to be said, 225 00:11:48,720 --> 00:11:51,040 Speaker 3: this isn't happening on one side of politics. We've seen 226 00:11:51,080 --> 00:11:55,319 Speaker 3: examples across the aisle when it comes to this. Billy, 227 00:11:55,360 --> 00:11:57,240 Speaker 3: one thing that I think about when you're talking about 228 00:11:57,240 --> 00:12:01,319 Speaker 3: this is the onus that exists on the individual, 229 00:12:01,520 --> 00:12:05,520 Speaker 3: because ultimately we're saying you can be served up fake 230 00:12:05,640 --> 00:12:09,720 Speaker 3: content that is mimicking reality but is not real, and 231 00:12:09,800 --> 00:12:13,360 Speaker 3: sometimes deceptively, sometimes in a more benign way, and we 232 00:12:13,400 --> 00:12:17,280 Speaker 3: are expecting audiences to be able to critically, I guess, 233 00:12:17,320 --> 00:12:22,040 Speaker 3: analyze what they're seeing and understand it. How can people tell 234 00:12:22,120 --> 00:12:24,480 Speaker 3: if videos that they're watching or photos they're seeing are 235 00:12:24,600 --> 00:12:27,240 Speaker 3: real or AI generated? You know, for example, we've seen 236 00:12:27,280 --> 00:12:30,280 Speaker 3: the President of the United States share something recently that 237 00:12:30,360 --> 00:12:32,920 Speaker 3: was completely AI generated and completely fake. 238 00:12:33,160 --> 00:12:33,360 Speaker 4: Yeah. 239 00:12:33,360 --> 00:12:35,199 Speaker 2: I speak to my friends about this all the time, 240 00:12:35,280 --> 00:12:39,120 Speaker 2: because we're always, you know, sending each other videos and whatnot, 241 00:12:39,240 --> 00:12:41,480 Speaker 2: and we're always talking about how do we tell if 242 00:12:41,480 --> 00:12:44,439 Speaker 2: this is real or not? And what's hard is that, 243 00:12:44,640 --> 00:12:48,920 Speaker 2: by definition, deep fakes are incredibly realistic. The whole point 244 00:12:48,960 --> 00:12:49,760 Speaker 2: is for you to think 245 00:12:49,640 --> 00:12:50,440 Speaker 1: that they are real. 246 00:12:51,200 --> 00:12:53,800 Speaker 2: I think the main thing to look for is who 247 00:12:53,840 --> 00:12:56,480 Speaker 2: is sharing this video and what is the primary source, 248 00:12:56,600 --> 00:12:59,199 Speaker 2: or does a primary source even exist?
So if it's 249 00:12:59,240 --> 00:13:03,720 Speaker 2: from an unverified, completely random account, then that's more likely 250 00:13:03,800 --> 00:13:07,040 Speaker 2: to be a fake video than one shared by a 251 00:13:07,080 --> 00:13:12,240 Speaker 2: reputable news outlet or, even better, a primary source. So, Zara, 252 00:13:12,360 --> 00:13:15,120 Speaker 2: as you know, at TDA we are all about the 253 00:13:15,160 --> 00:13:18,360 Speaker 2: primary source, so that means looking at where each and 254 00:13:18,440 --> 00:13:21,679 Speaker 2: every piece of information originates from. So just for some 255 00:13:21,720 --> 00:13:24,840 Speaker 2: context for the listeners, our guiding rule at TDA is 256 00:13:24,840 --> 00:13:28,200 Speaker 2: that every fact we share needs to have a primary source. 257 00:13:28,480 --> 00:13:31,439 Speaker 2: We need to know exactly where it came from and 258 00:13:31,880 --> 00:13:34,960 Speaker 2: go to that place to get that information ourselves. 259 00:13:35,360 --> 00:13:36,800 Speaker 1: So if we look at the video of 260 00:13:36,880 --> 00:13:39,560 Speaker 2: Kamala Harris shared by Elon Musk, that kind of is 261 00:13:39,559 --> 00:13:43,520 Speaker 2: more murky, because obviously Elon Musk's page is verified, so 262 00:13:43,840 --> 00:13:45,880 Speaker 2: I understand why a lot of people would have thought 263 00:13:45,880 --> 00:13:48,560 Speaker 2: it was real. But again, if we are looking for 264 00:13:48,640 --> 00:13:51,280 Speaker 2: the primary source in that case, you would find it 265 00:13:51,320 --> 00:13:55,600 Speaker 2: on either Harris's own verified page or the Democratic Party's 266 00:13:55,720 --> 00:13:59,480 Speaker 2: verified socials. And if you can't find that video on either, 267 00:14:00,000 --> 00:14:02,800 Speaker 2: there's a good chance it's not real. But I 268 00:14:02,800 --> 00:14:06,720 Speaker 2: think you can also consider whether that content perfectly fits 269 00:14:06,720 --> 00:14:10,640 Speaker 2: a particular narrative being pushed by someone, and whether it potentially 270 00:14:10,720 --> 00:14:13,880 Speaker 2: could be too good to be true. And I also 271 00:14:13,920 --> 00:14:16,280 Speaker 2: just want to say that this isn't about making you 272 00:14:16,360 --> 00:14:19,480 Speaker 2: think that everything is fake. That is not what we're saying. 273 00:14:19,720 --> 00:14:22,840 Speaker 2: I think we're just talking about developing a healthy level 274 00:14:22,880 --> 00:14:27,400 Speaker 2: of skepticism, not so much that you completely distrust everything 275 00:14:27,440 --> 00:14:30,480 Speaker 2: you see, but enough to stop and consider where this 276 00:14:30,600 --> 00:14:34,120 Speaker 2: content is from. Could it be fake or manipulated? And, 277 00:14:34,200 --> 00:14:37,000 Speaker 2: if you're basing your opinion on this video that you're seeing, 278 00:14:37,520 --> 00:14:40,240 Speaker 2: should you be checking whether there is a primary source? 279 00:14:40,760 --> 00:14:42,640 Speaker 3: Now, Billy, I just want to end back, I guess, 280 00:14:42,640 --> 00:14:45,720 Speaker 3: where we started, with a discussion about it here in Australia. 281 00:14:45,800 --> 00:14:48,400 Speaker 3: Are there laws that regulate this sort of behavior here?
282 00:14:48,760 --> 00:14:51,440 Speaker 2: Yeah, our audience has asked us a lot about this, 283 00:14:51,800 --> 00:14:54,640 Speaker 2: and essentially the answer is that there is nothing that 284 00:14:54,720 --> 00:14:59,120 Speaker 2: prohibits the use of AI in election campaigns, also remembering 285 00:14:59,160 --> 00:15:03,200 Speaker 2: that this is quite new technology. There is, however, regulation 286 00:15:03,360 --> 00:15:07,640 Speaker 2: about campaign communication, so, you know, information coming from the 287 00:15:07,720 --> 00:15:11,280 Speaker 2: Labor Party directly, or the Liberal Party, or whatever party 288 00:15:11,360 --> 00:15:15,600 Speaker 2: or political candidate; that communication is regulated. But as we 289 00:15:15,640 --> 00:15:18,600 Speaker 2: saw in many of the examples above, the deep fakes 290 00:15:18,600 --> 00:15:23,200 Speaker 2: that are misleading voters aren't usually coming from political parties directly. 291 00:15:23,240 --> 00:15:27,240 Speaker 2: They're coming from other third-party actors who perhaps have 292 00:15:27,400 --> 00:15:31,320 Speaker 2: a certain agenda to push. It is a criminal offense 293 00:15:31,360 --> 00:15:35,160 Speaker 2: under the Electoral Act to mislead or deceive an elector 294 00:15:34,800 --> 00:15:38,000 Speaker 1: when it comes to casting a vote. But again, this 295 00:15:37,960 --> 00:15:41,160 Speaker 2: becomes harder to apply to videos posted on random accounts 296 00:15:41,160 --> 00:15:44,040 Speaker 2: where the exact source of that video or image is 297 00:15:44,080 --> 00:15:44,520 Speaker 2: not known. 298 00:15:45,040 --> 00:15:49,000 Speaker 3: Look, a really complex area, and especially as it interacts 299 00:15:49,040 --> 00:15:51,520 Speaker 3: with election season. Definitely one that we have to keep 300 00:15:51,560 --> 00:15:53,920 Speaker 3: an eye on and one that we'll continue to keep 301 00:15:53,960 --> 00:15:54,600 Speaker 3: speaking about. 302 00:15:54,640 --> 00:15:55,840 Speaker 4: So thank you for taking us through that, 303 00:15:55,840 --> 00:15:56,560 Speaker 1: Billy. Thank you. 304 00:15:56,800 --> 00:15:59,120 Speaker 3: We'll be back again tomorrow morning with a deep dive 305 00:15:59,240 --> 00:16:02,000 Speaker 3: as usual, but until then, enjoy your weekend. 306 00:16:06,120 --> 00:16:08,480 Speaker 1: My name is Lily Madden and I'm a proud Arrernte 307 00:16:08,680 --> 00:16:11,240 Speaker 1: Bundjalung Kalkadoon woman from Gadigal Country. 308 00:16:12,040 --> 00:16:15,200 Speaker 2: The Daily OS acknowledges that this podcast is recorded on 309 00:16:15,240 --> 00:16:17,720 Speaker 2: the lands of the Gadigal people and pays respect to 310 00:16:17,800 --> 00:16:21,120 Speaker 2: all Aboriginal and Torres Strait Islander nations. We pay 311 00:16:21,160 --> 00:16:24,080 Speaker 2: our respects to the first peoples of these countries, both 312 00:16:24,120 --> 00:16:25,040 Speaker 2: past and present.