1 00:00:00,520 --> 00:00:04,040 Speaker 1: Alright, and this is The Daily... This is The Daily 2 00:00:04,120 --> 00:00:06,840 Speaker 1: Aus. Oh, now it makes sense. 3 00:00:14,640 --> 00:00:17,320 Speaker 2: Good morning, and welcome to The Daily Aus. It's Monday, 4 00:00:17,400 --> 00:00:20,200 Speaker 2: the tenth of February. I'm Emma. And I'm Atul. 5 00:00:20,680 --> 00:00:25,480 Speaker 2: Artificial intelligence, or AI, has become increasingly ingrained in our 6 00:00:25,520 --> 00:00:28,920 Speaker 2: lives in recent years, and we've seen this technology used 7 00:00:28,920 --> 00:00:32,320 Speaker 2: for both good and for bad, from life-changing medical 8 00:00:32,360 --> 00:00:36,160 Speaker 2: discoveries to the rise of explicit deepfakes. But during 9 00:00:36,280 --> 00:00:39,040 Speaker 2: the recent AI boom, we've heard a lot of conversations 10 00:00:39,120 --> 00:00:44,120 Speaker 2: around ethics and safety. As those conversations have become more intense, 11 00:00:44,360 --> 00:00:48,839 Speaker 2: there is a lesser discussed concern that is now gaining attention, 12 00:00:48,960 --> 00:00:52,440 Speaker 2: and that's all about the environmental impact of AI. 13 00:00:53,040 --> 00:00:57,720 Speaker 1: AI requires immense quantities of resources, and this includes electricity, water, 14 00:00:57,840 --> 00:01:00,800 Speaker 1: and finite minerals. Now, with a growing amount of AI, 15 00:01:00,840 --> 00:01:05,120 Speaker 1: its carbon footprint is only expected to rise in coming decades. Today, 16 00:01:05,160 --> 00:01:08,280 Speaker 1: we're going to explore the early red flags environmental experts 17 00:01:08,280 --> 00:01:11,960 Speaker 1: are raising and what they mean for the sustainability of AI.
18 00:01:15,560 --> 00:01:18,520 Speaker 2: Atul, you have gone deep on this topic for us 19 00:01:18,560 --> 00:01:21,080 Speaker 2: because I think we've seen a lot of headlines floating 20 00:01:21,120 --> 00:01:24,479 Speaker 2: around lately about AI not being great for the environment. 21 00:01:24,920 --> 00:01:28,640 Speaker 2: But it's a confusing space to start with. 22 00:01:28,720 --> 00:01:32,200 Speaker 2: Before we get into the specifics of this, it does 23 00:01:32,200 --> 00:01:34,640 Speaker 2: feel like a really new advancement. But I was surprised 24 00:01:34,640 --> 00:01:36,880 Speaker 2: when you told me AI is a term that was 25 00:01:36,920 --> 00:01:41,759 Speaker 2: first coined in nineteen fifty five by an American computer scientist. 26 00:01:41,880 --> 00:01:45,440 Speaker 2: His name was John McCarthy. But these days, when we're 27 00:01:45,480 --> 00:01:48,880 Speaker 2: talking about AI, I think most people are probably thinking 28 00:01:49,000 --> 00:01:52,800 Speaker 2: of what's called generative AI. So that's like OpenAI's 29 00:01:52,880 --> 00:01:56,720 Speaker 2: ChatGPT, right, so those chatbot-based systems which are 30 00:01:56,720 --> 00:02:00,520 Speaker 2: built on large language models, and that means they use 31 00:02:00,640 --> 00:02:04,040 Speaker 2: these vast databases of online text and images to generate 32 00:02:04,360 --> 00:02:09,919 Speaker 2: new content. Now that ChatGPT has become so incredibly popular, though, 33 00:02:10,080 --> 00:02:13,800 Speaker 2: people are turning to this discussion about its demands. So, Atul, 34 00:02:13,840 --> 00:02:17,840 Speaker 2: what does it actually take to power generative AI like 35 00:02:17,960 --> 00:02:18,600 Speaker 2: ChatGPT? 36 00:02:19,080 --> 00:02:22,520 Speaker 1: Yeah, so popular might just be an understatement.
So since 37 00:02:22,600 --> 00:02:26,240 Speaker 1: launching in November twenty twenty two, ChatGPT has amassed 38 00:02:26,280 --> 00:02:31,040 Speaker 1: around three hundred million weekly active users worldwide. Wow, I know. 39 00:02:31,520 --> 00:02:35,079 Speaker 1: So every time someone asks ChatGPT to complete a task, 40 00:02:35,160 --> 00:02:39,160 Speaker 1: it uses around two point nine watt-hours of energy. Now, 41 00:02:39,280 --> 00:02:42,760 Speaker 1: that's according to the International Energy Agency. To really put 42 00:02:42,800 --> 00:02:45,799 Speaker 1: this into perspective, a recent study found that ChatGPT 43 00:02:46,000 --> 00:02:50,399 Speaker 1: consumes enough power annually to charge over three million electric 44 00:02:50,400 --> 00:02:53,400 Speaker 1: cars or about fifty million iPhones. 45 00:02:53,680 --> 00:02:56,000 Speaker 2: Wow, I got to be honest with you, I didn't 46 00:02:56,040 --> 00:02:59,560 Speaker 2: know what a watt-hour of energy really meant. So that 47 00:03:00,000 --> 00:03:01,760 Speaker 2: definitely helps put it into perspective. 48 00:03:01,960 --> 00:03:05,959 Speaker 1: Yeah. Absolutely. Now, ChatGPT has dominated the AI space 49 00:03:06,000 --> 00:03:08,680 Speaker 1: for the past three years that it's been running. However, 50 00:03:08,800 --> 00:03:12,080 Speaker 1: this year we witnessed a new generative AI called 51 00:03:12,120 --> 00:03:16,760 Speaker 1: DeepSeek disrupt global financial markets, and you probably saw headlines left, 52 00:03:16,800 --> 00:03:19,959 Speaker 1: right and center earlier this year. Now, the Chinese model 53 00:03:20,000 --> 00:03:23,880 Speaker 1: introduced itself as a cheaper, more energy efficient alternative to 54 00:03:23,960 --> 00:03:28,440 Speaker 1: its American competitors.
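The watt-hour figures above can be turned into a rough back-of-envelope check. This is a minimal sketch: the 2.9 watt-hours per request and 300 million weekly users are the numbers cited in the episode, while the queries-per-user rate and the ~60 kWh car-battery capacity are illustrative assumptions, not reported figures.

```python
# Back-of-envelope check of ChatGPT's annual energy use.
# 2.9 Wh/query (IEA) and 300M weekly users are from the episode;
# the queries-per-user rate and battery size are assumptions.

WH_PER_QUERY = 2.9              # watt-hours per request (IEA estimate)
WEEKLY_USERS = 300_000_000      # weekly active users cited in the episode
QUERIES_PER_USER_WEEK = 10      # assumed average, for illustration only
EV_BATTERY_WH = 60_000          # assumed ~60 kWh electric-car battery

annual_wh = WH_PER_QUERY * WEEKLY_USERS * QUERIES_PER_USER_WEEK * 52
ev_charges = annual_wh / EV_BATTERY_WH

print(f"Annual energy under these assumptions: {annual_wh / 1e9:.0f} GWh")
print(f"Equivalent full EV charges: {ev_charges / 1e6:.1f} million")
```

Even with a conservative queries-per-user guess, the result lands comfortably above the "three million electric cars" mark mentioned in the episode, which is the point of the comparison.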
Just like OpenAI's ChatGPT, 55 00:03:28,440 --> 00:03:32,400 Speaker 1: DeepSeek can summarize text, answer questions, and generate writing based 56 00:03:32,440 --> 00:03:35,960 Speaker 1: on prompts. Now, what's really fascinating about DeepSeek is 57 00:03:36,000 --> 00:03:38,880 Speaker 1: that it performs just as well as the leading American 58 00:03:38,960 --> 00:03:42,200 Speaker 1: AI systems, but for only a fraction of the cost, 59 00:03:42,600 --> 00:03:45,360 Speaker 1: and it claims it can use up to forty times 60 00:03:45,480 --> 00:03:46,320 Speaker 1: less energy. 61 00:03:46,960 --> 00:03:49,280 Speaker 2: Yeah, this is a big claim from DeepSeek, and 62 00:03:49,320 --> 00:03:51,880 Speaker 2: I think we're all kind of waiting to hear a 63 00:03:51,880 --> 00:03:55,000 Speaker 2: bit more about how it plans on achieving that or 64 00:03:55,080 --> 00:03:57,360 Speaker 2: if it will really be able to offer the same 65 00:03:57,520 --> 00:04:01,400 Speaker 2: service as ChatGPT. But when we think about 66 00:04:01,400 --> 00:04:05,000 Speaker 2: the environmental impact of generative AI, it's not just about 67 00:04:05,000 --> 00:04:09,880 Speaker 2: this massive strain on energy grids. There are other strains 68 00:04:09,880 --> 00:04:11,240 Speaker 2: on resources, right? 69 00:04:11,520 --> 00:04:15,080 Speaker 1: Exactly right. The resources needed to run, support and train 70 00:04:15,200 --> 00:04:19,640 Speaker 1: generative AI are housed in these facilities called data centers. Now, 71 00:04:19,680 --> 00:04:23,120 Speaker 1: while the exact figures are still being debated, it's estimated 72 00:04:23,160 --> 00:04:26,080 Speaker 1: that AI data centers account for up to one point 73 00:04:26,200 --> 00:04:30,800 Speaker 1: five percent of global electricity usage.
Now that might seem small, 74 00:04:31,080 --> 00:04:33,640 Speaker 1: but it means that a single data center could consume 75 00:04:33,800 --> 00:04:38,159 Speaker 1: enough energy to heat fifty thousand homes for a whole year. Okay, 76 00:04:38,400 --> 00:04:41,440 Speaker 1: these facilities are expected to use as much power annually 77 00:04:41,480 --> 00:04:45,240 Speaker 1: as countries like Japan and Russia by next year. That's 78 00:04:45,279 --> 00:04:49,680 Speaker 1: according to a recent study from MIT. So this raises 79 00:04:49,720 --> 00:04:53,400 Speaker 1: concerns amongst experts who fear it could potentially put additional 80 00:04:53,440 --> 00:04:57,880 Speaker 1: strain on global electricity grids, including in countries like 81 00:04:57,960 --> 00:05:01,400 Speaker 1: Australia, where we are regularly experiencing blackouts. 82 00:05:01,480 --> 00:05:04,680 Speaker 2: Yeah, it's hard to imagine, I guess, in places like 83 00:05:04,720 --> 00:05:08,320 Speaker 2: Australia, where there is that strain on grids and uncertainty 84 00:05:08,640 --> 00:05:11,520 Speaker 2: about, you know, what the long term sustainability of those 85 00:05:11,520 --> 00:05:16,080 Speaker 2: grids looks like, then this huge extra strain, it sounds 86 00:05:16,120 --> 00:05:19,240 Speaker 2: like a lot of work. So, Atul, you've got these 87 00:05:19,279 --> 00:05:22,200 Speaker 2: massive facilities that you've talked us through that store this 88 00:05:22,360 --> 00:05:26,039 Speaker 2: large infrastructure and machinery. So we've learned that, you know, 89 00:05:26,160 --> 00:05:30,320 Speaker 2: AI puts a strain on electricity demands because of the 90 00:05:30,360 --> 00:05:33,200 Speaker 2: actual process of asking it questions. Then there's also the 91 00:05:33,240 --> 00:05:37,560 Speaker 2: physical places where the computing lives. I can only imagine 92 00:05:37,560 --> 00:05:40,960 Speaker 2: how hot those rooms could get.
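To get a feel for what "up to one point five percent of global electricity" means against the Japan comparison, here is a small illustrative calculation. The 1.5 percent share is from the episode; the global generation total and Japan's annual consumption are rounded assumptions, not figures the hosts cite.

```python
# Rough scale of the "1.5% of global electricity" figure.
# The share is from the episode; the global total and Japan's
# consumption are approximate assumptions for illustration.

GLOBAL_ELECTRICITY_TWH = 30_000   # assumed annual global generation, TWh
DATA_CENTRE_SHARE = 0.015         # up-to-1.5% share cited in the episode
JAPAN_TWH = 900                   # assumed annual consumption of Japan, TWh

data_centre_twh = GLOBAL_ELECTRICITY_TWH * DATA_CENTRE_SHARE
print(f"Implied data-centre use today: {data_centre_twh:.0f} TWh/year")
print(f"Growth needed to match Japan: {JAPAN_TWH / data_centre_twh:.1f}x")
```

Under these assumptions, data centres would need to roughly double their current draw to match Japan's national consumption, which is the trajectory the MIT projection describes.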
I mean, think about 93 00:05:41,040 --> 00:05:43,600 Speaker 2: like when you have a laptop, a small laptop on 94 00:05:43,640 --> 00:05:46,760 Speaker 2: your lap, the fan is going into overdrive. It gets 95 00:05:46,800 --> 00:05:49,400 Speaker 2: really hot on your lap. That's like a small laptop 96 00:05:49,440 --> 00:05:51,560 Speaker 2: that generates that kind of heat. I mean, if 97 00:05:51,600 --> 00:05:55,040 Speaker 2: you've got this much machinery in these facilities, I'm sure 98 00:05:55,080 --> 00:05:56,040 Speaker 2: it must get really warm. 99 00:05:56,240 --> 00:05:59,960 Speaker 1: Oh, exactly. I mean, AI centers need advanced cooling systems 100 00:06:00,080 --> 00:06:04,400 Speaker 1: to basically keep the technology from overheating. Traditionally, data centers 101 00:06:04,440 --> 00:06:07,600 Speaker 1: rely on air cooling to manage rising temperatures caused by 102 00:06:07,640 --> 00:06:11,440 Speaker 1: the heat emitted from the hardware. However, this isn't quite 103 00:06:11,440 --> 00:06:15,159 Speaker 1: sufficient for AI technology. So what that means is modern 104 00:06:15,200 --> 00:06:18,400 Speaker 1: data centers are using liquid cooling systems that rely on 105 00:06:18,560 --> 00:06:22,039 Speaker 1: water to keep temperatures within the ideal range of twenty-one 106 00:06:22,120 --> 00:06:23,640 Speaker 1: to twenty-four degrees. 107 00:06:24,160 --> 00:06:26,920 Speaker 2: Okay, I'm trying to wrap my head around this concept, 108 00:06:27,200 --> 00:06:30,240 Speaker 2: because the idea of water and computers pretty much goes 109 00:06:30,240 --> 00:06:32,120 Speaker 2: against everything we've ever been told.
110 00:06:31,880 --> 00:06:35,120 Speaker 1: Right, exactly. But I wouldn't recommend throwing a glass of 111 00:06:35,160 --> 00:06:39,520 Speaker 1: water at your laptop right now. Okay, noted. So researchers 112 00:06:39,520 --> 00:06:42,479 Speaker 1: in the US predict that by twenty twenty seven, up 113 00:06:42,560 --> 00:06:45,960 Speaker 1: to six point six billion cubic meters of water will 114 00:06:46,000 --> 00:06:50,080 Speaker 1: be needed annually to meet global AI demands. That's equivalent 115 00:06:50,120 --> 00:06:54,360 Speaker 1: to half of the UK's yearly water consumption. Okay, a 116 00:06:54,440 --> 00:06:58,719 Speaker 1: lot of water. A lot, exactly. These statistics raise concerns 117 00:06:58,760 --> 00:07:01,839 Speaker 1: for climate experts, who basically say that in a country like 118 00:07:01,920 --> 00:07:06,520 Speaker 1: Australia, which experiences droughts regularly, this would be detrimental. 119 00:07:06,920 --> 00:07:09,720 Speaker 2: With droughts, I mean, if we look at the climate science, 120 00:07:09,800 --> 00:07:13,360 Speaker 2: droughts are expected to intensify, those kind of long 121 00:07:13,400 --> 00:07:17,040 Speaker 2: periods without rain, and this water usage, I can imagine, 122 00:07:17,040 --> 00:07:20,640 Speaker 2: for those climate experts kind of sounds an alarm. Exactly. 123 00:07:21,320 --> 00:07:25,720 Speaker 1: Now, water consumption isn't the only environmental issue linked to 124 00:07:25,880 --> 00:07:29,960 Speaker 1: data centers. The greenhouse gas emissions released from these facilities 125 00:07:30,000 --> 00:07:33,360 Speaker 1: are also sounding the alarm in the fight against global warming. 126 00:07:33,960 --> 00:07:38,440 Speaker 1: Exact figures on AI's contribution to global emissions still remain unclear.
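The water comparison above can also be checked with simple arithmetic. Both input numbers (6.6 billion cubic metres, and the "half of the UK" ratio) come from the episode; the code just derives what that implies about UK consumption and restates the AI figure per day.

```python
# Sanity check of the water claim: 6.6 billion m^3/year of
# projected AI demand is said to be half the UK's annual use.

AI_WATER_M3 = 6.6e9               # projected annual AI water demand by 2027
implied_uk_m3 = AI_WATER_M3 / 0.5 # "half of the UK's yearly consumption"

litres_per_day = AI_WATER_M3 * 1_000 / 365  # 1 m^3 = 1,000 litres

print(f"Implied UK annual consumption: {implied_uk_m3 / 1e9:.1f} billion m^3")
print(f"AI demand per day: {litres_per_day / 1e9:.1f} billion litres")
```

Restated per day, the projection works out to roughly eighteen billion litres, which helps explain why drought-prone countries see it as an alarm bell.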
127 00:07:38,720 --> 00:07:42,720 Speaker 1: The International Energy Agency estimates that data centers account for 128 00:07:42,880 --> 00:07:47,240 Speaker 1: zero point six percent of annual emissions, while Science and 129 00:07:47,280 --> 00:07:50,360 Speaker 1: Technology Australia says this figure has already hit one percent. 130 00:07:50,720 --> 00:07:54,160 Speaker 2: So some kind of differences in the scale there, but 131 00:07:54,320 --> 00:07:57,880 Speaker 2: anywhere between kind of half to one percent of emissions. 132 00:07:57,920 --> 00:07:59,120 Speaker 2: Exactly. 133 00:07:59,280 --> 00:08:03,040 Speaker 1: Now, recent reports warn that if AI adoption continues at its 134 00:08:03,120 --> 00:08:06,280 Speaker 1: current pace, data centers could account for 135 00:08:06,400 --> 00:08:10,640 Speaker 1: fourteen percent of yearly emissions by twenty forty. Wow. 136 00:08:10,800 --> 00:08:14,560 Speaker 2: Predictions like that, I mean, might just become a reality 137 00:08:14,600 --> 00:08:18,640 Speaker 2: if we look to the reports of rising emissions among 138 00:08:18,760 --> 00:08:21,920 Speaker 2: tech giants using AI. They have been transparent about this, 139 00:08:22,000 --> 00:08:24,560 Speaker 2: and I guess that's kind of why we've taken note 140 00:08:24,600 --> 00:08:27,440 Speaker 2: and why we're talking about it today. What do we 141 00:08:27,560 --> 00:08:30,120 Speaker 2: know from those tech giants about their emissions? 142 00:08:30,320 --> 00:08:33,040 Speaker 1: Yeah. So, in the case of Microsoft, in its latest 143 00:08:33,040 --> 00:08:37,240 Speaker 1: Sustainability Report, it attributed a thirty percent increase in its 144 00:08:37,280 --> 00:08:40,800 Speaker 1: carbon emissions since twenty twenty to the AI models and 145 00:08:40,920 --> 00:08:42,160 Speaker 1: services that it provides.
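The jump from roughly one percent of emissions today to fourteen percent by twenty forty sounds abstract, but it implies a specific compound growth rate for data centres' share. The two percentages are from the episode; the twenty twenty-five starting year is an assumption for illustration.

```python
# Implied compound annual growth of the data-centre emissions share,
# going from ~1% today (Science and Technology Australia's figure)
# to the 14% projection for 2040 cited in the episode.

current_share = 0.01    # ~1% of annual emissions today
share_2040 = 0.14       # 14% projection for 2040
years = 2040 - 2025     # assumed starting point of the projection

annual_growth = (share_2040 / current_share) ** (1 / years) - 1
print(f"Implied growth of the emissions share: {annual_growth:.1%} per year")
```

That works out to the share growing by nearly a fifth every year for fifteen years, which is why experts treat the projection as a red flag rather than a certainty.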
146 00:08:42,280 --> 00:08:45,360 Speaker 2: Wow, so Microsoft is saying that its emissions, like the 147 00:08:45,400 --> 00:08:50,160 Speaker 2: whole of Microsoft, increased by thirty percent because of what 148 00:08:50,280 --> 00:08:51,360 Speaker 2: it takes to run AI. 149 00:08:51,720 --> 00:08:54,920 Speaker 1: Exactly that. Now, despite this, we know that tech giants 150 00:08:54,920 --> 00:08:58,280 Speaker 1: have no plans on scaling back their AI programs. So 151 00:08:58,520 --> 00:09:02,920 Speaker 1: basically this means greater demand for infrastructure and resources, which 152 00:09:03,080 --> 00:09:06,599 Speaker 1: brings us to mining. So lithium is one of the 153 00:09:06,679 --> 00:09:10,120 Speaker 1: key materials used to produce the rechargeable batteries that power 154 00:09:10,160 --> 00:09:13,720 Speaker 1: AI technology, and Australia is one of the largest producers 155 00:09:13,760 --> 00:09:14,439 Speaker 1: of the mineral. 156 00:09:14,640 --> 00:09:17,360 Speaker 2: We have heard a lot about lithium in recent years. 157 00:09:17,400 --> 00:09:21,400 Speaker 2: Of course, it's something that goes in phones, computers, batteries 158 00:09:21,400 --> 00:09:24,840 Speaker 2: of all kinds. And you're right, Australia is the world's 159 00:09:24,880 --> 00:09:29,480 Speaker 2: largest lithium producer. But we might not be forever. 160 00:09:30,280 --> 00:09:33,840 Speaker 1: No, not exactly. So a twenty twenty study from a 161 00:09:33,880 --> 00:09:38,199 Speaker 1: German university found that global lithium deposits could be depleted 162 00:09:38,360 --> 00:09:42,120 Speaker 1: sometime within the next seventy five years. So with the 163 00:09:42,200 --> 00:09:45,680 Speaker 1: rapid uptake of AI services, some experts predict that lithium 164 00:09:45,760 --> 00:09:50,559 Speaker 1: shortages could occur as soon as twenty forty.
Additionally, 165 00:09:50,640 --> 00:09:53,960 Speaker 1: the amount of e-waste generated through the frequent maintenance of 166 00:09:54,120 --> 00:09:57,800 Speaker 1: AI equipment poses a significant challenge to Australia's waste and 167 00:09:57,840 --> 00:09:59,040 Speaker 1: recycling systems. 168 00:09:59,320 --> 00:10:03,680 Speaker 2: Wow, there are so many aspects to the sustainability concerns here. 169 00:10:03,840 --> 00:10:07,520 Speaker 2: We've heard about the power it takes to ask ChatGPT 170 00:10:07,559 --> 00:10:11,560 Speaker 2: a question, the electricity and water it takes for 171 00:10:11,760 --> 00:10:14,360 Speaker 2: these computers to be stored, and then we've also got 172 00:10:14,400 --> 00:10:18,680 Speaker 2: this e-waste and the mining of lithium to think about. 173 00:10:19,200 --> 00:10:22,600 Speaker 2: You're right that AI isn't going anywhere. We are only 174 00:10:22,640 --> 00:10:26,640 Speaker 2: hearing more and more about its advancements and how we 175 00:10:26,720 --> 00:10:30,160 Speaker 2: can live alongside AI or integrate it into our life, 176 00:10:30,240 --> 00:10:33,880 Speaker 2: into our work. But what are the experts saying about 177 00:10:33,920 --> 00:10:37,120 Speaker 2: how we ensure that this technology doesn't set us back 178 00:10:37,280 --> 00:10:41,920 Speaker 2: environmentally even while it might kind of bring us forward technologically? 179 00:10:42,840 --> 00:10:45,880 Speaker 1: Well, that's exactly what global leaders are trying to figure 180 00:10:45,880 --> 00:10:49,640 Speaker 1: out with the help of these environmental and technological experts. 181 00:10:49,960 --> 00:10:53,040 Speaker 1: So last year in Australia, the Federal Senate launched an 182 00:10:53,040 --> 00:10:57,480 Speaker 1: inquiry into AI to explore both its opportunities and its impacts.
183 00:10:57,800 --> 00:11:01,520 Speaker 1: So the inquiry held six public hearings and received submissions 184 00:11:01,559 --> 00:11:06,160 Speaker 1: from two hundred and forty five experts, academics, business leaders 185 00:11:06,200 --> 00:11:07,840 Speaker 1: and members of the public. 186 00:11:08,240 --> 00:11:11,160 Speaker 2: That's a lot of submissions. Were there any kind of 187 00:11:11,160 --> 00:11:13,040 Speaker 2: common themes that came up? 188 00:11:13,520 --> 00:11:17,240 Speaker 1: Yeah. So the environmental impact of AI was a major 189 00:11:17,280 --> 00:11:22,880 Speaker 1: concern basically across all submissions, and in its submission, the UNSW AI 190 00:11:22,960 --> 00:11:26,319 Speaker 1: Institute noted that the impacts of AI are currently difficult 191 00:11:26,320 --> 00:11:29,720 Speaker 1: to quantify due to few standards for their reporting. 192 00:11:29,960 --> 00:11:31,840 Speaker 2: Okay, so I think that's kind of been reflected in 193 00:11:31,880 --> 00:11:33,520 Speaker 2: some of the numbers we've talked about today. 194 00:11:33,600 --> 00:11:34,040 Speaker 1: Exactly. 195 00:11:34,080 --> 00:11:36,480 Speaker 2: There's a bit of a range in scope. There's not kind 196 00:11:36,520 --> 00:11:40,280 Speaker 2: of a definitive regulatory body that says this is what 197 00:11:40,320 --> 00:11:42,360 Speaker 2: AI is doing and these are the emissions that it's 198 00:11:42,400 --> 00:11:44,360 Speaker 2: contributing and that kind of thing. So I guess that 199 00:11:44,480 --> 00:11:46,600 Speaker 2: makes it difficult to get a real sense of what's 200 00:11:46,640 --> 00:11:47,600 Speaker 2: going on. Right.
201 00:11:47,920 --> 00:11:51,120 Speaker 1: So, Science and Technology Australia, which is the peak body 202 00:11:51,160 --> 00:11:54,680 Speaker 1: for Australia's science and technology sector, called on governments to 203 00:11:54,760 --> 00:11:59,120 Speaker 1: ensure that renewable energy policy and net zero investments play 204 00:11:59,240 --> 00:12:03,040 Speaker 1: a key role in developing digital infrastructure to support AI 205 00:12:03,120 --> 00:12:07,679 Speaker 1: use in Australia. Now, what's interesting is that recent innovations 206 00:12:07,679 --> 00:12:12,239 Speaker 1: show that AI can actually help us tackle environmental challenges. 207 00:12:12,320 --> 00:12:13,320 Speaker 2: Interesting. I know. 208 00:12:13,600 --> 00:12:17,720 Speaker 1: So the Federal Department of Industry, Science and Resources showed 209 00:12:17,760 --> 00:12:20,480 Speaker 1: that AI could help address some of the world's most 210 00:12:20,520 --> 00:12:24,320 Speaker 1: pressing climate change issues. The Australian Human Rights Commission said 211 00:12:24,360 --> 00:12:27,800 Speaker 1: that AI has the potential to positively impact the environment 212 00:12:27,960 --> 00:12:33,080 Speaker 1: in several ways, including by improving energy efficiency and enhancing 213 00:12:33,160 --> 00:12:34,560 Speaker 1: sustainable practices. 214 00:12:34,720 --> 00:12:37,640 Speaker 2: One of the interesting ones that the Federal government flagged, 215 00:12:37,640 --> 00:12:41,400 Speaker 2: I remember, is how AI could be used for firefighting technologies. 216 00:12:41,520 --> 00:12:44,720 Speaker 2: They developed an AI technology that could detect small 217 00:12:44,800 --> 00:12:48,280 Speaker 2: fires and predict fire behavior patterns, which is, of course, 218 00:12:48,320 --> 00:12:51,520 Speaker 2: you know, beneficial to the environment on the other side 219 00:12:51,559 --> 00:12:53,079 Speaker 2: of this coin. Exactly.
220 00:12:53,679 --> 00:12:56,840 Speaker 1: But despite that, the Australian Human Rights Commission did warn 221 00:12:56,960 --> 00:13:01,600 Speaker 1: AI poses a significant risk. Now, ultimately, knowledge is power, 222 00:13:01,960 --> 00:13:06,400 Speaker 1: so by increasing transparency around the potential environmental impacts of AI, 223 00:13:06,800 --> 00:13:09,680 Speaker 1: the risk may be mitigated. 224 00:13:09,840 --> 00:13:12,160 Speaker 2: Atul, thank you so much for breaking that down for us. 225 00:13:12,240 --> 00:13:16,240 Speaker 2: A very, very big, complicated story, but you've made sense 226 00:13:16,280 --> 00:13:19,040 Speaker 2: of it for us, so we thank you for joining 227 00:13:19,080 --> 00:13:22,160 Speaker 2: us on the podcast today. And thank you for listening. 228 00:13:22,400 --> 00:13:24,760 Speaker 2: If you liked today's episode, if you learned something, feel 229 00:13:24,760 --> 00:13:27,280 Speaker 2: free to pass it on to a friend. Don't forget 230 00:13:27,360 --> 00:13:29,920 Speaker 2: to follow or subscribe wherever you listen to The Daily 231 00:13:29,960 --> 00:13:32,480 Speaker 2: Aus, or if you're watching us over on our YouTube. 232 00:13:32,679 --> 00:13:34,560 Speaker 2: We will be back a little bit later today with 233 00:13:34,600 --> 00:13:37,320 Speaker 2: the evening headlines. Until then, have a great day. 234 00:13:41,720 --> 00:13:44,000 Speaker 1: My name is Lily Madden and I'm a proud Arrernte, 235 00:13:44,240 --> 00:13:48,320 Speaker 1: Bundjalung, Kalkadoon woman from Gadigal Country. The Daily Aus 236 00:13:48,440 --> 00:13:51,240 Speaker 1: acknowledges that this podcast is recorded on the lands of 237 00:13:51,240 --> 00:13:54,560 Speaker 1: the Gadigal people and pays respect to all Aboriginal and 238 00:13:54,600 --> 00:13:57,640 Speaker 1: Torres Strait Islander nations. We pay our respects to 239 00:13:57,720 --> 00:14:00,600 Speaker 1: the first peoples of these countries, both past and present.