Speaker 1: From The Australian, here's what's on The Front. I'm Claire Harvey. It's Thursday, April 2, 2026. We're all in this together. Sound familiar? Anthony Albanese made an address to the nation asking everyone to play their part in responding to the economic crisis created by war in the Middle East.

Speaker 2: These are uncertain times, but I'm absolutely certain of this: we will deal with these global challenges the Australian way, working together and looking after each other, as we always have.

Speaker 1: All our analysis of that is live right now at theaustralian.com.au. Big AI wants Australia's creativity for free. The artificial intelligence giant Anthropic, which makes the chatbot Claude, has signed an agreement with the Albanese government about AI safety and about potentially building giant data centers here. But there's a catch: Anthropic and other AI companies want to be allowed to hoover up all of Australia's creative works for free. That's books, music, journalism, architecture, dance, Indigenous cultural knowledge, art. It's even triggered a fight inside the government. Today: what's fair when it comes to AI?

Speaker 1: James Madden is The Australian's media editor, and James, you're here today to talk about an issue that seems quite complicated but I think is actually very simple. Dario Amodei, who's one of the world's most powerful AI billionaires, is here in Australia lobbying federal politicians.

Speaker 3: What does he want?

Speaker 4: What he wants, Claire, is basically access to the creative content owned by Australian artists, musicians, journalists, authors, screenwriters and news media outlets. And the reason he wants that content is so he can fuel his AI engines, which is obviously at the core of Anthropic's business model.

Speaker 1: Yeah. So around the world, AI companies are training their models to understand society so that they can give you and me good answers when we ask them questions. I heard one of the senior executives of Universal Music describing this the other day.
Speaker 1: A lot of creative companies, like music labels, are looking at making deals with AI companies so that the AI companies can use the copyright of their artists. He used the example of Billie Eilish, who's signed to one of Universal's labels, and he said the potential is amazing. Let's say it's a cloudy, rainy afternoon and you're going to go for a half-hour run. You're a big Billie Eilish fan, but there's nothing really in her catalog that quite suits the mood or the time that you want, and you want a pretty neat playlist that you don't have to do much to. You could interact with an AI that has come to a deal with Billie Eilish and is paying her to create a playlist that's curated just for you. It's Billie Eilish music. She hasn't actually created it, but she has licensed the AI to do it for her. You get this perfect half-hour moody soundtrack. So that's a really positive use of AI, right? And that's the way that deals could be made, and are being made already. I don't think Billie Eilish has signed one herself, and so I thought I'd test that a little bit. What's the flip side of that? And are they already doing it anyway, without Billie Eilish's permission? So I asked ChatGPT and Google's Gemini to tell me what the big themes are in Billie Eilish's music, and I've asked our producer, Kristen Amiet, to read out what they said. What was the answer from ChatGPT, Chrissy?

Speaker 5: So ChatGPT came back and said that Billie Eilish is known for weaving a handful of distinct emotional and conceptual themes through her music. They include mental health and inner struggles; darkness, fear and the macabre; love, heartbreak and toxic relationships; identity and self-image; and fame and its downsides. There are actually seven quite distinct themes given here by ChatGPT.

Speaker 1: It gave me a little emoji for each of those, too.

Speaker 5: It did, just in case you were unsure as to what the flame emoji should be alongside.

Speaker 1: And that took about three or four seconds.
So then I thought, okay, what if I'm a lipstick designer, but I want to use Billie Eilish-style themes for my new lipstick collection? So I asked ChatGPT to give me some Billie Eilish-inspired names for the six lipsticks in my collection, and here's what it gave me. The one that jumped out at me was called Ocean Eyes Glow, because Ocean Eyes is one of Billie Eilish's best-known songs. It is arguably the song that launched her onto the global stage.

Speaker 5: There's also Quiet Chaos, which ChatGPT suggests captures her inner-conflict themes, ideal for a cool mauve or a gray-toned pink. There's Faded Dream, which is a soft, melancholic tone, great for a muted nude or dusty rose.

Speaker 1: So there we go. I know that ChatGPT is not paying Billie Eilish anything. It hasn't come to a licensing agreement with her, and yet it is able to give me a business plan for my fictional lipstick empire. That, for me, is an example of how badly this can go wrong, and of how they're already using creative works to power their engines. What's the risk for Australian artists?

Speaker 4: Well, the risk for Australian artists is basically that they will lose ownership of what is theirs. That is their content. They own it. It's their lifeblood. So for them to have it stolen from under their noses, should the AI companies' wish come to pass, would have potentially catastrophic consequences for the news industry, the music industry, the screenwriting industry, for individuals and for news companies as a whole.

Speaker 1: So they're clearly already doing it. And Dario Amodei, the CEO of Anthropic, is here ostensibly to talk about AI safety, about helping Australia develop a safe protocol, something that Anthropic talks a lot about. He's also here to investigate the potential for giant data centers to be built here. Australia already has more than one hundred data centers.
They consume about as much electricity as one hundred thousand homes, according to the International Energy Agency. So they're big investments that the AI companies would say are helping the economy grow. But there's a caveat, right? What do the AI companies want in return for bringing their investment here?

Speaker 4: Well, what they want is an easing of Australia's copyright laws. As it stands, the Australian government has held firm on protecting our copyright laws, and the reason the AI companies want those laws eased is so they can effectively mine creative content to train their machines.

Speaker 1: Here's Dario Amodei on stage in Canberra on Wednesday.

Speaker 3: I do think rights holders have legitimate claims here, and, you know, we're kind of more here to talk about how can we arrive at an arrangement that works for everyone and, you know, leaves everyone better off. I would generalize to the point of: if we're able to generate economic growth with AI, it's a standard economic growth thing, making the pie bigger, and then there's more that we can get to everyone.

Speaker 4: It's a bit of a charm offensive that Dario has been undertaking. He's in Canberra to meet with the Treasurer and potentially the Prime Minister as well. He's signed a memorandum of understanding with the government. It doesn't mean much in and of itself at this point, but what it does do, potentially, is give Anthropic a foot in the door of Australia. At the moment they're saying, we'd like to engage in health research, assist with the agricultural industry, financial services. This is a mask. What they want is the content, the creative content of Australian artists, because that is their business model. That is what feeds and fattens their AI training models, and that is how they make their profits.

Speaker 1: So companies, including News Corp Australia, which publishes The Australian,
are strongly opposed to this, and so are the bodies representing musicians, writers, screenwriters and architects. They say there's nothing wrong with the copyright laws now. Let's say, James, you had written a book about the Man from Snowy River, and I loved your book so much that I wanted to create a range of jackets inspired by it. Presently, under the copyright law, I would have to come to a licensing agreement with you. Then I could market it as the James Madden Man from Snowy River Jacket Collection. It's a good idea. Maybe we should actually do this. Yeah, the AI companies don't want to have to do that. They don't want to have to pay, or to agree with creators about what their works would be used for, so they could hoover it all up and then spit it out again, like in a lipstick collection or a jacket collection, and you would have no say whatsoever. Anthropic is worth five hundred and fifty billion dollars, and these other companies are worth billions. Why would they object to paying what is going to be a relatively small amount in copyright agreements with, say, news publishers or artists?

Speaker 4: Countries like Australia are not saying, AI companies, be gone, you have no business here. In fact, quite the opposite: countries like Australia are open for business. You'll find the music industry and the news industry and the screenwriting industry all saying the same thing: please, let's engage. Let's sit down and have good-faith negotiations. Because AI is here to stay; it's a reality, and I don't think anyone's overlooking that fact. The issue in terms of recompense is that unless licensing deals are struck and honored, it's almost impossible to claw back content and copyright that has been stolen. Once it's stolen, it's gone. It's out there, it's digested by the wider world, and there's no way of making a fair claim back.

Speaker 1: Coming up: are the AI giants already getting away with it?

Speaker 1: Chrissy,
every day you and I use a couple of AI tools as a kind of enthusiastic assistant. That's the way we've been trained to use them here at News Corp: treat them as your annoying but enthusiastic assistant. They'll give you too much, and they'll give you the wrong things, and you keep prompting them so that they can actually help you do your work. We don't ever use them to write something that we actually publish. So every day we ask Google's Gemini to think of a couple of episode titles for us. What's your take on what it gives back to us?

Speaker 5: I find, by and large, it tends to veer into cliche and flattery. More often than not, it will start by giving you a spiel about how great your script is, and it's like, well, thank you, but I already knew that. It will generate a lot of things that could be the title of an episode of any podcast, and any one of our competitors', for that matter.

Speaker 1: It looks to me like it's trained on The New York Times and The Washington Post, and it's got that style of headline writing, which is a bit pompous. You know, there's usually ten words, and there's a colon in the middle somewhere. We asked Google's Gemini to come up with a title for this episode. Do you want to read us some of the options it came back with?

Speaker 5: The Content Collapse: Is AI Digestion Killing Original Creation? Code Versus Canvas: The Existential Threat to the Creative Class.

Speaker 1: There's a colon in the middle of all these heads.

Speaker 5: There is, yes. There is no single sentence anywhere here. Whose Art Is It Anyway? AI, Copyright and the Death of the Author. And Stolen Sentences: How AI Is Stripping Newsrooms Bare. So I don't know if there's something you haven't told us there, but that one's somewhat concerning for me.

Speaker 1: That's clearly been trained on American newspapers.
Speaker 1: I don't think that's the result of a licensing deal, of the kind I would have to pay for if I decided to start using New York Times headlines in my own work. So they're already getting away with it. Is it too late?

Speaker 4: Well, I hope not, but the scales are tipping in favor of the AI companies, because this is hard to regulate. There's a lot out there on the internet already, on the so-called open web, that can be mined by these AI companies. But they want it all, and the reason they want it all is because it's best for their models. But is it too late?

Speaker 1: No, no. There's clearly a split within the government. Michelle Rowland, the Attorney-General, gave a very rousing speech last week in which she said that the government is never going to capitulate to a request to exempt these works from copyright for text and data mining, which is what the AI companies want. But there are other ministers, like Andrew Charlton, who's the Assistant Minister responsible for the digital economy, and he's clearly much more inclined to go the AI companies' way, right?

Speaker 4: Look, he's been quite vocal in championing the positives of AI data centers in Australia, for example. But I suppose where it gets murky for Mr Charlton and others is that it's one thing to promote data centers, but if you're compromising the copyright of your artists and your journalists and your musicians, then it's almost like once the genie is out of the bottle, it's gone.

Speaker 1: James Madden, thank you very much.

Speaker 4: Thanks.

Speaker 1: James Madden is The Australian's media editor. You can read all our journalism, produced by real people, right now at theaustralian.com.au.