Speaker 1: From The Australian, here's what's on The Front. I'm Claire Harvey. It's Thursday, June sixth. Anthony Albanese and Peter Dutton have put on a rare show of unity. They've slammed the Greens in Parliament, saying they're fueling tension and division over conflict in the Middle East. The PM says pro-Palestinian protesters targeting electorate offices like his in Sydney's Marrickville are undermining their cause and preventing others from participating in democracy.

Speaker 1: Economic growth has all but ground to a halt, slumping to its slowest pace in three decades outside the COVID pandemic. Treasurer Jim Chalmers said on Wednesday high interest rates are to blame for the slowdown, and he doesn't expect things to improve before the end of the year. Those stories are live right now at theaustralian.com.au.

Speaker 1: The boss of News Corp Australia, Michael Miller, wants the federal government to pass so-called social license laws that would expose social media executives to criminal penalties if they breach Australian law.
Speaker 1: Miller, whose company publishes mastheads from the Herald Sun to Vogue Australia, The Australian and The Front, says if the tech giants don't comply, they should be expelled from the Australian market.

Speaker 2: That's today's story.

Speaker 3: My name is Frances Haugen.

Speaker 4: I used to work at Facebook.

Speaker 1: In October twenty twenty-one, the worst nightmare of executives at the world's most powerful social media platform, Meta, came true. A whistleblower, Frances Haugen, went public in the most dramatic possible way. This is Haugen addressing the United States Senate.

Speaker 3: I joined Facebook because I think Facebook has the potential to bring out the best in us. But I'm here today because I believe Facebook's products harm children, stoke division and weaken our democracy. The company's leadership knows how to make Facebook and Instagram safer, but won't make the necessary changes because they have put their astronomical profits before people.

Speaker 1: Meta's founder Mark Zuckerberg said Haugen's claim that it puts profit before people was untrue.
Speaker 1: On Wednesday, at the National Press Club in Canberra, News Corp Australia boss Michael Miller took up the fight.

Speaker 5: These tech giants, especially the social media networks such as Meta, TikTok and X, operate outside our legal system.

Speaker 6: It's time for them to play by our rules.

Speaker 5: They have deep pockets, they are highly litigious, but that is no reason why they should not be subject to Australian law, and we should seek to enforce Australian law. The words social license describe the permission companies have to operate within a society. The tech monopolies should also be made to pay a license. This social license would be a package of laws and requirements the tech monopolies would need to meet if they want access to Australian consumers.

Speaker 1: Miller recommended a suite of new legislation designed to keep the tech platforms accountable for the content and ideas they amplify. They include making it easier for users to contact the platforms directly, breaking up the digital advertising monopoly, honoring the News Media Bargaining Code, and, importantly...
Speaker 5: Penalties that incorporate criminal sanctions for companies and their executives that agree to the license and then break the rules, and the power to ultimately block access to our country and our people if they refuse to play by our rules.

Speaker 1: Miller said part of the problem is that fines don't work...

Speaker 5: ...because on social media, bad behavior is good for business.

Speaker 6: The social media giants...

Speaker 5: They profit from evil videos. They profit from bullying. They profit from online con artists.

Speaker 6: They profit from glamorizing eating disorders.

Speaker 5: In the words of a British father who lost his child to suicide, they monetize misery. In Meta's case, according to a recent report in the UK's Sunday Times magazine, that profit is a staggering one hundred and thirty-six million US dollars every single day.

Speaker 6: The ACCC has ruled that for...

Speaker 5: ...media companies, Meta is an unavoidable trading partner.

Speaker 1: That means the media companies have no choice but to use the social media platforms to get their journalism to Australians. That's because all our habits have changed so much.
Speaker 1: More and more, we live in our social feeds now, instead of tuning into the six pm news or swinging past the newsagency to pick up the paper in order to find out what's going on. News organizations have to post their stories on TikTok or Facebook to have any chance of having them noticed by the audience.

Speaker 5: And in twenty twenty-one, Parliament determined that Meta should pay for the news it uses.

Speaker 1: That's the world-first News Media Bargaining Code, passed by the Australian Parliament after the ACCC found the big tech platforms were the news media industry's unavoidable trading partners. Essentially, the ACCC, which is the competition watchdog, said platforms are profiting from news content and should pay for it. Facing the threat of the code, both Google and Facebook did deals with Australian publishers. Now, as its three-year deals expire, Meta is refusing to renew them, saying neither it nor its users are actually interested in news.
Speaker 1: Their decision now triggers the possibility of Meta being designated under the bargaining code and told that, despite what it says, it is still using news content and should return to the negotiating table. In Canada, faced with the same demand that it pay for news, Meta turned news off on Facebook, saying you can't force us to pay for something we no longer carry.

Speaker 6: We had a deal; they walked away.

Speaker 5: I believe they had an obligation to renew the agreements and honor our laws. It is intriguing that Meta has no problem turning off the news, but has a big problem turning off teenage fight clubs, the bullying of young women, or scam advertising.

Speaker 6: We cannot let ourselves be bullied.

Speaker 5: Meta must be designated under the Media Bargaining Code and challenged to negotiate in good faith.

Speaker 1: But the problem is: who can make them pay?

Speaker 5: Meta has been a frequent target of regulators in the last two years. As of last week, there are at least thirty enforcement actions against Meta around the globe.

Speaker 6: But when does this money get paid?
Speaker 5: Their usual pattern is to fight any ruling against them through appeals...

Speaker 6: ...in the courts...

Speaker 5: ...for years. The tech monopolies love to use the law to protect their right to be above the law.

Speaker 6: When the threat of...

Speaker 5: ...regulation comes in their direction, the tech monopolies have a playbook. They declare that they want to be regulated, but that no solution is ever workable for them. Then they insist that regulation should be harmonized across jurisdictions, and Meta knows that such a global agreement is never going to happen.

Speaker 1: Miller said the social license he's proposing would put social media executives under exactly the same kinds of rules that apply to every other company.

Speaker 5: A company may want to come to Australia to mine the mineral of the moment, copper. We rightly regard copper as Australian property, and if you want to extract it, here are just a few of the things that Australia requires. You will pay a royalty on the copper to access the copper. You will fairly negotiate with the landowners.
Speaker 5: You will hire Australians and meet all our industrial relations standards, safety...

Speaker 6: ...requirements and awards.

Speaker 5: You will meet all environmental requirements, and you will pay to fix any damage you cause, or face penalties. In my view, the tech monopolies are also mining companies, but they don't mine our minerals. They mine our lives.

Speaker 1: Coming up: how AI fits into this complicated picture. Like the media landscape, The Australian is always evolving. Our subscribers get breaking news alerts and commentary direct to their phones, as well as access to newsletters and special events. Check us out at theaustralian.com.au, and we'll be back after this break.

Speaker 1: In May, News Corp's global boss Robert Thomson announced a landmark agreement with OpenAI, the tech company that owns the generative artificial intelligence platform ChatGPT. AI falls under the broad umbrella of machine learning. That means sophisticated algorithms crawl the Internet and synthesize all the information they can find on a given topic. Then they spit it back out in whatever format we ask for: images, videos, essays and more.
Speaker 1: In licensing the expert, fact-checked journalism produced by News Corp's many mastheads around the globe, the hope is that the information served up to users of ChatGPT and others is of the highest quality and accuracy. News Corp Australia CEO Michael Miller was asked about the agreement at the Press Club on Wednesday.

Speaker 4: Thank you for your speech. Olivia Ireland from The Sydney Morning Herald and The Age. Mr Miller, a year ago News Corp was giving warnings about the use of artificial intelligence and its effects on the media, and now you're doing deals. What exactly has changed?

Speaker 5: Generative AI has not been in the vernacular for long. I think what's changed over the past year is that a lot of those AI companies have learned from the tech platforms to engage with media earlier. And that's what I think all media companies are seeing: that they are wanting to have conversations, they see media as being important partners, and that media companies should be open to having those conversations.
Speaker 5: There's no doubt that those AI companies need media companies, and if the partnership is right, then I don't think media companies should be closed-minded to working with AI companies either.

Speaker 1: So you're not going to hear episodes of The Front scripted by ChatGPT, ever, and this is where the confusion comes from. News Corp, like other publishers, is exploring the use of AI to make journalism more efficient, faster and more accurate. But it's not anywhere near our news or investigative content. It's information like weather, traffic conditions and fuel prices, based on information that's publicly available and independently verified. I asked the boss of our data journalism division about this, and he said generative AI is not writing stories; it can't be trusted for that. Nothing is published without being carefully checked by an editor, a human.

Speaker 7: Thank you. Dana Daniel from The Canberra Times. Again, you've said that it won't be a big deal if Meta leaves Australia and that we can just come up with other social media platforms to fill that space.
Speaker 7: But Australia is not really a tech superpower. Do you see this as something that the government should fund, subsidizing an alternative social media industry in Australia?

Speaker 5: First of all, on Meta leaving the country: the eSafety Commissioner said that there were thirty-two million reports of child exploitation online last year. Eighty-five percent of those came from the four Meta platforms: Instagram, WhatsApp, Facebook and Threads. They are by far the worst actor. Thirty-two million.

Speaker 6: Good riddance.

Speaker 5: If they won't sign up not just to our values, but to our laws, why do we want them here?

Speaker 7: So we don't need subsidies then?

Speaker 5: I'm not asking for subsidies. I'm asking for the government to move quickly with a social license that enshrines our way of life, that protects Australian businesses. Not just protects Australian businesses, but a level playing field. We're not even asking for protection; we're looking for a level playing field. Have them play by our laws, and if not, put penalties in place. That's what other businesses have to sign up to.
Speaker 5: That's what they should sign up to.

Speaker 1: Thanks for joining us on The Front. We'll be back right here tomorrow. In the meantime, join our subscribers at theaustralian.com.au.