Speaker 1: Legislation before Parliament at the moment. You may have heard about this: a Misinformation Bill that is currently making its way through the Parliament, and the government is keen to get it in. Let's find out what it does, and I know my next guest has misgivings about it. Professor Anne Twomey, constitutional law expert, is on the line. Firstly, good morning, and thank you for your time. What is the bill aiming to do?

Speaker 2: Well, what it does is push the burden onto the digital platforms, so, you know, your Googles and Metas and Twitters and all those sorts of bodies, to deal with misleading, deceptive and false information, what they call misinformation. They have to create codes to do that, and if they breach the code then there is an offense. So in the first instance, they decide what's misinformation or disinformation. But if the government back here in Australia, through ACMA, decides that the code is deficient and they're not dealing adequately with misinformation or disinformation, then ACMA can make what are called standards, which basically require these platforms to deal with misinformation and disinformation in particular ways.

Speaker 1: Okay, ACMA being the Australian Communications and Media Authority; they'll have the final say. Now, the opposition is concerned about this because it takes away freedom of expression. Do you share those views?

Speaker 2: My concern is how they actually define what's misinformation. The words they put in the bill say that it's information that is reasonably verifiable as false, misleading or deceptive. Now, I don't have concerns about them getting rid of, you know, the deceptive stuff, all the bots, for example, that manipulate things, that's absolutely fine, or things that are just flat out factually wrong. So, you know, you see all those scam ads that say Forrest or David Koch is supporting something, right? If they don't, and that's proved, then that should be got rid of. So getting rid of things that are factually wrong, no problem with that at all.
Speaker 2: Where the problems arise is where it extends into things that are contestable. So say there's an issue and people have different views on it. The problem here is that this extends beyond fact to what they describe as comment and opinion, and so somebody has to decide whether or not that is false or misleading or deceptive. So how does a platform overseas in America make those decisions about what's going on here in Australia? They really can't, and their algorithms can't make those kinds of decisions either. So what they do is employ fact checkers in Australia to do it. Then the fact checkers go out, they get some advice from experts, and then they report one way or another and say whether it's false or not. The problem there, however, is that it depends on which experts you choose. It might be that you choose, you know, two who say a particular thing, but there might be five over there who say something else. And just because there's even a consensus amongst experts that something is true doesn't necessarily mean that it is. Sometimes, you know, things come along and they change, and we learn from the people who challenge things. So where there is concern is in this more contestable sort of policy zone, where we will have people just making decisions about what's false and what's not, and that goes across to a social media platform, and that results in potentially a whole lot of stuff being wiped off the Internet.

Speaker 1: That's the problem. And the issue, of course, for Joe Blow online, is whether it's actually censorship. I suppose to some degree it is.

Speaker 2: All of it is. So there are different things that the platforms can use. The most extreme end is just wiping out your post, just making it go away, and look, that already happens.
Speaker 2: By the way, I should say, these platforms already have policies to do that. What this bill does is try to make it more transparent how they're doing it, and that part of the bill is good. Okay, so they can remove it; that is censorship. But they can do other things. Sometimes they demote it so that it doesn't come up in your feed, and fine, that's one way they deal with it. But the other way they deal with it, which I think is a much better way, is that they'll put a note on it. So, you know, there's a post that says something or other, and it'll add what's sometimes called a community note. They'll say, well, actually this is out of context, or that video is really of something else, or experts have said this is wrong, and here's a link to their advice as to why it's wrong. And that gives power to the reader to then say, okay, look, maybe there is something dodgy here. I'll have a look at it, I'll think about it, I'll make a decision for myself. But it doesn't sort of wipe out the ability to know things. So if I had my choice, I'd say that that kind of response, where you just give people more information and context about the relevant thing, is sort of a better way to go.

Speaker 1: Yeah. The only issue with that: with millions upon millions of uploads every single day, if not every hour, how can they keep on top of it?

Speaker 2: Well, exactly, and that's where you get a big problem, because in the end they can only do this by applying algorithms. Okay, so a computer has to decide, and the thing is, you can't make sort of nuanced decisions in relation to things. So it might well be that in the end the platforms say, okay, there's controversy about this particular issue, so instead of having to make decisions on every post, we're just going to wipe out everything that deals with whatever it is, you know, the Voice referendum or the Middle East or whatever.
Speaker 2: So you could end up with sort of massive censorship simply because they're just trying to avoid getting into trouble in relation to small things, so they just wipe out the whole lot, and that's a real risk.

Speaker 1: It's a mess, no doubt about it. Well, potentially it will be.

Speaker 2: Well, yeah. Look, the other side of it, by the way, is that Donald Trump has said that when he becomes president, assuming he has control of both houses, they're going to pass legislation in the US that prevents the platforms from removing any misinformation or disinformation. So it may all turn out that our laws are completely useless, even if they're passed.

Speaker 1: Yeah, indeed, we'll wait and see. Appreciate your insights; very, very interesting and food for thought. Do you think the government will pay attention to your report, your suggestions, that this potentially is going to be censorship?

Speaker 2: Ah, look, possibly. I mean, the good thing is that they did put out an exposure draft in the past. I, along with others, did make suggestions for changes, and they did take up changes, so they did amend it before this bill went in. The problem is that some of their amendments created more problems, so more people are complaining, so maybe they'll respond to that, particularly if there's going to be difficulty getting it through the Senate. So it might well be that a version gets through later on that's changed. I mean, there is good stuff in there. It's not just, you know, the dodgy stuff. There is good stuff about transparency, there's good stuff about getting rid of, you know, the bots, all that stuff. It's just in some areas, where we get into contestable stuff, that there are dangers.

Speaker 1: Appreciate your time. Thank you so much.

Speaker 2: Thank you.

Speaker 1: Professor Anne Twomey, our constitutional law expert at the University of Sydney.