Speaker 1: My name is Lily Maddon and I'm a proud Arrernte, Bundjalung and Kalkadoon woman from Gadigal Country. The Daily Aus acknowledges that this podcast is recorded on the lands of the Gadigal people and pays respect to all Aboriginal and Torres Strait Islander nations. We pay our respects to the first peoples of these countries, both past and present.

Speaker 2: Welcome to The Daily Aus, your daily dose of news and current affairs from Australia and around the world.

Speaker 3: I'm Sam and I'm Zara, and today we're discussing a bit of a controversial topic: how artificial intelligence could enable students to cheat on exams. Now to a growing problem facing high schools and universities around the world, and an artificial intelligence software called ChatGPT.

Speaker 1: This was unlike anything I had dealt with before.

Speaker 2: As AI technology becomes more advanced, it's becoming easier for students to use it to their advantage in cheating on exams. We've got some students using AI-powered bots to complete online assignments, and others taking online exams on their behalf.
Speaker 3: Some students are apparently using AI-enabled smartwatches or other devices to access answers during an exam.

Speaker 2: Not only can AI write exams and assessments, it can also be used to write podcast introductions. It actually wrote this one. We're going to explain everything in the Deep Dive, but first, Zara, what is making headlines today?

Speaker 3: PM Anthony Albanese has said the current Medicare system is struggling to keep up with the needs of Australians.

Speaker 2: Too many people are turning up at emergency departments because they can't get access to a GP and to primary health care.

Speaker 3: It comes ahead of an impending federal government overhaul of Australia's universal healthcare system amid rising fees and increased pressure on hospitals.

Speaker 2: The New South Wales government has announced plans to roll out a new domestic violence protection service that would allow people to find out if their partner has a history of abuse.
The Right to Ask scheme will be based on a similar program used in the UK, with New South Wales Police Minister Paul Toole saying it would allow people to enter relationships, quote, "with their eyes wide open".

Speaker 3: The suspect in a mass shooting near LA that killed at least ten people has been found dead inside a van. The suspect opened fire at a dance studio that was hosting Lunar New Year celebrations on Saturday. A further ten people were hospitalised.

Speaker 2: And today's good news has some serious Happy Feet vibes. A new colony of emperor penguins has been discovered in Antarctica. The colony is home to about five hundred birds and was found using satellite images in one of the most remote areas of Antarctica. Scientists have been using satellite imagery to search for emperor penguins for the last fifteen years. Zara, educators around Australia are creating new rules for students on how they use artificial intelligence, or AI, programs in assessment contexts. For some, this means banning AI altogether and returning to old-fashioned pen and paper exams, or countering AI tech with tech of their own.
Speaker 3: It feels like, for all the technological advancements that we've made, to go back to pen and paper would seem like something of a regression. Can you take me back to the beginning of how we actually got here? What ignited this debate?

Speaker 2: Just quickly, a side note: I was actually part of one of the first universities in Australia to go to digital exams, and they threw one pen and paper exam at us right at the end of our degree, and everyone panicked. And there was no such thing as ChatGPT back then; it was simply just a way to make sure we still knew how to use a pen. So I can't imagine that students today, all these years later, would adapt to it that quickly. I guess the thing that's really kicked off this new wave of discussions about AI has been the launch of ChatGPT at the end of last year. And if you missed all the hype about this, it's what's called a large language model. Basically, this just means it's been programmed by humans with vast amounts of text from the Internet.
I think it's ingested text up to about the end of twenty twenty-one, and it uses all the information it's accumulated to answer questions in a conversational way. So let's go through a quick example. I used ChatGPT to write the intro for this podcast, so if you thought it sounded a little bit more robotic than usual, that's why. I started off by asking it to write a podcast script about how artificial intelligence could allow students to cheat in exams, and this is what it wrote.

Speaker 3: So I'll read it quickly: "Welcome to the AI in Education podcast, where we explore the intersection of artificial intelligence and education."

Speaker 2: But we knew that would kind of give it away a little bit; that really doesn't sound like us. So then we said, make the podcast called The Daily Aus, and this is what it came up with.

Speaker 3: "Welcome to The Daily Aus, your daily dose of news and current events from Australia and around the world."

Speaker 2: Better, but not quite.
We needed two voices, so then we asked it to make the podcast script in two voices, one called Sam and one called Zara, and it gave us the script that we read at the open.

Speaker 3: I mean, I think it did a pretty good job. It nailed the brief, and it's then up to us as humans to interpret what it's given to us, right?

Speaker 2: Exactly, and that's really just scraping the surface. I was playing around with it even more and I got it to write a full movie script for me. I've even read that one university professor got the bot to write an essay that was coherent and original enough to pass through anti-plagiarism software. Now, in saying that, by its company's CEO's own admission, ChatGPT is incredibly limited and shouldn't be used for anything too important, because it lacks the ability to think critically. Because it's AI, it can also be responsible for some pretty problematic content. To prove this, a journalist asked ChatGPT to write a racist article, and it did. It's able to do this because there's so much problematic material already online that the AI finds it pretty easy to draw on.
Speaker 3: That's obviously concerning, and I think the discussion about the lack of critical thought and the lack of nuance that a chatbot can bring to big discussions is one that we can have at a different time. But I really want to return to the education context, because this is in the news a lot, and I guess the worry is: how do you know if work is authentic if it is passing a plagiarism test? All of us had to submit our essays through Turnitin, and we're now hearing that some of the essays written by ChatGPT can actually pass through Turnitin. So this is new territory for educators to be navigating. What have we heard so far about how they're responding to this issue?

Speaker 2: Well, I think we have to separate the schools and the universities.
First off, TDA reached out to the Group of Eight, the association of universities that includes the Australian National University, the University of Sydney and the University of Melbourne, to ask them how they're going to be dealing with this rise of AI, and they said they're changing how they run assessments in response. They're moving away from online exams and towards in-person exams, where they literally can have eyes on people who are taking the tests, and they're going to be using digital monitoring for remote students.

Speaker 3: Okay, so then what about schools?

Speaker 2: Well, you've got to remember that schooling is state-based, so there are going to be different responses across the country. This week though, in the lead-up to some state schools going back, we've had a few developments. In New South Wales, the Education Department has confirmed they're going to restrict student access to AI apps, including ChatGPT, on school computers and networks. They say this restriction is going to continue until they review how to safely and appropriately use this new tech in the classroom.
Up in Queensland, the Department of Education has also announced it has blocked ChatGPT until it can be assessed for, quote, "appropriateness in school settings".

Speaker 3: We'll be back right after this. Okay, so in Queensland and in New South Wales, students won't be able to access the chatbot on their school computers or on the Internet at school. But I guess that still doesn't stop students from accessing AI at home, or the many new platforms that will arise and get around that technology. What is the long-term solution here?

Speaker 2: Well, there's a couple of ways to tackle this question, and I think it's one that's keeping a lot of people up at night at the moment. There are software developers who are working on programs to detect work created by AI, and that's probably going to be the most short-term solution to this problem. A student at Princeton, Edward Tian, tweeted that he spent New Year's making an app called GPTZero, which he says can quickly and efficiently detect whether an essay is written by ChatGPT or a real-life human.
He said he was motivated to develop this app in the face of increasing AI plagiarism, and with so much ChatGPT hype about what is written by AI, he says that we as humans deserve to know. Now, you've got to remember it's still early days, but it is likely, as AI continues to rise in popularity, that the demand for AI plagiarism detection apps will also rise.

Speaker 3: Sounds like Edward had a similar New Year's Eve to us. I too built an entirely new piece of software, so I'm glad we're on similar footing there. But I do think it's a really interesting topic. When ChatGPT first hit the scene, Sam, you and I were talking about whether this could mean the end of The Daily Aus, and we were very glad to know that when you type in "tell me the news of the day", it cannot do that. So at this present moment in time, our jobs are safe. But it's certainly creating an unforeseen challenge for our educators and for the education system, so it'll be fascinating to see where it goes from here.

Speaker 2: Thank you for joining us on The Daily Aus today.
Speaker 2: If you learned something from today's episode, don't forget to hit subscribe, so there's a TDA episode waiting for you every morning. We'll be back again tomorrow, assuming it's not the end of the world. AI robotic apocalypse vibes. Okay, we'll be back again tomorrow. Until then, have a great day.