Speaker 1: Turns out, in completely unrelated news, turns out our public servants are loving the old AI. Almost half of them, forty-five percent, are using AI in their work. But the problem is very few have any rules at work around how to use it. Now, Kerry Davies is the Public Service Association National Secretary. Kerry, hello.

Speaker 2: Hi.

Speaker 1: Does this worry you, the use of this stuff?

Speaker 2: What worries me is not so much the use of it, except without proper guardrails, without safeguards, without training. AI in itself is not so much the issue. It's the use of AI without proper supports and proper training.

Speaker 1: Yeah. I mean, because what do these guys have, a closed AI, or are they just using ChatGPT? Do we know?

Speaker 2: There's a whole mixture, but ChatGPT is the most common one we know of.

Speaker 1: So is it possible that they are uploading, sort of, like, confidential government documents into ChatGPT?

Speaker 2: I wouldn't think so.

Speaker 1: No? So what are they doing more? What's the concern, then? If they're not uploading private and confidential stuff, then what's the problem?

Speaker 2: Well, there are still problems about the accuracy of information that's been utilized through the use of AI, and making sure that you're using the right platform, the right tool, for the right sort of support, if you like.

Speaker 1: Okay, give me an example you might be worried about.

Speaker 2: So one is, in terms of when we're talking about recruitment exercises or any sort of collation of information, there need to be guardrails put in place in terms of gender and racial bias. There are issues of privacy, there are also issues of accuracy, and there's also the impact on the quality of services, particularly when we're aware that New Zealand's a diverse community and AI needs to be tailored to match the context in which it's being used and the different population groups with which we're working. So we do have an AI framework for the public service, and that's not a bad framework, but what appears to have happened is that it hasn't been implemented in workplaces. So it's about translating that framework into workplaces so that people get appropriate support and training.

Speaker 1: Kerry, you've lost me. All I'm hearing is a lot of words. I don't even know what this means in real life. What's the problem? So let's say, I don't know, let me see, let's say I'm... pick a department, Kerry, pick a department. Let's say I'm working at the Treasury, and I'm going to put into my little ChatGPT: what is the impact of, I don't know, two point five percent inflation on the population? What's the problem? And it pumps out an answer.

Speaker 2: I mean, that happens all the time already. You don't need AI to do that sort of work. The sort of work that AI is being used for is helping transcribe and summarize notes. It's also about ensuring that the right tone might be used in emails or staff communications. It's also, if you're working in checking code and problem solving, coding and design issues, using AI.

Speaker 1: None of that stuff feels particularly dangerous to me. If what you're arguing is we need to make some rules around AI, are we potentially just making rules for the sake of making rules? Should we not just let these very competent and often university-educated people just use their judgment?

Speaker 2: Well, they are using their judgment, but it's like any technology. We've had all sorts of technology changes over the years, and we've always had appropriate support and training to make sure that we get the best out of that technology.

Speaker 1: Kerry, thank you for your time, appreciate it, mate. That's Kerry Davies, who's the Public Service Association National Secretary.
Speaker 1: For more from Heather du Plessis-Allan Drive, listen live to Newstalk ZB from 4pm weekdays, or follow the podcast on iHeartRadio.