Speaker 1: ...to AI. Most of us use AI, and we know that we're using it. Almost four in five of us have used it in the past year knowingly, but that doesn't mean we trust it. A new report by One New Zealand, out today, found we do not trust large companies to use it. We're most concerned with our personal data being misused, with job losses associated with AI, and with AI making unfair decisions. Jason Paris is the chief executive of One NZ. Hi, Jason. Given that most of us use it, you'd think we'd be okay with it, but why are we so worried about it?

Speaker 3: Yeah, I think there's a bit of nervousness about it, because people don't know they're probably using it most days anyway. I think the survey said seventy-odd percent of us know we are, but I think it's more like one hundred percent of people are using Spotify, or maps, or social media. But I think it's the lack of transparency on what is going on behind the scenes of those AI tools that they don't know about that makes you naturally suspicious. How are you using my data? Are you tracking me? What is this going to mean for jobs and for employment? When you make a recommendation to me, is it in my best interests, or is it in the best interests of the company? So all pretty interesting themes that, you know, it makes sense that customers should be concerned about.

Speaker 1: Is it maybe an underlying lack of trust in big corporates? Like, it's okay for me to use AI because I know what I'm doing and I can trust myself, but I don't trust those guys.

Speaker 3: There's probably an undertone there. It's just been around, you know, for years, globally and in New Zealand. But that's, I think, where the opportunity is: you've got this massive opportunity, a productivity gain for New Zealand, by using AI across different businesses and all facets of life. But it needs to be done transparently, and sometimes corporates aren't as transparent as they need to be, and therefore there's a question on trust. I don't think any corporate gets up every morning and goes, how can I rip a customer off? And I don't think they're going to go, how can I use AI to do anything bad? We just need to be transparent about how we use this amazing technology.

Speaker 1: Yeah. Do you think that the job fears... I mean, you mentioned it yourself, people are really worried that they will lose their jobs, or that all these humans will be displaced by robots. But it seems increasingly that actually it's more a tool for the humans to use. You still need the humans at the moment.

Speaker 3: AI is coming to your job, not for your job. We've deployed AI across most parts of our business, and I would say the max I've seen of anyone's job, of tasks that someone would be performing, is about thirty percent. So what it's doing at the moment is just letting us get to that to-do list that's longer than my arm, that you never get to, by removing all the noise and the stuff that you don't like doing because it's repetitive and boring and brain-numbing. So you can work on the cool, sexy stuff. In five years, though, I think the technology is evolving fast enough to go, actually, some roles won't be there, because the roles we've got in organizations are there to kind of hide process gaps, or inefficiency, or old technology that shouldn't be in the business. And so you should be able to speed it up using AI. The key is then to make sure that you train and upskill those people so they can be deployed in more value-creating roles within your organization, and it doesn't come as a surprise. So again, corporates have a responsibility to talk early and transparently, not just with customers, but with their teams as well.
Speaker 1: Out of ten, how much of a cheerleader do you think you are?

Speaker 3: Well, compared to the Warriors?

Speaker 1: Oh, that sounded so much more disparaging than I meant it to sound. I don't mean it in a disparaging way at all, but I mean, you are like a super enthusiast for this, aren't you?

Speaker 3: Well, I just love New Zealand. As you know, I think it's the greatest country on the planet to live and work. Yes, we have our challenges, but I just think it's a privilege to live here and to see that AI could be the new productivity gain, or create new areas of growth and innovation for us.

Speaker 1: Especially for New Zealand.

Speaker 3: We can go fast, and our lack of scale is a massive advantage, because we can move quickly. The digitization of the world means that, you know, actually, innovation is decentralized. You don't need to go to Silicon Valley. You can do it from Auckland or Nelson or, you know, Timaru.

Speaker 1: Jason, do you reckon people have realized, though? Have the big corporate leaders, have the workers of New Zealand realized how much we could actually use this and harness it?

Speaker 3: I think they have, and that is the danger, because we're going to go after it fast, which we should. But we need to go after it transparently, and make sure that we do it in a way that brings our customers with us and they trust us for it.

Speaker 1: So I've downloaded ChatGPT. Yep, it's already solved a lot of my problems today, Jason. Next minute...

Speaker 3: You should. Not just at work, but also at home as well. I've found it can get you out of a lot of trouble at home too.

Speaker 1: Oh yeah? Do you want to expand on that for us?
Speaker 3: Just, you know, what are the excuses for me being, you know, forty-five minutes late yet again for my wife that I haven't already used a number of times before?

Speaker 1: Jason, thank you very much, look after yourself. Jason Paris, chief executive of One New Zealand.

Speaker 2: For more from Heather du Plessis-Allan Drive, listen live to Newstalk ZB from 4pm weekdays, or follow the podcast on iHeartRadio.