Speaker 1: Now let's talk about AI. Do you sometimes look at a picture or a video and wonder, hm, is this real? Or is this AI? How much of the fear around AI is founded? Teresa Payton is a former White House Chief Information Officer. She's in the country for Spark's Tech Summit, under way at the moment. Let's talk to her.
Speaker 2: Hi, Teresa. Hi, how are you? Well, thank you.
Speaker 1: Do you think we will ever get to a world where we will look at a picture and not know if it's AI or not?
Speaker 2: We're there now.
Speaker 3: Oh, I can tell the difference, and so can you, most of the time. But sometimes I have to really take a look at it a little closer. So we're getting closer and closer to the day you won't be able to.
Speaker 1: Okay. So at the moment the experts can spot it; some of it normal people can see through, and the experts can still see through most of it. Will we ever get to a day where I will put it in front of you and you'll go, I actually don't know? Yes? How far away? This year? What does that say about our ability to prove things?
Speaker 2: Well?
Speaker 3: I mean, here's the thing: trust. It's one of our most valuable assets, and it's now one of our most vulnerable assets. You can't even trust your own eyes. But the technology is there, or will be in the very near future. The question is which one's going to win out. For example, could you tell whether our voices are the two of us talking, or whether you are actually talking to a voice clone of me? So there is technology now that could say this is of human origin or this is of computer origin.
Speaker 1: So do you think there will always be technology that will be able to tell us the truth?
Speaker 3: Yes, but the question is, will it be implemented into the process fast enough, before we get duped by fraudsters and criminals?
Speaker 1: And will it be accessible enough? Because the problem is, obviously, as a member of the media, we often rely on photographs, videos, audio, documents to say the thing that we are alleging is true, because here's the proof, right? So if AI is able to kind of confound that and I can't use that anymore, will I always be able to rely on the technology to back me up and go, no, that is really the truth?
Speaker 3: You should be able to. The other thing to think about, too, is we can watermark. So, for example, this conversation you and I are having: the radio station could watermark that conversation, so if somebody tries to meddle with it, the AI would know. In my mind, what has to happen is that when the tech product companies edit something, it needs to say: what you're about to hear has been produced by generative AI. There needs to be a disclaimer, like there is for things that are unhealthy for you.
Speaker 1: Only the good guys are going to do it, though; the bad guys won't. But you can see how, you know, yeah, sure, we can prove and disprove, but there could just be this proliferation of stuff that is untrustworthy, and we're kind of on our own, aren't we?
Speaker 3: Yeah, well, we somewhat are on our own. You know, I say we used to be in a place where it was trust but verify, and now I say never trust, always verify. Verify, verify, and verify one more time.
Speaker 1: Yeah, okay. So do you worry about what seems to be the worst-case scenario with AI, which is that we lose control?
Speaker 2: Yes.
Speaker 1: Do you really?
Speaker 2: I really do worry about that.
Speaker 1: Okay. How far away is that, if it happens?
Speaker 3: Well, I think there's a lot of really smart people around the world, New Zealand included, who are having really hard conversations around governance and guardrails for AI. So my hope is those hard conversations will turn into governance and guardrails before we hit this. But I do think twenty twenty-seven, twenty twenty-eight, if we don't get this right now. This isn't, you know, how we put things off with social media, and it's still a little bit of a dumpster fire sometimes. If we don't get this right, this is different.
Speaker 1: And what happens if we lose control? What does the AI do?
Speaker 2: Well, for starters, it's a huge energy hog.
Speaker 3: So if you love the planet... It can run infinitely and tell itself to keep running. We've already seen in labs where researchers, who are, you know, kind of like your ethical hackers, try to see if they can trick the generative AI around kill switches by telling it, don't create a kill switch for yourself, or don't create an override to the kill switch, and they see where it basically becomes self-preserving and does it, and it tries to create something where you can't turn it off.
Speaker 2: Yeah, yeah, yeah. Does it succeed?
Speaker 3: It will, in some of these lab cases, in limited areas, yes.
Speaker 1: Can we not override it, then? Will we not always have an override function?
Speaker 2: The question is, will you have engineers who really know how it works?
Speaker 1: Why don't you just go to the wall and pull it out?
Speaker 3: I mean, yeah, ideally, right? Just pull it out, sort of like the movie Airplane, when he unplugs the runway lights.
Speaker 1: But I'm serious. Is that always going to be an option, or could it not be?
Speaker 3: It may not be, because here's the thing: if you cut the power to the mainframe, you don't know if it's already proliferated itself to someplace else.
Speaker 2: Yeah.
Speaker 1: You haven't just watched too many movies, have you, Teresa?
Speaker 2: No, I don't have time for movies. It's all in my head.
Speaker 1: What is the thing that you are most worried about?
Speaker 2: I worry about...
Speaker 3: Losing human essence as part of, sort of, the story. And so, for example, I've watched side-by-sides of the same person interacting with a customer service agent.
Speaker 2: You can hear their voice and...
Speaker 3: You can hear the human essence in the interaction between the two. And then they opted, in the next phone call, to talk to a customer service bot, because they didn't have to wait if they talked to the bot, and they started responding like the robot, like without the essence, like it was kind of rude and...
Speaker 1: Short, perfunctory.
Speaker 3: So if we spend more time of our day interacting with customer service chatbots instead of each other, we're going to start to lose that, because it's muscle memory for us to be polite and to be nice. And so if your muscle memory becomes just do the task and have no emotion, I worry about us losing our human essence.
Speaker 1: Yeah. And isn't it also, I mean, you make me think, we have such a problem with loneliness, right? You don't have the interactions that you had maybe one hundred years ago: you go to the supermarket, get the kids, go to the kids' school, interact with the teachers, all that stuff that you would do in your day. We've lost so much of that. And doesn't AI have the ability here to actually just make that worse?
Speaker 3: It does. And so we're seeing, you know, it's sort of like two sides of the same coin. So on one side, for somebody who is lonely, it would be nice for them to have an outlet, or to be able to game. Maybe they're very socially awkward, and so maybe they can kind of use it as a coach to help them get their courage up to leave the house and go to a party, for example. But what we're seeing is that, because of the way these chatbots are created, they want you to come back for more. So they're basically designed to give you more of what you came for, which means addictive properties. And if it's addictive that way, then you'll find... There was a story in the Wall Street Journal where somebody said, look, I'm an extrovert, and I became more introverted the more I talked to my chatbot.
Speaker 2: Interesting. So it's addictive.
Speaker 3: And the person literally had to have somebody in their life say, I think you spend too much time on your phone talking to a bot.
Speaker 1: It's just like that movie Her, isn't it? Yes. Okay, what do you use it for in a good way?
Speaker 2: Oh, there's so many amazing ways.
Speaker 3: So I'm trying to learn Italian, and I have a chatbot that I use to kind of quiz me on my Italian flash cards, so that can be really helpful. I actually tell people, instead of asking it just to summarize something, sometimes I'll say, if you were Bob Iger, how would you read this article and how would you summarize it? Or if I'm trying to brainstorm, you know, I run a company, I've got thirty employees, and sometimes I'm trying to brainstorm on a different way to present our services to clients. And so you can kind of go into this roleplay mode and do that. So there's a lot of really positive uses for it.
Speaker 1: Yeah. Hey, it's been very nice to talk to you. Thanks for chatting to us.
Speaker 2: It's been amazing to be with you here in the studio. Great to meet you.
Speaker 1: Yeah, go well. Teresa Payton is CEO of Fortalice Solutions and, of course, former White House Chief Information Officer. For more from Heather du Plessis-Allan Drive, listen live to Newstalk ZB from four pm weekdays, or follow the podcast on iHeartRadio.