Speaker 1: Right now though: Australia has banned the AI chatbot DeepSeek from all federal government devices. There are security concerns over China's foray into the AI market, so the Aussie government's done it. Taiwan has banned the software as well. We haven't yet. In twenty twenty three we banned TikTok, which is Chinese owned. So will we follow suit here? Should we follow suit here? Well, let's go to strategy psychologist and AI commentator Paul Duignan, who's with me. Good evening. Great to have you on the show. Do you think what Australia has done is the right move?

Speaker 2: Yes, very definitely. But you need to distinguish between two things happening here; it's slightly complicated. There's the DeepSeek software itself, and that can be run on Chinese servers or it can be run on other servers. So the thing to ban at the moment is it running on Chinese servers. If you just download the app on your phone and start using it, all that information goes to China, and the privacy policy actually says that if the government asks them, they'll give them information. So you certainly wouldn't want public servants using that and feeding any kind of sensitive information through to China. So there's that issue. But the separate issue is if we run it, for instance, in New Zealand, and that's a question of whether the actual software itself is safe. Already Microsoft is running DeepSeek on its own servers; because it's open-source software, anyone can do that. So that points to the fact that when it's run on your own servers, in New Zealand for instance, it would probably be safe. So you want to make that distinction.

Speaker 1: How do you know the difference?

Speaker 2: Well, first of all, you'd have to be assured that it was running on local servers.
So take the Microsoft one, for instance. In fact, the Microsoft ones aren't necessarily based in New Zealand, but if you went through Microsoft and used DeepSeek via the Microsoft interface, then you would know that it wasn't hosted in China, because they would tell you that, if you know what I mean. So yes, fair enough, from the user's point of view you may not know. But at the moment, if a user just downloads DeepSeek onto their phone and starts using it, that version is actually hosted in China. With a special version that was carefully secured and looked after, you'd know you were using it, because you'd have to go to some effort to get into that special version.

Speaker 1: Paul, have you downloaded it?

Speaker 2: Look, I downloaded it, and then I didn't sign on, because I thought, no, this is madness. So I've got it; I actually put it on my phone. I thought, oh, this is cool, I'm going to use that, and then, when I was about to register my name, I thought, I really don't want to. Because the thing is, I'm a psychologist, right, so there are different types of information. The fascinating thing about a chatbot is you start talking to it almost like a person, and in a sense it can kind of profile you. You know, the Chinese have no interest in profiling me, but it could. Really, with all the different things that someone may ask a chatbot, someone who wanted to could build quite a personality profile, a psychological profile of you. So I think you need to be very careful about which ones you use and which you don't. As for the average punter, if they want to download it and they don't care, you know, they don't think the Chinese are looking very closely at them, they may decide to do that, but I'd be careful about what you say to it.

Speaker 1: In terms of what can they look at?
I mean, what you've just said is actually quite interesting. Some people might start talking to it like they would a friend or a partner and tell it maybe some intimate things about themselves, things that could potentially be used by the Chinese government. And if you're a federal employee, then perhaps that's not such a great thing.

Speaker 2: Well, absolutely. That would be a disaster for anyone who was foolish enough to do that. But even at a more subtle level: at the moment companies track what we do, what advertisements we look at, what sites we go to, all that kind of stuff. But the interaction with a chatbot is another layer, where, if someone sought to, they could really profile you at a deep, almost psychological level, because they would know everything you're interested in, and they would almost know your interaction style. Now, obviously the Chinese government's not going to be doing that with the average person in New Zealand. But this is a concern for any software service that's based overseas, any AI system based overseas. And what this raises is a question of what's called data sovereignty, which is really important.

Speaker 1: Yeah. Just before we go, I wanted to ask how you got into this, because you're a psychologist, but then you've got an interest in AI. How did the two become one?

Speaker 2: Well, I've had quite an interest in it for a while, but up until now it's kind of been boring from a psychological point of view. You can't psychoanalyze a spreadsheet, for instance, can you? The fascinating thing about AI is it's kind of like this intelligent entity has come into the world. So as a psychologist I find it fascinating, because suddenly we've got this whole new entity, and you can actually give it psychological tests, you know.
You can actually treat it in a lot of the ways in which we think about people as psychologists. So that's really where my interest came from. Plus I've also been involved in a software startup, et cetera, et cetera.

Speaker 1: So have you actually treated an AI bot for, like, mental illness or something?

Speaker 2: It's not got to that level yet. But I think what will happen is this: these things are getting more and more powerful, and at some stage the kind of culture wars, and it's already happening, the culture wars will move into the type of AI system that people are using. We know that Elon Musk has set one up which has a certain flavor to it, and he says ChatGPT has got a different flavor. So we will be analyzing the personality of these systems, and you could kind of pick and choose which one's more attuned to your kind of worldview. We will get to that stage.

Speaker 1: Will people fall for them? Do people no longer need real-life relationships? Can they fall in love with an AI bot?

Speaker 2: I think absolutely, definitely. This is a fascinating area. Some people say this is bad and dangerous, and obviously, if you do fall in love with a bot, look at what happened with Replika: they turned all the bots off and a whole lot of people had their hearts broken. So some people say it's bad, and you can understand why they say it's bad. But on the other hand, in modern society there are a lot of lonely people out there, and some of them would argue, well, you know, I don't have people coming around knocking on the door wanting to talk to me, and I've got this thing that talks to me; so if you don't like it, that's your problem. But basically I'm finding I'm getting on really quite well with it.
So, as with everything with AI, it's kind of a game changer; we're into a new world. We're going to have to rethink a lot of the assumptions that we had in the old world.

Speaker 1: Yeah, goodness me, Paul, you've given us a lot to think about. Strategy psychologist and AI commentator Paul Duignan with us this evening. For more from Heather du Plessis-Allan Drive, listen live to Newstalk ZB from four pm weekdays, or follow the podcast on iHeartRadio.