Speaker 1: Ever do for clan?

Speaker 2: Would AI be the key to solving our mental health backlog? Mental Health Minister Matt Doocey has claimed that up to twenty percent of New Zealand's unmet needs, this is mental health needs, could be solved using chatbots like ChatGPT. Jackie Maguire is a clinical psychologist. Hey, Jackie.

Speaker 1: Hi, Heather.

Speaker 2: Is he on to something here?

Speaker 1: I think therapy AI is really helpful for life stuff. If you want to make a decision, if you're wanting to plan a conversation, if you are switched on with critical analysis skills so that you can go, "Ah, that's actually helpful, you know, you're grounding or you're prompting me, I'm going to use that and I'm going to think about that." That's really helpful in terms of an AI tool. Is it going to solve our mental health crisis, in terms of we've got huge waitlists and people aren't able to get, you know, meaningful support? I don't think it's going to fill that gap.

Speaker 2: What about, what about just triaging?
So I was just thinking about, so, for example, I had my second baby about five months ago, so obviously I would be experiencing what a lot of mums will experience postpartum, which is a little bit of anxiety, maybe a bit of the blues or whatever. So I go along to it and I say, hey, I'm having some anxiety, baby's five months old, and it says, yep, completely normal, try doing this. Would that not be helpful for just triaging some people out who actually don't need any more assistance than just knowing what's going on?

Speaker 1: So yes, I think being able to have anyone validate you, including a chatbot, actually, it has been shown that when you name it, you tame it, so some form of validation is helpful. Being able to be provided with strategies is useful. But that also requires that the person who is being validated and receiving that information is in a headspace to use that wisely.
Yes. Like, if you logged on and you were so distressed, or experiencing such great postnatal depression that you were psychotic, then you need a person in front of you who sees you and hears you, not a chatbot, right? And I guess that's what's hard, because you don't know who's on the other side of the computer screen.

Speaker 2: So there's no way of knowing whether, like, can we not triage it and go, ChatGPT has screened you at the start: whoa, you need some help, or no, you're okay, off you go? We can't do that?

Speaker 1: I think it would be useful if there was a clinician to see you through that process.

Speaker 2: Yeah, but there is not, right? Not currently.

Speaker 1: In terms of, you know, we would hope that people in crisis are being seen, but we know from those who are experiencing moderate to severe mental illness, or who are in great distress, that many people aren't getting the support they need. And so, you know, it's a useful tool. I just don't want it to be spoken about in terms of it's our Hail Mary.

Speaker 2: Yeah. Is there also the risk that it affirms your neuroses?
Speaker 1: You know, it's interesting, because I've used a free therapy tool when I was having to plan a difficult conversation with somebody. And yes, like, I'm a clinical psych sitting at the end of the computer, and I can go, ah, that's really useful, I wouldn't have thought of that question, and I can go, no, I wouldn't take that on board. But from that perspective, it was really helpful. It saved me a session, from going to see someone in supervision, you know, as a soundboard. Like, I think it can be an excellent soundboarding, you know, life-challenge tool. I think it can be really helpful used in that forum. I just think we have to go: human beings are complex. If you are experiencing mental illness, if your life has got multiple things going on, if you're in a state of distress.
We know that one of the primary factors of success in therapy is the therapeutic alliance, which means another person can see you as a human being, can understand the complexities of your life, can build a relationship with you. And AI, whilst it might feel like it's building a relationship with you, its relationship is based on data, you know, patterns and data, not based on you.

Speaker 2: Yeah. Jackie, thank you, I appreciate you talking us through it. Fascinating stuff. Jackie Maguire, clinical psychologist.

Speaker 1: For more from Heather du Plessis-Allan Drive, listen live to Newstalk ZB from 4pm weekdays, or follow the podcast on iHeartRadio.