Speaker 1: Now GPs are taking to AI for their consultations like ducks to water. A survey of almost two hundred GPs by Otago University has found that forty percent of them were recording their consultations and getting AI to write everything down for them. Professor Angela Ballantyne was the lead researcher who did this survey, and she's with us now. Hey, Angela.

Speaker 2: Hi, thanks for having me.

Speaker 1: It's really smart that they're doing this. You can see the time saving in it, but are there any fishhooks? Like, do they need to tell you that they're doing this?

Speaker 2: Yeah, so it does look like it's time saving. There's a couple of fishhooks. One is that there are still errors in the clinical notes. So we found that GPs said that maybe there were hallucinations, so content that hadn't really been said, or the notes had missed really critical findings, so they still need to be read really carefully by the clinician. And in terms of consent, yes, we found that fifty-nine percent of GPs were asking patients for consent, but that means a significant minority isn't, and we'd really like to see that change.
We think patients should know, and they should get to have a say.

Speaker 1: Well, if only to stop you saying dumb things, right? Because, like, I quite like having a laugh with the GP sometimes.

Speaker 2: Ah, absolutely, no, no, you should still do that, one hundred percent, you should still do that.

Speaker 1: A dumb joke, do you know what I mean? And it's just a good laugh, great. But I don't want that much in front of the AI.

Speaker 2: I know, but I think that would be really sad. One of the really nice things actually in the study is that one of the GPs said it made my job fun again. And I work with a lot of GPs and I know how, you know, overburdened they are, so please tell the jokes. That also makes the GP's job fun again. So, yeah. So I mean, in terms of it recording it, it will record it, create the transcript. That transcript will typically be deleted, say, in seven days. So in terms of what ends up in your clinical notes, hopefully the AI will just ignore that, or the doctor will delete anything that's not necessary.
Speaker 1: Angela, do you think people are still weird about AI because they don't necessarily understand it? They might just freak out and say, no, you can't.

Speaker 2: I don't think they're weird about AI. I think it's good to be skeptical about AI. I don't like the narrative of, like, oh my gosh, AI is going to solve all of our problems in the health system. I mean, there's been that narrative around, like, oh, we've got dishwashers, and now we've got GPS on our phones, and our phone, like, predicts, so we don't even have to think about how we're going to text someone back. And actually the result of all of that is that we're all more exhausted and more tired and disconnected, you know, than we were fifty years ago. So, you know, we don't want to be simplistic about it's going to solve all the problems, and we don't want to be simplistic about there being something evil and dodgy about it either. Yeah, we just want to keep collecting the data and really understanding how it's influencing clinical practice.

Speaker 1: I like your approach.
Thanks very much, Angela. Professor Angela Ballantyne, Department of Primary Health Care at Otago University in Wellington.

Speaker 1: For more from Heather du Plessis-Allan Drive, listen live to Newstalk ZB from four pm weekdays, or follow the podcast on iHeartRadio.