1
00:00:00,280 --> 00:00:03,160
Speaker 1: This is the Fitzy and Wippa with Kate Ritchie podcast.

2
00:00:03,440 --> 00:00:09,280
Speaker 1: Let's talk about artificial intelligence, computer

3
00:00:08,920 --> 00:00:11,440
Speaker 2: games, man. And you know what, Kate, we just had

4
00:00:11,440 --> 00:00:15,520
Speaker 2: a conversation during that song. It is hard these days

5
00:00:15,560 --> 00:00:18,640
Speaker 2: to work out. I think we're losing trust in people

6
00:00:18,680 --> 00:00:21,840
Speaker 2: and humans now. We don't know who to trust anymore.

7
00:00:21,480 --> 00:00:23,280
Speaker 3: Don't know who to trust at all, generally.

8
00:00:23,480 --> 00:00:24,000
Speaker 4: Yeah.

9
00:00:25,079 --> 00:00:27,320
Speaker 5: Well, my dad always taught me, he was a policeman,

10
00:00:27,400 --> 00:00:30,840
Speaker 5: so his advice was don't trust anyone. I mean,

11
00:00:30,880 --> 00:00:32,240
Speaker 5: I don't think that's the attitude.

12
00:00:32,280 --> 00:00:33,240
Speaker 1: You can, right.

13
00:00:34,960 --> 00:00:37,239
Speaker 4: Back humanity. There are good people out there.

14
00:00:37,320 --> 00:00:39,320
Speaker 1: Yeah, there are good people. You are right, mate.

15
00:00:39,360 --> 00:00:43,360
Speaker 6: But now that AI is involved, it's out now, even

16
00:00:43,800 --> 00:00:44,760
Speaker 6: we're discovering it

17
00:00:45,400 --> 00:00:48,560
Speaker 1: during love? Oh.

18
00:00:47,760 --> 00:00:50,000
Speaker 6: Okay. So, a woman explained that her and her boyfriend

19
00:00:50,040 --> 00:00:53,320
Speaker 6: are in a long distance relationship. They communicate primarily through

20
00:00:53,320 --> 00:00:56,480
Speaker 6: texts and phone calls. Over the past two years, she

21
00:00:56,640 --> 00:01:00,639
Speaker 6: had developed a beautiful and deep emotional connection to her

22
00:01:00,680 --> 00:01:03,480
Speaker 6: partner and felt so lucky to have met someone who

23
00:01:03,520 --> 00:01:04,160
Speaker 6: shared her
24
00:01:04,080 --> 00:01:05,120
Speaker 1: values and dreams.

25
00:01:05,600 --> 00:01:09,240
Speaker 2: However, approximately one month ago, the woman began to notice

26
00:01:09,240 --> 00:01:11,839
Speaker 2: something strange about her boyfriend's text messages.

27
00:01:12,800 --> 00:01:16,360
Speaker 1: While he used to be a man of few words,

28
00:01:16,760 --> 00:01:20,760
Speaker 2: he started sending her long paragraphs that were very well

29
00:01:20,760 --> 00:01:21,600
Speaker 2: written and

30
00:01:21,600 --> 00:01:24,520
Speaker 1: not at all in the tone that she was used to.

31
00:01:25,520 --> 00:01:32,080
Speaker 2: This fear was confirmed when she told

32
00:01:32,200 --> 00:01:35,160
Speaker 2: him that she was feeling tired and frustrated, and

33
00:01:35,200 --> 00:01:38,880
Speaker 2: his response was a bullet point list of things she

34
00:01:38,920 --> 00:01:43,920
Speaker 2: could do to improve her mental health: staying hydrated, removing

35
00:01:44,000 --> 00:01:46,200
Speaker 2: stress from her life, and staying positive.

36
00:01:47,160 --> 00:01:50,520
Speaker 5: People change. Maybe he'd gone off, taken a night class.

37
00:01:50,560 --> 00:01:51,560
Speaker 1: He really cares.

38
00:01:51,600 --> 00:01:55,120
Speaker 3: He wanted to be more focused on the relationship and detailed.

39
00:01:55,400 --> 00:01:58,600
Speaker 2: The woman claimed, I felt betrayed, like he had forgotten

40
00:01:58,600 --> 00:02:00,960
Speaker 2: about us. I liked when he was a man of

41
00:02:01,040 --> 00:02:05,320
Speaker 2: few words. Every time I receive these texts, I feel revolted.

42
00:02:05,360 --> 00:02:07,840
Speaker 2: And this is happening every single day. I feel this

43
00:02:08,040 --> 00:02:11,480
Speaker 2: uncanny valley vibe in all of them. These words are

44
00:02:11,520 --> 00:02:14,280
Speaker 2: all affected and so unnatural that it feels like

45
00:02:14,280 --> 00:02:17,120
Speaker 2: he's mocking the realness and the messiness of true

46
00:02:17,200 --> 00:02:20,720
Speaker 2: human connection. Wow. God, sounds like it's written by

47
00:02:20,800 --> 00:02:21,840
Speaker 2: AI. Yeah.

48
00:02:21,680 --> 00:02:22,080
Speaker 4: It does.

49
00:02:22,200 --> 00:02:24,680
Speaker 3: I mean, there's so many dangers out there when it

50
00:02:24,680 --> 00:02:27,760
Speaker 3: comes to AI in relationships and communication back and forth. What's

51
00:02:27,840 --> 00:02:30,880
Speaker 3: interesting is the amount of men out there that are

52
00:02:30,880 --> 00:02:33,440
Speaker 3: happy to be in relationships and they know that it's

53
00:02:33,480 --> 00:02:36,640
Speaker 3: a bot they're talking to, but they like the regularity

54
00:02:36,639 --> 00:02:38,760
Speaker 3: of a check in by somebody, because they're lonely.

55
00:02:38,840 --> 00:02:40,760
Speaker 1: So, what do you mean, someone checks in with them?

56
00:02:40,919 --> 00:02:43,200
Speaker 3: It's the bot checking in. What do you mean? If

57
00:02:43,280 --> 00:02:46,280
Speaker 3: I can... I can... I can what? What, the bot?

58
00:02:46,960 --> 00:02:49,600
Speaker 1: So I can have... what? What? What?

59
00:02:50,040 --> 00:02:51,119
Speaker 4: Go on, let me find them.

60
00:02:51,520 --> 00:02:55,160
Speaker 3: So I can have a girlfriend that I subscribe to. Okay,

61
00:02:55,520 --> 00:02:57,160
Speaker 3: so in the morning when I wake up, she might

62
00:02:57,240 --> 00:02:59,239
Speaker 3: message saying, hi, babe, how are you feeling today?

63
00:03:00,200 --> 00:03:02,640
Speaker 4: And I write back and she goes, that's great, have

64
00:03:02,680 --> 00:03:03,360
Speaker 4: an awesome morning.

65
00:03:03,360 --> 00:03:04,280
Speaker 1: I'll check in with you later.
66
00:03:04,440 --> 00:03:07,840
Speaker 4: Love you. And these are just generated. And the

67
00:03:07,760 --> 00:03:09,720
Speaker 3: more that you give it, it feeds back, because it's

68
00:03:09,919 --> 00:03:12,280
Speaker 3: continuing to feed back on the information you've given it.

69
00:03:12,320 --> 00:03:16,040
Speaker 5: Is this bothering anybody else? It actually makes me feel quite

70
00:03:16,120 --> 00:03:17,520
Speaker 5: upset, yeah, and

71
00:03:17,520 --> 00:03:20,440
Speaker 2: frightened, concerned. Can I... I want to throw this

72
00:03:20,520 --> 00:03:22,760
Speaker 2: out there. Thirteen twenty four ten. Have you busted

73
00:03:22,800 --> 00:03:26,799
Speaker 2: someone using ChatGPT or AI? Because it's not

74
00:03:26,960 --> 00:03:30,600
Speaker 2: just for school, you know, reports, things like that.

75
00:03:30,639 --> 00:03:31,960
Speaker 1: What about a birthday card?

76
00:03:32,040 --> 00:03:35,080
Speaker 5: People won't even write from their heart anymore.

77
00:03:34,760 --> 00:03:38,160
Speaker 1: Kate, listen to this. She's going to kill me if

78
00:03:38,320 --> 00:03:44,520
Speaker 1: she's listening to this. BJ... My mum just had a birthday.

79
00:03:45,400 --> 00:03:47,200
Speaker 1: And my mum and BJ

80
00:03:47,080 --> 00:03:49,680
Speaker 2: have become really, really close, because they go to op

81
00:03:49,680 --> 00:03:53,160
Speaker 2: shops together, and their joke is, what they say to

82
00:03:53,320 --> 00:03:56,160
Speaker 2: us is, hey, we're going to get a loaf of bread,

83
00:03:56,840 --> 00:03:59,640
Speaker 2: and we know that that's code for they're going to

84
00:03:59,680 --> 00:04:01,760
Speaker 2: the op shop to find... Yeah.

85
00:04:01,880 --> 00:04:04,119
Speaker 1: Right, mum just had her birthday.

86
00:04:04,480 --> 00:04:07,720
Speaker 2: And we in our family, we like to write on

87
00:04:07,880 --> 00:04:10,840
Speaker 2: cards and write beautiful words, and that's our gift to everyone.
88
00:04:10,920 --> 00:04:11,200
Speaker 1: Right.

89
00:04:12,080 --> 00:04:14,480
Speaker 2: So the boys did it, Hewie and Lenny did it,

90
00:04:14,600 --> 00:04:17,080
Speaker 2: I did it, and then BJ comes in and trumps

91
00:04:17,160 --> 00:04:17,520
Speaker 2: us all.

92
00:04:18,200 --> 00:04:19,160
Speaker 1: And I'm not joking.

93
00:04:19,360 --> 00:04:23,039
Speaker 2: My mother started crying her eyes out, Kate, because she

94
00:04:23,160 --> 00:04:26,480
Speaker 2: starts reading this out to everyone. So remember the fresh

95
00:04:26,520 --> 00:04:30,560
Speaker 2: loaf of bread, that's the code. Mum opens

96
00:04:30,600 --> 00:04:34,720
Speaker 2: this card and goes, to Claire, a loaf fresh baked,

97
00:04:34,520 --> 00:04:37,560
Speaker 1: warm and sweet, like the heart you bring with every greet.

98
00:04:37,800 --> 00:04:41,160
Speaker 6: You welcome me not just with bread, but with laughter, stories,

99
00:04:41,240 --> 00:04:44,799
Speaker 6: words unsaid. In you I found a friend so true,

100
00:04:44,960 --> 00:04:48,920
Speaker 6: a bond rare, one of chosen few. From early mornings,

101
00:04:49,000 --> 00:04:52,880
Speaker 6: late night talks, to simple walks on quiet blocks. Here's

102
00:04:53,000 --> 00:04:55,920
Speaker 6: to us in this cozy place, to friendship, love and grace.

103
00:04:56,279 --> 00:04:59,320
Speaker 6: With every loaf, our bond grows tight. A mother in law,

104
00:04:59,440 --> 00:05:05,120
Speaker 6: my friend, my light. Mum starts crying her eyes out,

105
00:05:04,560 --> 00:05:06,680
Speaker 6: and I'm looking at ChatGPT.

106
00:05:07,120 --> 00:05:10,200
Speaker 3: You cheated. Yeah, you sucked tears out of my mother

107
00:05:10,560 --> 00:05:11,359
Speaker 3: using AI.

108
00:05:11,920 --> 00:05:17,560
Speaker 2: You made all our cards look so insignificant.

109
00:05:18,200 --> 00:05:19,040
Speaker 4: Because we cheated?
110
00:05:19,920 --> 00:05:24,320
Speaker 2: She didn't cry when she read out our cards. It's

111
00:05:24,360 --> 00:05:28,080
Speaker 2: not fair. It's not... it's not fair, Kate. No, it's

112
00:05:28,160 --> 00:05:28,640
Speaker 2: not fair.

113
00:05:29,800 --> 00:05:30,680
Speaker 4: Is better than you?

114
00:05:30,920 --> 00:05:31,680
Speaker 1: Well, that I know.

115
00:05:32,080 --> 00:05:34,520
Speaker 5: Sometimes it's... it's okay to get a bit of help,

116
00:05:34,680 --> 00:05:37,520
Speaker 5: but nothing's heartfelt anymore.

117
00:05:37,800 --> 00:05:38,200
Speaker 1: It's not.

118
00:05:38,520 --> 00:05:41,320
Speaker 3: Depends what you're plugging into ChatGPT. You're obviously putting

119
00:05:41,320 --> 00:05:42,000
Speaker 3: the wrong things in.

120
00:05:42,600 --> 00:05:44,680
Speaker 4: Why? Well, if you want it to be heartfelt, you've

121
00:05:44,720 --> 00:05:45,719
Speaker 4: got to put the right things in.

122
00:05:46,560 --> 00:05:48,839
Speaker 1: You say heartfelt, heartfelt?

123
00:05:48,520 --> 00:05:50,680
Speaker 3: Make it emotional, from you. You're right, make it emotional.

124
00:05:50,760 --> 00:05:54,840
Speaker 3: Make it heartfelt. Include long walks and quiet whispers. Can

125
00:05:54,880 --> 00:05:58,760
Speaker 3: I have twenty lines with six words in each? It

126
00:05:58,760 --> 00:05:59,360
Speaker 3: needs to rhyme.

127
00:05:59,400 --> 00:06:01,080
Speaker 5: I don't like any of it.

128
00:06:01,080 --> 00:06:02,480
Speaker 1: It's... oh no, it's cheating,

129
00:06:02,520 --> 00:06:05,320
Speaker 2: isn't it? Thirteen twenty four ten. If you've busted someone doing

130
00:06:05,360 --> 00:06:08,240
Speaker 2: this, and does it... have you been let down?

131
00:06:08,880 --> 00:06:10,960
Speaker 5: Or can I also ask for someone to call if

132
00:06:10,960 --> 00:06:13,320
Speaker 5: they have one of these bots messaging them in the morning?

133
00:06:14,640 --> 00:06:15,560
Speaker 5: I want to know. You're
134
00:06:15,440 --> 00:06:17,400
Speaker 4: married to a bot? Are you in a relationship with

135
00:06:17,400 --> 00:06:17,680
Speaker 4: a bot?

136
00:06:17,720 --> 00:06:20,720
Speaker 1: A bot? I... I must admit, I've got to look after

137
00:06:20,800 --> 00:06:23,599
Speaker 1: my wife here. She did... she did set it up.

138
00:06:24,080 --> 00:06:26,160
Speaker 2: She did set it up with my mum, saying that

139
00:06:26,200 --> 00:06:29,120
Speaker 2: there's this thing that the girls and I do, and

140
00:06:29,160 --> 00:06:32,719
Speaker 2: we go to this website and it produces a poem

141
00:06:32,760 --> 00:06:33,080
Speaker 2: for us.

142
00:06:33,160 --> 00:06:37,120
Speaker 1: She set it up. You're feeling bad, hey, Kate. You've...

143
00:06:37,320 --> 00:06:38,960
Speaker 1: you've been really uncomfortable about it.

144
00:06:40,200 --> 00:06:43,080
Speaker 5: And I don't know whether I'm just having an emotional day,

145
00:06:43,920 --> 00:06:46,920
Speaker 5: but the idea, when we talk about AI, or even

146
00:06:46,960 --> 00:06:49,919
Speaker 5: when you just read out something about me that was

147
00:06:49,920 --> 00:06:53,960
Speaker 5: a computer deciding where my future is heading based on

148
00:06:54,000 --> 00:06:58,200
Speaker 5: the information it already has, right, it makes me... it

149
00:06:58,240 --> 00:07:03,080
Speaker 5: makes me feel really uncomfortable and kind of panicked or something.

150
00:07:03,160 --> 00:07:07,159
Speaker 5: It's like people know things about you that they don't

151
00:07:07,240 --> 00:07:11,440
Speaker 5: have to go and research. Am I making sense? It just makes

152
00:07:11,480 --> 00:07:12,920
Speaker 5: my body feel uneasy.

153
00:07:13,040 --> 00:07:15,880
Speaker 3: I put into ChatGPT, where will Kate Ritchie be in ten years' time?
154
00:07:16,480 --> 00:07:20,000
Speaker 3: Given her experience in both entertainment and media industries, Ritchie

155
00:07:20,080 --> 00:07:22,120
Speaker 3: might explore more behind the scenes work, such as

156
00:07:22,160 --> 00:07:25,520
Speaker 3: producing or directing, especially if she wants a schedule

157
00:07:25,920 --> 00:07:27,920
Speaker 3: that aligns with her family life.

158
00:07:28,440 --> 00:07:29,320
Speaker 1: Other thing.

159
00:07:30,360 --> 00:07:32,760
Speaker 3: Then it said, Kate Ritchie is famous on radio for

160
00:07:32,800 --> 00:07:34,280
Speaker 3: telling stories that go for far too long.

161
00:07:35,800 --> 00:07:36,840
Speaker 1: That's not what it says.

162
00:07:37,120 --> 00:07:41,200
Speaker 4: You made that up. GPT's very smart. AI knows you very well.

163
00:07:41,600 --> 00:07:44,640
Speaker 2: Leanna's given us a call from Manly Vale about a friend who got caught

164
00:07:44,640 --> 00:07:45,679
Speaker 2: out using AI.

165
00:07:45,760 --> 00:07:47,600
Speaker 1: Leanna. So I have

166
00:07:47,640 --> 00:07:50,000
Speaker 7: a friend who applied for a job and did her

167
00:07:50,040 --> 00:07:55,360
Speaker 7: whole resume and cover letter using AI, yeah, to really

168
00:07:55,360 --> 00:07:58,760
Speaker 7: talk herself up, and now she's got the job and, like,

169
00:07:58,880 --> 00:08:01,160
Speaker 7: cannot perform up to the standards.

170
00:08:02,400 --> 00:08:07,240
Speaker 5: Yeah, because you can't, like... without the help of the computers,

171
00:08:08,480 --> 00:08:11,480
Speaker 1: you don't have those skills. They're not your skills.

172
00:08:12,000 --> 00:08:16,040
Speaker 3: Yeah, but Leanna, some of the wording might just be

173
00:08:16,120 --> 00:08:19,040
Speaker 3: what she's looking for. Like, she's not lying about what

174
00:08:19,080 --> 00:08:22,240
Speaker 3: she's done, she's just put in her details, and

175
00:08:22,320 --> 00:08:25,520
Speaker 3: AI has cleverly crafted it.
176
00:08:25,880 --> 00:08:29,000
Speaker 7: So you need to write reports and whatnot. You

177
00:08:29,040 --> 00:08:32,400
Speaker 7: can't just be constantly using AI. No,

178
00:08:32,400 --> 00:08:34,120
Speaker 1: you can't put it in a program.

179
00:08:34,200 --> 00:08:37,120
Speaker 5: Now you're indicating to somebody that you have to.

180
00:08:38,440 --> 00:08:39,600
Speaker 1: I don't like any of this.

181
00:08:40,320 --> 00:08:41,720
Speaker 3: Because you don't know how to turn on your TV

182
00:08:41,840 --> 00:08:42,160
Speaker 3: at home.

183
00:08:42,640 --> 00:08:44,480
Speaker 1: And I'm the happiest person for us.

184
00:08:44,520 --> 00:08:46,600
Speaker 3: This stuff is absolute panic town for you.

185
00:08:46,880 --> 00:08:49,320
Speaker 2: I like it. What if a boss does come to

186
00:08:49,400 --> 00:08:53,040
Speaker 2: Leanna's friend and says, look, can we solve this right now,

187
00:08:53,200 --> 00:08:55,720
Speaker 2: face to face? She has to get to a computer and

188
00:08:57,240 --> 00:08:58,319
Speaker 1: sort it all out, that one.

189
00:08:58,360 --> 00:08:59,559
Speaker 4: Don't look over my shoulder.

190
00:08:59,679 --> 00:09:00,000
Speaker 2: You can't.

191
00:09:00,000 --> 00:09:02,840
Speaker 1: I don't have conversations with people without using computers.

192
00:09:02,920 --> 00:09:04,520
Speaker 1: Probably can't even set out a letter.

193
00:09:04,760 --> 00:09:06,240
Speaker 4: Can I tell you why Kate's really upset?

194
00:09:06,440 --> 00:09:09,280
Speaker 3: Why? Because she found a photo of a baby giraffe

195
00:09:09,320 --> 00:09:11,160
Speaker 3: standing on the back of another giraffe and thought it

196
00:09:11,200 --> 00:09:13,080
Speaker 3: was real, and she shared it, but it was...

197
00:09:14,200 --> 00:09:18,679
Speaker 5: I didn't share it. Why do I feel confused all

198
00:09:18,720 --> 00:09:19,040
Speaker 5: the time?
199
00:09:19,080 --> 00:09:20,600
Speaker 1: I don't know what's real anymore.

200
00:09:20,800 --> 00:09:24,080
Speaker 4: How? How is a snake eating a hippopotamus?

201
00:09:24,120 --> 00:09:26,240
Speaker 5: I did not say that one. I'm not an idiot.

202
00:09:27,000 --> 00:09:29,160
Speaker 4: Giraffes don't stand on their backs. I've never seen this

203
00:09:29,240 --> 00:09:30,079
Speaker 4: before, with

204
00:09:30,120 --> 00:09:33,920
Speaker 5: a baby giraffe cuddling a mummy giraffe on its back

205
00:09:34,880 --> 00:09:36,679
Speaker 5: like a sloth.

206
00:09:36,720 --> 00:09:39,880
Speaker 3: Fitzy and Wippa with Kate Ritchie is a Nova podcast. For more

207
00:09:39,920 --> 00:09:43,520
Speaker 3: shows like this, download the Nova Player via the App Store

208
00:09:43,640 --> 00:09:44,640
Speaker 3: or Google Play.