Speaker 1: From the newsroom at news.com.au, good day there. I'm Andrew Bucklow, and I reckon there's a fair few Aussies who are feeling pretty relieved today following the Reserve Bank of Australia's decision to cut interest rates by twenty-five basis points. There would have been a cheer from homeowners there. If you've got a mortgage and you want to know how much you're going to save each month, head to news.com.au. We've got a calculator tool that will give you that answer. As for what's making news today, well, we've just come out of our daily editorial meeting. That's where all the journos get together and pitch ideas. One of our team members, a fellow named Jai, was talking about some recent comments made by US Vice President JD Vance about AI.

Speaker 2: The Trump administration will maintain a pro-worker growth path for AI so it can be a potent tool for job creation in the United States. AI, I really believe, will facilitate and make people more productive. It is not going to replace human beings. It will never replace human beings.
Speaker 2: And I think too many of the leaders in the AI industry, when they talk about this fear of replacing workers, really miss the point. AI, we believe, is going to make us more productive, more prosperous, and more free.

Speaker 1: This led to a big discussion about AI, and Jai mentioned in that meeting he'd love for us to do a podcast about how to utilise the technology. Like: here are five really common, useful, broad ways that an everyday person could and should be using it. I actually thought that was a cracking idea from Jai, so that's exactly what we're going to do in this episode. If you want to know what you could be using AI for, stick around. Trevor Long is the editor of EFTM.com. He's a tech expert who regularly appears on TV and radio around the country. He's a great friend of this podcast. Trevor, it's great to have you on.

Speaker 3: Bucky, great to be with you, mate.
Speaker 1: Okay, we've just heard some audio of JD Vance spruiking AI and saying that the US government believes it could be a good thing for society and end up actually creating jobs. What do you think? Do you think the benefits of AI will outweigh the risks?

Speaker 4: I think, in the fullness of time, we will see that to be the case, yes, because there's always going to be risks with any new technology. I mean, criminals of the worst kind, terrorists, are using smartphones in a bad way, but the rest of us are getting great benefits from them. So, you know, we could be drawn into the negativity of it, but we have to look at the positives: making it easier to work, the fun things you do with it. And you go, actually, it's pretty amazing. It's pretty amazing what we do with it. And what JD Vance is doing is, like Trump, he wants the US to be the heart and soul of this. He doesn't want it to be China; he wants the US to be known for this. Which is, you know, an investment strategy as well. They want people investing in American AI companies.
Speaker 3: That's the strategy.

Speaker 1: I think we've just been hearing about AI for so long now, but some people listening probably have no idea how to access AI and how they can actually use it. So where do people go to use AI? And is it free?

Speaker 4: Great question, and "sometimes" is the answer to the free question. Basically, there's three big AIs. There's many more, but we could get bogged down. The AI I'm talking about here is the chat style: the one where you ask a question, you get an answer, and it's a bit more verbose than googling something, right? That's what we're talking about here. So there's three big companies. There's Microsoft's Copilot, which might be built into your laptop. If you've got a laptop from the last year or so, it's probably got Copilot on it; you can just press that button and ask a question. Then there's Google's Gemini. If you live and breathe in the Gmail space, and a lot of workplaces do, it might have Gemini in it, so just look up Gemini.

Speaker 3: You never know, it might be there.
Speaker 4: And the other one, the big one, is ChatGPT. All of them have a free level, but with ChatGPT, for example, it kind of times out.

Speaker 3: You know: you've asked me too much, you've got to pay now.

Speaker 4: And that's essentially... like, I wasn't logged in once, because I've got an account and I pay for it, and I was doing so much on it that it went, mate, upgrade.

Speaker 3: Oh, sorry, I'll log in, you know.

Speaker 4: So basically there's plenty of AI that's free to muck around with, but if you're going to get serious about it, paying a, you know, fifteen-to-twenty-dollar-a-month subscription just gives you the ability to have a record of all your chats. And also, here's the mind-blowing thing: it learns you. So I had thirty minutes the other day and I went on to ChatGPT, and I said, do you know what, ask me some questions so that you can learn about me.

Speaker 3: And we went back and forth.
Speaker 4: It asked me just random questions, and it was like, you know that game where you ask twenty questions and work out the answer, animal or whatever someone's thinking? It was kind of like that. It was learning more about me. And so now my ChatGPT essentially knows broadly who I am, and so it can speak to me in slightly more familiar terms. So I do recommend that, because it makes it a little bit more of a useful exercise.

Speaker 1: Let's say people do actually sign up and get a membership to ChatGPT. The big question is, what can they do with it? There's an endless amount of uses, but you're the expert, Trevor, so I would like to ask you: can you run us through five things we could all be using AI for in our everyday lives?

Speaker 3: So think firstly about work.

Speaker 4: For most people, in their work there's something that AI could benefit. And by the way, this could also now be on your smartphone.
We're talking Galaxy AI and Apple Intelligence can do the things I'm about to mention: drafting emails, proofreading emails, rewriting things. So you've written a long, long email, it might be to a client or a colleague or your boss, and you've gone, you know what, I don't know... should I send it? Highlight the text, put it into an AI, or on your Copilot PC or with Apple Intelligence, and it'll do it for you. Say, can you make this more polite? Or can you make this a bit more businesslike? Or can you make this readable to a twelve-year-old? You know, you can change the tone of a message. We used to have spelling and grammar check underlining the words. Now it's like, okay, I'm happy with the spelling and grammar, but what about the overall tone and vibe? You know, if you're known for your harsh and firm emails, maybe use AI to take the edge off, or maybe you're known for the opposite and you need AI to put the edge on.
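For listeners who'd rather script this tone-rewriting trick than paste into a chat window, here's a minimal sketch. It assumes the OpenAI Python SDK (`pip install openai`) and an `OPENAI_API_KEY` environment variable; the model name and the `rewrite_prompt` helper are illustrative choices, not something from the episode.

```python
def rewrite_prompt(draft: str, tone: str) -> str:
    """Build the instruction: same email, different tone, facts unchanged."""
    return (
        f"Rewrite the email below so that it is {tone}. "
        "Keep the meaning and every factual detail unchanged.\n\n"
        f"{draft}"
    )

def rewrite_email(draft: str, tone: str = "more polite") -> str:
    # Needs `pip install openai` and OPENAI_API_KEY set in the environment.
    from openai import OpenAI
    client = OpenAI()
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative; any chat model works
        messages=[{"role": "user", "content": rewrite_prompt(draft, tone)}],
    )
    return resp.choices[0].message.content

if __name__ == "__main__":
    print(rewrite_email("Send me the report today. It's late.", "a bit more businesslike"))
```

The same prompt works verbatim when typed into Copilot, Gemini, or ChatGPT directly; the script just makes it repeatable.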
Speaker 4: So I think at work, just play with it on everything, from writing captions for your social media through to coming up with fun marketing ideas. I would definitely do that. The second thing I'd say, also at work, is summaries. Now, you would do this too: as a journo, sometimes you get sent a government PDF. It's, you know, a ten-page document about something they've launched. Run it through AI and say, can you give me the top five things about this I should understand? Now, I'm not suggesting for a minute that you should not be reading the document. What I do is say, tell me where to look in the document, tell me what I want to look at. So I might take a two-hundred-page government document and say, can you point me to the five submissions that are most critical of this government policy? And then I can find those, read them myself, and do the work. So use AI to take a complex task and minimise it into your window of need, essentially, because that can be really hard sometimes.
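That "point me into the document" approach translates directly into a small script. This is a sketch under assumptions: it uses the `pypdf` and `openai` packages (both would need installing) plus an `OPENAI_API_KEY`, and the helper names and model are illustrative.

```python
def digest_prompt(document_text: str, n: int = 5) -> str:
    """Ask for pointers into the document, not a substitute for reading it."""
    return (
        f"List the top {n} things I should understand about the document below, "
        "and for each one say which section or page I should read in full.\n\n"
        f"{document_text}"
    )

def digest_pdf(path: str, n: int = 5) -> str:
    # pip install pypdf openai; needs OPENAI_API_KEY set in the environment.
    from pypdf import PdfReader
    from openai import OpenAI
    # Pull the raw text out of every page of the PDF.
    text = "\n".join(page.extract_text() or "" for page in PdfReader(path).pages)
    client = OpenAI()
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": digest_prompt(text, n)}],
    )
    return resp.choices[0].message.content
```

Note the prompt deliberately asks where to look rather than for a full summary, matching the point that the AI should direct your reading, not replace it.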
Speaker 1: Yeah, that's going to save people so much time. Not just journalists: people could have book reports to do, or so many long little tasks they can just whittle down with AI.

Speaker 4: Make your life easier, because that's going to make you more efficient and more valuable to your employer in that way. And I think that's where we need to understand that AI can help us. Now, the third one is education. Now, this might be grating to some people, but my thirteen-year-old was frustrated at his desk the other day doing a maths task. Now, I say to my kids I'm a genius, I came in the top one to two per cent in the state in maths. What I don't mention is that was "maths in society", the simple, simple stuff. I don't really know maths. So I took out my app, my ChatGPT app, and I said to my son, we're going to ask Barry. He said, who's Barry? I said, that's what I've just named my ChatGPT. Here's what I did.
I took a photo of what looked to me like a very complex formula and question, because I had no idea what all this, you know, Xs and Ys was. And I said, buddy, we're going to ask ChatGPT. And he said, isn't that cheating? I said, you don't know the answer, I don't know the answer, let's find out. Now, the critical thing here is ChatGPT didn't just say, here's the answer. ChatGPT worked through it and said, here's what I did, here's how it works. And I've got to tell you, I learned something doing that. And so I was able to, firstly, have a really important moment with my son, to have this conversation about this maths task, and then try and help him a little bit.

Speaker 3: And I think he learned from it just as much as I did.
Speaker 4: So I think learning and education is a brilliant use for AI, whether it's solving a simple task and learning what it was, or, hey, can you just give me the top five things about this book that I should reread and, you know, work on for my essays, because sometimes you've got to jog your memory.

Speaker 1: Great tips there. We've still got two to go. Don't go anywhere; we'll be back in a moment. I'm chatting to Trevor Long, a tech expert who is running us through the everyday uses for AI that we could all be taking advantage of every single day. Trevor, we've got two to go. Take it away.

Speaker 4: All right. Travel planning is such a great one, because you've got to remember that ChatGPT, Gemini, these things, they've read the internet. They haven't just gone, I'm going to read this little part; they've read the internet. So, I'm taking my family to America later in the year. I could say to ChatGPT, hey, I'm taking my family to Houston for two days, what should we see? And then it might list ten things.
Speaker 4: But then I can say, actually, my kids are twelve, thirteen, and eighteen, so can you make that more age-appropriate? It'll mould that. And then I could say, actually, two of my kids are coeliac, can you integrate some great gluten-free restaurant options into that? And it'll do that. And no, that might not be what we do on those days, but what a great way to get you started. Because the problem with Google is it's built around sponsorship, so whoever pays for the ads gets the best link. And it's also built around TripAdvisor-style reviews, and unless someone has done enough work on those reviews, it's very hard to find the right information. So I think planning generally, we could say travel planning or just planning generally, because you could use it for meal planning too, you know. But travel is a huge place where you just go, help me with an itinerary for this. Give it the parameters, you know, the dates and the rest, and it's unbelievable the knowledge you'll find that it has. And again, don't make it the be-all and end-all.
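Those follow-up refinements ("make it more age-appropriate", "add gluten-free options") work because the chat keeps the whole conversation as context. Scripted, that just means resending the accumulated message history with every request. A sketch, again assuming the OpenAI Python SDK and an `OPENAI_API_KEY`; the function names are illustrative.

```python
def add_user_turn(history: list, text: str) -> list:
    """Append a user message; the growing history is what makes follow-ups work."""
    return history + [{"role": "user", "content": text}]

def chat_once(client, history: list):
    # Send the whole history so the model sees every earlier refinement.
    resp = client.chat.completions.create(model="gpt-4o-mini", messages=history)
    reply = resp.choices[0].message.content
    return history + [{"role": "assistant", "content": reply}], reply

if __name__ == "__main__":
    from openai import OpenAI  # pip install openai; needs OPENAI_API_KEY
    client = OpenAI()
    history = add_user_turn([], "I'm taking my family to Houston for two days. What should we see?")
    history, plan = chat_once(client, history)
    history = add_user_turn(history, "My kids are 12, 13 and 18 - make that more age-appropriate.")
    history, plan = chat_once(client, history)
    history = add_user_turn(history, "Two of my kids are coeliac. Add gluten-free restaurant options.")
    history, plan = chat_once(client, history)
    print(plan)
```

Each call resends the full history, which is exactly what the chat apps do behind the scenes when you keep typing into the same conversation.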
But if there's five of you around the table, make it the sixth. That's the way I look at AI.

Speaker 1: Great suggestion there. I'm going overseas soon as well, so I'll definitely be doing that. What's the last one, Trevor Long?

Speaker 4: I think we've got to have fun with it. So this is where you've got to remember that you can get an emotion out of these things. So, Gemini, ChatGPT, it doesn't matter: come up with fun things. So you might be doing a wedding speech? Get it to help you. If you need jokes on a particular topic, get it to help you. Do you know what, if you and your kids have got a fun little character in your lives, you know, not Milko from Home and Away, but something in your life, get it to write a bedtime story for you. Get it to write a fun poem for the family. You know, use it for fun, because it can have an emotion in that sense.
A lot of what you find, as an individual who doesn't do this a lot, is: man, I don't know where to start. Well, AI can take the edge off the how-to-start bit. Say, I'm writing a speech for a work colleague I've known for ten years, but maybe I don't know them that well, I don't know their family at all. Help me with a starting point; where should I start with this? And it'll just give you some ideas. And then remember, it's a two-way conversation. And by the way, with ChatGPT anyway, you can have a literal conversation with your AI, just talking, and it is so real. It's awesome. It's like having another... I work in an office all alone; I can just press a button on ChatGPT and we just have a chat about a topic.

Speaker 3: It's very cool.

Speaker 1: I have a mate who goes on a trip every six months with his dad, and he likes to write a poem about the trip so he has something to remember it by.
And lately he's been using AI to kind of talk about the funny jokes, the things they did, and it helps him craft the poem and just makes it stand out.

Speaker 4: By the way, the other fun thing, especially with ChatGPT and Gemini, and I know they all do it, Copilot will too, is generating an image. So if you've done a poem, or you've reflected on a trip, say, generate an image of... and just describe the image, and it will do that. It's not going to create a photorealistic picture of you in that situation, but it might create a fun cover for the book, or a slide for the presentation, whatever it is. And that's just scratching the surface, because obviously in the future we've got audio and video creation, which is pretty amazing. But right now, just think of it like another member of your family or another member of your work team, and push its boundaries. Get it to do things for you.

Speaker 1: Trevor, these have been such good tips from you. Thank you so much for jumping on the podcast.
And if you want to see Trevor's website, go to EFTM.com. Trevor, great to hear from you.

Speaker 3: Cheers, mate. Anytime.

Speaker 1: In the spirit of today's episode, I thought I'd ask ChatGPT to help me wrap things up by writing a limerick to thank you all for listening. Here's what it came up with: "The headlines are done, that's a wrap. Hope we kept you all in the loop, not the gap. We'll be back real soon with more news to tune, so till then, stay sharp and don't let things flap." I'm not sure about the word "flap", but anyway, thank you all for listening. I'll catch you later. Follow us and subscribe to From the Newsroom wherever you get your podcasts.