Speaker 1: Have you noticed an explosion of long, AI-generated documents that look very polished but are actually full of rubbish? You're not imagining it. AI slop is on the rise: content that's created quickly, looks impressive, but has no real critical thought behind it. Today, I'm joined by Inventium's AI expert, Neo Applin. Neo is going to unpack why AI slop is becoming so common, particularly in workplaces, the risks that it poses for organizations, and most importantly, what you can do to make sure your own work stands out as being thoughtful and valuable in an AI-saturated world.

Speaker 1: Welcome to How I Work, a show about habits, rituals, and strategies for optimizing your day. I'm your host, Doctor Amantha Imber. Before we get into today's episode: I have been geeking out over the research on AI adoption in the workplace, and the data is super clear. The people who master GenAI now aren't just getting ahead, they are creating entirely new levels of productivity that previously seemed impossible. Because in a nutshell, what I'm seeing is there are two types of professionals.
There are those that are very fluent in AI, and there are those that are just dabbling, pasting prompts from LinkedIn and thinking that they're mastering AI. So if you are in the latter group, this is what I recommend. Inventium's GenAI Productivity Upgrade is a way to make sure you are in that first group who are fully fluent in AI. It is a twelve-week course designed to move you from dabbler to productivity machine. No fluff, just practical strategies that will pay off from week one, saving you at least ten hours every single week. You'll learn how to automate the grunt work, use AI as your second brain to excel at your job, learn how to create AI agents that win you back hours every time you use them, and so much more. Whether you're a complete beginner or already using AI most days, we've got you covered, starting with prompting fundamentals and going all the way through to advanced automations and agentic AI. We kick off on October fifteen and spots are limited.
Visit inventium dot com dot au forward slash genai hyphen cohort to secure your place now, and that link is also in the show notes. You've literally got nothing to lose, because there's a seven-day money-back guarantee, so head over to that link, it's in the show notes, and read more about the program today.

Speaker 1: Neo, I've heard you use the term AI slop before, and I guess it is kind of becoming an industry term, if you like. Can you explain what AI slop actually is?

Speaker 2: The term is being widely used for AI-generated stuff of low value, and it can be anything from lots of words, like documents and things like that, right through to AI videos. You've probably seen a lot of AI videos on your social feeds; that is also AI slop. But effectively it's AI material that is not valuable, that seems to be created easily and with a lot of volume, and more and more volume coming up soon. So yeah, it's mostly about low-value, AI-created stuff.

Speaker 1: And why is AI slop becoming so common? I feel like I'm experiencing it, like, every single day of my workday.

Speaker 2: Yeah, we're seeing it everywhere.
We're seeing it in our social feeds, we're seeing it on LinkedIn, where there's an awful lot of people creating lots of LinkedIn posts now, and we're seeing it in our workplaces with documents. And the reason is, it's easy now. We used to think about a productive worker as someone who could create things: so Neo's awesome because he can create two strategy papers, and he's done four analytics things, and three documents and four procedures. But here's the thing. These days with AI, I can create those things in five minutes. You know, three dot points, create me a document, bam, it just does it. What this does, though, is it creates content overload and lots of documents that don't really move the dial, that are just documents or artifacts or pictures for the sake of them. The problem is people then are going to get too many documents to read, and they won't be accurate, they won't be targeted, they won't be exactly what you need, what the business needs, what the customer needs, and you won't know which pieces of these documents are actually important versus AI boilerplate.
And so we will get buried in these documents, because people will want to feel like they're productive and producing lots of things, and so we will then get very, very sick of this AI slop.

Speaker 1: To me, just the time wastage component is so frustrating. Like, I had a conversation with someone, this is like a couple of months ago, about a potential partnership opportunity. And this person then said, okay, great, I'll, you know, put together some thoughts for you to review. And I received this ten-page document that was just obviously generated with GenAI. But the problem was, no critical human thought had gone into it, which then means the ball is in my court to review this AI slop. And then I'm feeling really frustrated because there go several hours of my time, and ah, it just made me so frustrated. So my question to you is, how do we actually fix this problem?
Speaker 2: It's going to be tough, because I can create a document so easily. Maybe this person just said three or four dot points and "now flesh out the document, please," and then you've got a ten-page document that you've got to plow through. Now, you're going to get sick of this, or people will get sick of this. We're going to get lots of these ten-page documents and slide decks and all those kinds of things, so we're just going to go, "AI, summarize that, will you?" And when you summarize it, do you think you're going to get the same three or four dot points that they put in to create them with AI? We're not. So we're going to get the AI equivalent of, I don't know if it's politically correct or not, but Chinese whispers. You know that thing where, as kids, you'd say, you know, "Neo likes to talk on the telephone," and then it comes out at the other end as "Bob likes to eat elephants" or something. You know, so it's going to be that. But with AI, because I'm going to be creating documents easily with my three or four dot points, you're going to get inundated.
You'll summarize them, but the points will be lost. And so how do we stop this? Well, stopping document creation is not the answer. I think we still need artifacts; we still need things to be created. But how do we then make sure that people know what the important points are? And I think within workplaces it's something to be discussed. Like, the first thing to be discussed is, hey, if we're not careful, we are going to be inundated with slop. So therefore, let's make sure that targeted documents are much, much more valuable than AI-created stuff. So how do we together have agreements on creating valuable documents? Maybe that's the first step. The second is, when you create documents, how do we call out to humans and AI what the important points are? I like the concept of, if you've seen a Dummies guide or an Idiot's Guide or CliffsNotes, they've got these little callouts, sometimes at the start of certain paragraphs or sections: here are the important points.
So maybe we need to do things like that, so that people and AI can pull out the important points, and then if they want to go deeper they can. But ultimately it's a discussion for each workplace about what our norms are. As for society, I think there's going to be a difficult discussion to have. So maybe we need to be feeling okay with the, "Hey, I got this ten-page document from you. Can you tell me what's really important out of this? Because otherwise, I'm just going to summarize this, because it doesn't feel like it lands." And maybe we feel okay saying that, rather than feeling awkward, at the moment, saying, "I think AI generated all your stuff." You know, maybe we need to be okay having that discussion.

Speaker 1: So if I'm a leader in an organization, what should I be doing differently? Do I need to be thinking about, like, policies behind how people create content? Or, like, what do I do?

Speaker 2: It's going to be tough. The first thing is, this is like a change management challenge we've all got. This is a new tool everyone's got. What do we do?
What are the norms? How do we deal with this? Leaders obviously need to lead from the front. So when you're creating stuff, leader, don't create AI slop, because everyone in your team will then create AI slop, thinking, well, Neo and Amantha did it, so we can do that as well. So yeah, you've got to lead by example. The other is, let's have that discussion with your teams now, while we've got these tools and we're starting to learn what the norms are, what good looks like, and what nice, polite behaviors in the workplace are. Let's start to have those discussions now. We didn't need to have those discussions with other AI, like Word and spell check and grammar check and things like that; they didn't seem to create as many problems as this will. But yeah, have those discussions about what good looks like. If we're going to create a document, how do we then call out to people and AI what the important points are?
If we create documents, they need to be targeted, as opposed to volume for volume's sake. And if they're not targeted, how do we as a team ask questions politely? Not "did you just get AI to create this slop?" but "Hey, look, I'm not quite sure this one hits the mark." So it's going to be a discussion, and I think that now is the time to have the discussion, otherwise we're going to be buried, and then people will be feeling awkward if we've got to change the norms later.

Speaker 1: Okay. So out of all that, like, what are the key things that you would love people to start doing or stop doing, based on this AI slop problem?

Speaker 2: First off, we need to be kind to each other. Don't just say, here's a ten-page document that you've created in three seconds, and expect people to go through it. So let's have an internal reframe where more content doesn't mean good work. Instead, targeted content means great work. How do we then call out parts from that targeted content to show that it is actually targeted?
So that's where I like the idea of the Dummies guide, the CliffsNotes kind of thing, where you've got main takeaways from each section, maybe in a yellow box at the top of the section so it's easily found, and things like that, so that people and AI can pull out those main points immediately, but if they want to go deeper they can. And let's figure out what that needs to be for our workforce. So yeah, let's be kind to each other and not create a whole bunch of documents for the sake of it. Let's be clear about what's important in those documents. And when you do create a document with AI, I think the last thing is, how do we show people in our team how that was created? What I mean by that is, I'd like to borrow something from the Australian kangaroo, Made in Australia. You know, when you see that green and yellow kangaroo, you know it's been made in Australia. Or if you're in a supermarket and you look at food, they've got that bar chart on how much of that food is made in Australia.
How do we show our workmates how much AI was used in this? So if it was ninety-nine percent generated with AI, how do we show that to our colleagues, as opposed to, look, this is all me, but I got AI to do some critique on it? So that's probably a last, more extension one. But in reality, let's just be kind to each other, create targeted content, and call out what the most important parts of each section of the document are.

Speaker 1: Thank you so much, Neo. I do hope that even if just one AI slop document does not get produced because of this chat, then we've done our job. So thank you.

Speaker 2: You're welcome. Or the other one is, if people have one conversation about AI slop and it's not an awkward conversation, and they're all on the same page, that's another thing I'd like as well.

Speaker 1: My biggest takeout from this chat with Neo is that we need to agree that volume of content, like producing lots of reports, for example, is no longer a measure of performance.
We need to get better at rewarding and recognizing thoughtful and targeted communication. So if you found this conversation useful, please share it with a colleague who's maybe also wrestling with AI slop in their workplace. And if you're looking to take your AI skills to the next level, Inventium's GenAI Productivity Upgrade is kicking off on October fifteen, and there is a link to more info and bookings in the show notes. If you like today's show, make sure you hit follow on your podcast app to be alerted when new episodes drop. How I Work was recorded on the traditional land of the Wurundjeri people, part of the Kulin Nation. A big thank you to Martin Nimber for doing the sound mix.