1 00:00:00,080 --> 00:00:02,440 Speaker 1: I hate asking what people do for work. It's what 2 00:00:02,520 --> 00:00:04,800 Speaker 1: makes your heart happy. AI is such a good tool 3 00:00:04,920 --> 00:00:08,960 Speaker 1: from a disability and even just like an advocacy standpoint, 4 00:00:09,000 --> 00:00:11,319 Speaker 1: because if there's stuff that's missing that you can't have, 5 00:00:11,360 --> 00:00:12,879 Speaker 1: that can fill in some of the gaps for you. 6 00:00:12,960 --> 00:00:16,759 Speaker 2: Welcome to the Seize the Yay Podcast. Busy and happy 7 00:00:16,920 --> 00:00:19,840 Speaker 2: are not the same thing. We too rarely question what 8 00:00:20,000 --> 00:00:22,639 Speaker 2: makes the heart sing. We work, then we rest, but 9 00:00:22,880 --> 00:00:25,919 Speaker 2: rarely we play and often don't realize there's more than 10 00:00:26,000 --> 00:00:28,760 Speaker 2: one way. So this is a platform to hear and 11 00:00:28,880 --> 00:00:32,280 Speaker 2: explore the stories of those who found lives they adore, 12 00:00:32,479 --> 00:00:35,120 Speaker 2: the good, bad and ugly, the best and worst days. 13 00:00:35,360 --> 00:00:39,760 Speaker 2: We'll bare all the facets of seizing your yay. I'm 14 00:00:39,840 --> 00:00:43,120 Speaker 2: Sarah Davidson, aka Spoonful of Sarah, a lawyer turned 15 00:00:43,159 --> 00:00:45,559 Speaker 2: funtrepreneur who swapped the suits and heels to co 16 00:00:45,680 --> 00:00:48,919 Speaker 2: found Matcha Maiden and Matcha Milk Bar. Seize the Yay 17 00:00:49,200 --> 00:00:51,680 Speaker 2: is a series of conversations on finding a life you 18 00:00:51,800 --> 00:00:55,720 Speaker 2: love and exploring the self-doubt, challenge, joy and fulfillment 19 00:00:55,800 --> 00:01:02,360 Speaker 2: along the way.
One of my favorite things about 20 00:01:02,360 --> 00:01:05,000 Speaker 2: this show is the incidental opportunity it gives me to 21 00:01:05,080 --> 00:01:08,080 Speaker 2: learn about things I don't really understand from the people 22 00:01:08,120 --> 00:01:11,119 Speaker 2: who find their yay in making it their area of expertise. 23 00:01:11,640 --> 00:01:14,839 Speaker 2: I often refer to my selfish motivations in approaching particular 24 00:01:14,880 --> 00:01:17,600 Speaker 2: guests to pick their brains about things that fascinate me. 25 00:01:18,080 --> 00:01:21,720 Speaker 2: And an episode on AI has been brewing for some time. 26 00:01:22,360 --> 00:01:24,800 Speaker 2: I think many of you will resonate with AI seeping 27 00:01:24,840 --> 00:01:27,040 Speaker 2: more and more into your life, and in my case, 28 00:01:27,080 --> 00:01:29,400 Speaker 2: at least a little quicker than my understanding of it 29 00:01:29,440 --> 00:01:32,480 Speaker 2: has been able to develop. I'm surrounded by content on 30 00:01:32,520 --> 00:01:35,120 Speaker 2: how it can change your life and productivity for the better, 31 00:01:35,280 --> 00:01:38,319 Speaker 2: but also how it'll replace us all by tomorrow. So 32 00:01:38,520 --> 00:01:40,839 Speaker 2: I wanted a guest who not only understands it well, 33 00:01:40,920 --> 00:01:43,200 Speaker 2: but is known for being able to make that information 34 00:01:43,400 --> 00:01:48,440 Speaker 2: digestible to the everyday person. Enter Lauren Dennett, who I 35 00:01:48,600 --> 00:01:51,440 Speaker 2: was lucky to meet through my partnership with HP, who 36 00:01:51,520 --> 00:01:54,840 Speaker 2: has helped me better grasp and therefore harness the power 37 00:01:54,880 --> 00:01:58,960 Speaker 2: of HP's AI technologies in such a big way. 
Lauren 38 00:01:59,120 --> 00:02:02,800 Speaker 2: is the product expert for HP, among many other titles, 39 00:02:03,120 --> 00:02:05,040 Speaker 2: and I think you'll be able to hear within minutes 40 00:02:05,040 --> 00:02:07,920 Speaker 2: even of our chat just how passionate they are about 41 00:02:07,960 --> 00:02:10,640 Speaker 2: the power of AI and empowering all of us to 42 00:02:10,680 --> 00:02:13,560 Speaker 2: reap its benefits in our day-to-day lives, from 43 00:02:13,680 --> 00:02:18,400 Speaker 2: busting common misconceptions to naming key AI-powered tools that can 44 00:02:18,520 --> 00:02:21,280 Speaker 2: change your work or even your personal life. We 45 00:02:21,400 --> 00:02:25,720 Speaker 2: are so lucky to have Lauren here demystifying it all, weaved, 46 00:02:25,720 --> 00:02:29,320 Speaker 2: of course, around Lauren's own path to yay, including, as you 47 00:02:29,360 --> 00:02:33,119 Speaker 2: can hear from their pronouns, their non-binary identity, their 48 00:02:33,160 --> 00:02:37,639 Speaker 2: experience of inclusion, and their passionate advocacy in the workplace. 49 00:02:38,080 --> 00:02:40,919 Speaker 2: I hope you all enjoy and learn as much from 50 00:02:41,160 --> 00:02:46,440 Speaker 2: this episode as I did. Lauren, hello! Welcome to Seize the Yay. 51 00:02:46,919 --> 00:02:48,760 Speaker 1: Thank you so so much. It's so good to be here. 52 00:02:49,080 --> 00:02:50,520 Speaker 2: It's so weird. I feel like we know each other 53 00:02:50,560 --> 00:02:52,880 Speaker 2: really well because we've spoken a lot online, but this 54 00:02:52,960 --> 00:02:55,919 Speaker 2: is our first IRL interaction. It's like we're besties. 55 00:02:56,200 --> 00:02:58,680 Speaker 1: I think we are besties. So now I'm questioning our 56 00:02:58,720 --> 00:02:59,959 Speaker 1: relationship at this point. 57 00:03:00,760 --> 00:03:04,120 Speaker 2: I am not.
I am comfortable with the beautiful relationship 58 00:03:04,160 --> 00:03:06,560 Speaker 2: that we have, and I'm really excited to have you here. 59 00:03:07,080 --> 00:03:10,079 Speaker 2: We have already gone very rogue before we started recording, 60 00:03:10,240 --> 00:03:14,400 Speaker 2: so yeah, well, we'll distill all the wisdom that exists 61 00:03:14,400 --> 00:03:16,200 Speaker 2: in your brain. Your brain is one, as I mentioned 62 00:03:16,240 --> 00:03:18,440 Speaker 2: in the intro, that I'm very excited to pick for 63 00:03:18,560 --> 00:03:23,200 Speaker 2: so many reasons, but a very big part of Seize the Yay 64 00:03:23,320 --> 00:03:28,359 Speaker 2: is the idea that the best pathways are never linear. And 65 00:03:28,600 --> 00:03:31,480 Speaker 2: even though now it is so clear within milliseconds 66 00:03:31,520 --> 00:03:33,359 Speaker 2: of speaking to you that you are so passionate about 67 00:03:33,400 --> 00:03:35,600 Speaker 2: what you do and that you're in exactly the place 68 00:03:35,600 --> 00:03:37,400 Speaker 2: that you're meant to be in this world, it's never 69 00:03:37,400 --> 00:03:39,160 Speaker 2: a straight line to get there, and there's often so 70 00:03:39,200 --> 00:03:41,680 Speaker 2: many chapters before that. So I want 71 00:03:41,680 --> 00:03:44,720 Speaker 2: to go back to young you, sort of the first 72 00:03:44,760 --> 00:03:46,800 Speaker 2: jobs that you had.
I think we all think about 73 00:03:46,800 --> 00:03:48,600 Speaker 2: those jobs that we want to have when we grow up, 74 00:03:48,680 --> 00:03:50,840 Speaker 2: and you know, I think one of the things we 75 00:03:50,880 --> 00:03:53,400 Speaker 2: often say is the job you'll have eventually might not 76 00:03:53,480 --> 00:03:56,200 Speaker 2: even exist at the time that, you know, you're having 77 00:03:56,200 --> 00:03:58,200 Speaker 2: these hopes and dreams, which in your case is definitely 78 00:03:58,240 --> 00:04:01,040 Speaker 2: the case, because AI is very, very new. So what 79 00:04:01,080 --> 00:04:02,640 Speaker 2: did you think you'd be when you were younger? 80 00:04:03,960 --> 00:04:05,720 Speaker 1: I was talking to my partner about this last night 81 00:04:05,760 --> 00:04:07,960 Speaker 1: because I was like, how do I summarize this in 82 00:04:08,000 --> 00:04:12,640 Speaker 1: a funny way? So basically, young me had this grand idea. 83 00:04:12,720 --> 00:04:14,880 Speaker 1: I wanted to be a creative. I wanted the chance 84 00:04:14,960 --> 00:04:16,880 Speaker 1: to sort of like put things on paper, and I 85 00:04:16,920 --> 00:04:21,880 Speaker 1: wanted to be an architect. However, I have aphantasia, 86 00:04:21,920 --> 00:04:24,239 Speaker 1: where I can't picture things. 87 00:04:24,880 --> 00:04:25,000 Speaker 2: What? 88 00:04:25,600 --> 00:04:29,200 Speaker 1: Yeah, so I recently found out. People, you're in a room, right? 89 00:04:29,240 --> 00:04:31,359 Speaker 1: So right now, I'm going to challenge you. Think about 90 00:04:31,480 --> 00:04:34,320 Speaker 1: a room. I want you to picture a ball in 91 00:04:34,320 --> 00:04:36,839 Speaker 1: that room. Okay, what does that ball look like? It 92 00:04:36,880 --> 00:04:39,719 Speaker 1: is a black squishy stress ball. 93 00:04:39,960 --> 00:04:43,840 Speaker 2: Okay, that has this logo on it that's on your microphone.
94 00:04:43,880 --> 00:04:46,200 Speaker 2: Because this is what we've just been speaking about before, 95 00:04:46,240 --> 00:04:48,880 Speaker 2: that we need a stress squishy ball to have something 96 00:04:48,920 --> 00:04:50,800 Speaker 2: to do with our hands. And that's what the ball 97 00:04:50,880 --> 00:04:53,080 Speaker 2: is in the room. Amazing, straight to my head, the picture. 98 00:04:53,120 --> 00:04:55,159 Speaker 2: What does the table look like? It is a black, 99 00:04:55,760 --> 00:05:00,640 Speaker 2: like, matte aluminium-leg table, and it matches the black 100 00:05:00,680 --> 00:05:01,680 Speaker 2: ball that's sitting in the middle of it. 101 00:05:01,760 --> 00:05:01,960 Speaker 1: Yeah. 102 00:05:01,960 --> 00:05:02,120 Speaker 3: Cool. 103 00:05:02,120 --> 00:05:04,400 Speaker 1: Do you want to know what I see? Nothing. Abyss. 104 00:05:04,760 --> 00:05:07,600 Speaker 1: Nothing. Stop it. So, as you can tell, as 105 00:05:07,640 --> 00:05:10,000 Speaker 1: a young person wanting to be an architect, I have 106 00:05:10,040 --> 00:05:13,720 Speaker 1: a complete disadvantage, and I think it's rude. So anyway... wait, 107 00:05:13,760 --> 00:05:16,320 Speaker 1: hold on, so you can't? I can't see anything, and 108 00:05:16,400 --> 00:05:18,200 Speaker 1: I didn't know that wasn't normal. So now when I 109 00:05:18,200 --> 00:05:20,360 Speaker 1: talk to people, it's like, you know, when you meet someone, 110 00:05:20,400 --> 00:05:22,080 Speaker 1: you're like, what do you do for work? And obviously 111 00:05:22,360 --> 00:05:24,520 Speaker 1: it's a really weird one because immediately you start judging 112 00:05:24,520 --> 00:05:26,760 Speaker 1: people and what they do. So I hate asking what 113 00:05:26,839 --> 00:05:29,119 Speaker 1: people do for work. It's "what makes your heart happy?" 114 00:05:29,680 --> 00:05:31,080 Speaker 1: But sometimes it's... 115 00:05:31,400 --> 00:05:32,000 Speaker 3: What do you see?
116 00:05:32,040 --> 00:05:32,840 Speaker 1: Do you see this ball? 117 00:05:33,200 --> 00:05:36,160 Speaker 3: What does this ball look like? What does the table look like? Actually, 118 00:05:36,320 --> 00:05:37,040 Speaker 3: what are we in? 119 00:05:37,279 --> 00:05:39,599 Speaker 1: Is there noise canceling everywhere? Like, is it bright? 120 00:05:39,720 --> 00:05:42,640 Speaker 3: Is it soundproofed? Bring me to your room. 121 00:05:42,839 --> 00:05:46,520 Speaker 2: I don't have a room. See, Lauren, that is so interesting. 122 00:05:47,120 --> 00:05:51,120 Speaker 1: So long story short, baby Lauren could not, unfortunately, be 123 00:05:51,160 --> 00:05:52,400 Speaker 1: an architect, because it was 124 00:05:52,360 --> 00:05:53,600 Speaker 3: never going to work. Fair enough. 125 00:05:53,600 --> 00:05:55,960 Speaker 2: I mean, if you can't visualize things, unless you were 126 00:05:56,000 --> 00:05:58,920 Speaker 2: designing like a brutalist escape room that was just black, 127 00:05:59,040 --> 00:06:00,320 Speaker 2: in which case you'd be amazing. 128 00:06:00,320 --> 00:06:02,039 Speaker 1: I think I'd thrive in that, honestly. I think it really 129 00:06:02,080 --> 00:06:04,000 Speaker 1: would, just the aspect of watching people struggle. 130 00:06:05,720 --> 00:06:05,800 Speaker 2: Yeah. 131 00:06:06,240 --> 00:06:08,520 Speaker 1: So I've got the creativity and I've got the passion, 132 00:06:08,520 --> 00:06:10,719 Speaker 1: but the problem is I don't have the visualization. It 133 00:06:10,800 --> 00:06:13,800 Speaker 1: just doesn't exist in my head. So I can recall 134 00:06:13,839 --> 00:06:16,200 Speaker 1: things from memory, but I can't make it up on 135 00:06:16,200 --> 00:06:18,760 Speaker 1: the spot. It just doesn't exist. So yeah, past me 136 00:06:18,800 --> 00:06:21,559 Speaker 1: couldn't do that.
But as my career kind of grew, 137 00:06:21,680 --> 00:06:24,120 Speaker 1: like, I was one of those kids where I grew 138 00:06:24,160 --> 00:06:26,240 Speaker 1: up on the Mornington Peninsula, there wasn't a lot of 139 00:06:26,279 --> 00:06:29,400 Speaker 1: opportunity there, so I didn't really know what options there were, 140 00:06:29,600 --> 00:06:32,520 Speaker 1: and so, classic me, I went to TAFE, tried to 141 00:06:32,520 --> 00:06:34,760 Speaker 1: figure that out, studied web design and thought, you know, 142 00:06:34,800 --> 00:06:37,400 Speaker 1: maybe this is what I want to do. Terrible at 143 00:06:37,440 --> 00:06:39,400 Speaker 1: that, because naturally you can't picture things, so you 144 00:06:39,360 --> 00:06:40,320 Speaker 3: can't recall things. 145 00:06:40,360 --> 00:06:43,960 Speaker 1: So even studying, going to school, it just wasn't helpful 146 00:06:43,960 --> 00:06:46,320 Speaker 1: for me. So I took a gap year, went to 147 00:06:46,400 --> 00:06:49,400 Speaker 1: JB Hi-Fi, went to, you know, started doing retail work, 148 00:06:49,440 --> 00:06:51,560 Speaker 1: selling my soul because I love talking to people, and 149 00:06:51,560 --> 00:06:53,680 Speaker 1: I realized I really wish there was a job where 150 00:06:53,680 --> 00:06:56,120 Speaker 1: I could just talk for a living. And so through 151 00:06:56,160 --> 00:06:58,000 Speaker 1: JB Hi-Fi and through that there was, like, all 152 00:06:58,000 --> 00:07:00,599 Speaker 1: these opportunities to learn from brands, and so, I'm 153 00:07:00,600 --> 00:07:02,640 Speaker 1: not going to name names, but there was a gentleman 154 00:07:02,720 --> 00:07:05,520 Speaker 1: that was the absolute catalyst for how I got to 155 00:07:05,560 --> 00:07:08,000 Speaker 1: my job. The way that he performed, the way that 156 00:07:08,080 --> 00:07:10,640 Speaker 1: he presented, and the way that he talked about product 157 00:07:10,640 --> 00:07:12,080 Speaker 1: and his passion.
I was like, I want to be 158 00:07:12,120 --> 00:07:15,760 Speaker 1: that person. I absolutely willed it. So when I was 159 00:07:15,800 --> 00:07:18,360 Speaker 1: twenty-one... I'm now thirty-three, thirty- 160 00:07:18,160 --> 00:07:20,680 Speaker 2: four. My god, your skin's amazing. My god, thank you 161 00:07:20,720 --> 00:07:24,920 Speaker 2: so much. I'm in the tangents. 162 00:07:25,680 --> 00:07:28,000 Speaker 3: Oh, we're already off track, but this is great. Just 163 00:07:28,040 --> 00:07:30,240 Speaker 1: let it go. But yeah. So twenty-one-year-old 164 00:07:30,360 --> 00:07:33,920 Speaker 1: me got presented to by this person, and so I 165 00:07:34,000 --> 00:07:35,320 Speaker 1: had this thing where I was like, I want to 166 00:07:35,320 --> 00:07:38,240 Speaker 1: be this person. And now I look at it, and 167 00:07:38,360 --> 00:07:41,960 Speaker 1: that person was the key presenter, the product expert for HP. 168 00:07:43,200 --> 00:07:45,240 Speaker 3: Yeah, chills, actually. 169 00:07:45,880 --> 00:07:48,360 Speaker 1: So I don't know how I willed it, but basically 170 00:07:48,440 --> 00:07:51,880 Speaker 1: my trajectory went from working in retail. I decided I 171 00:07:51,920 --> 00:07:55,000 Speaker 1: wanted to then go and do the natural progression from retail, 172 00:07:55,000 --> 00:07:57,480 Speaker 1: which is either you go do this really weird sales job, you 173 00:07:57,520 --> 00:07:59,720 Speaker 1: go be a car salesperson or a real estate agent, 174 00:07:59,880 --> 00:08:02,040 Speaker 1: or you go into repping. So I went into repping. 175 00:08:02,440 --> 00:08:05,040 Speaker 1: I then started going into my retailers and then getting 176 00:08:05,040 --> 00:08:08,120 Speaker 1: the chance to sort of again build up my confidence 177 00:08:08,120 --> 00:08:11,240 Speaker 1: in talking to people, and I guess, recalling information and, 178 00:08:11,280 --> 00:08:13,440 Speaker 1: more importantly, just bringing my passion.
And so I got 179 00:08:13,440 --> 00:08:16,760 Speaker 1: to do that, and then as that started to grow, randomly, 180 00:08:16,800 --> 00:08:19,160 Speaker 1: this job popped up. This guy, he decided to leave 181 00:08:19,200 --> 00:08:20,800 Speaker 1: his job, and I was like, you know what, I'm 182 00:08:20,840 --> 00:08:23,240 Speaker 1: just gonna give it a go. It's my time to shine, literally, 183 00:08:23,320 --> 00:08:26,000 Speaker 1: and so I had my interview. I had no qualifications. 184 00:08:26,160 --> 00:08:29,360 Speaker 1: I was just really good at shit talking. Excuse my language. 185 00:08:29,360 --> 00:08:32,600 Speaker 1: I was really good at talking, and so naturally I 186 00:08:32,640 --> 00:08:35,000 Speaker 1: got really lucky. I had a four-round interview, and 187 00:08:35,080 --> 00:08:37,280 Speaker 1: next thing you know, the guy that hired me took 188 00:08:37,320 --> 00:08:39,360 Speaker 1: a chance, and I'm here four years later. 189 00:08:39,920 --> 00:08:42,360 Speaker 2: Oh my god, talking as part of your job. Yeah, 190 00:08:42,800 --> 00:08:46,560 Speaker 2: product expert for HP. Literally, chills. 191 00:08:46,640 --> 00:08:48,160 Speaker 1: So that's how I got to where I am now, 192 00:08:48,240 --> 00:08:53,000 Speaker 1: and it's... what I planned never really existed, but it did. 193 00:08:53,040 --> 00:08:54,880 Speaker 1: I didn't know that this was a thing, and then 194 00:08:54,920 --> 00:08:56,720 Speaker 1: once I saw that person, I was like, I want 195 00:08:56,760 --> 00:08:59,480 Speaker 1: to be you, in a non-creepy way, of course. 196 00:09:00,000 --> 00:09:02,840 Speaker 1: There I am.
I'm doing that job, and even what 197 00:09:02,880 --> 00:09:05,000 Speaker 1: I saw versus what I'm doing now is so different. 198 00:09:05,000 --> 00:09:06,920 Speaker 1: Like, I never would have thought that in my role 199 00:09:07,160 --> 00:09:09,840 Speaker 1: I'd be sitting here talking to someone amazing like yourself 200 00:09:10,280 --> 00:09:14,040 Speaker 1: who also interviews incredible souls. Like, again, harking back to 201 00:09:14,080 --> 00:09:17,680 Speaker 1: your True Crime interview, badly, like, I was like, I 202 00:09:17,720 --> 00:09:20,080 Speaker 1: have imposter syndrome? What am I doing here? How in 203 00:09:20,120 --> 00:09:22,120 Speaker 1: the world am I sitting at the same table as 204 00:09:22,120 --> 00:09:25,440 Speaker 1: all these famous people, like even the bloody Purple Wiggle? Like, 205 00:09:25,840 --> 00:09:29,680 Speaker 1: I'm sorry, what? How am I in this room right now? So, yeah, 206 00:09:29,679 --> 00:09:32,800 Speaker 1: it's such an incredible opportunity. So again, thank you for 207 00:09:32,840 --> 00:09:35,080 Speaker 1: having me and, like, being able to speak to who 208 00:09:35,080 --> 00:09:38,559 Speaker 1: I am and my current passions at HP. 209 00:09:38,600 --> 00:09:41,480 Speaker 2: Oh, it is such an incredible privilege, not only because you 210 00:09:41,520 --> 00:09:43,880 Speaker 2: can tell, as I said, immediately, how excited you are 211 00:09:43,920 --> 00:09:46,040 Speaker 2: about what you do. And my passion is talking to 212 00:09:46,080 --> 00:09:49,160 Speaker 2: people about what they're passionate about, because nothing lights a 213 00:09:49,200 --> 00:09:52,800 Speaker 2: person up like them finding their yay and actually doing it, 214 00:09:52,960 --> 00:09:55,680 Speaker 2: but also reminding people that, yeah, it's never straightforward to 215 00:09:55,679 --> 00:09:58,480 Speaker 2: get there.
And I think your story is so powerful 216 00:09:58,480 --> 00:10:01,120 Speaker 2: for so many reasons, because it's already touched on the 217 00:10:01,160 --> 00:10:04,240 Speaker 2: fact that you can't become something if you haven't seen it. 218 00:10:04,400 --> 00:10:08,360 Speaker 2: And I think visibility... my show aims to tell stories 219 00:10:08,400 --> 00:10:10,960 Speaker 2: so that someone listening goes, I didn't know that job 220 00:10:11,040 --> 00:10:12,959 Speaker 2: was possible, I want to be that. You can't aim 221 00:10:13,000 --> 00:10:15,480 Speaker 2: for something you don't know is there. Exactly. Sliding doors 222 00:10:15,559 --> 00:10:19,040 Speaker 2: moments, meeting the right person, and also the role of luck. 223 00:10:19,120 --> 00:10:21,240 Speaker 2: You said you got lucky. I don't think you got lucky. 224 00:10:21,360 --> 00:10:25,160 Speaker 2: I think luck and timing play a role, but 225 00:10:25,240 --> 00:10:27,640 Speaker 2: I also think you do manifest things for yourself, and 226 00:10:27,679 --> 00:10:31,760 Speaker 2: by putting yourself forward, that wasn't lucky. You actually stepped 227 00:10:31,760 --> 00:10:33,480 Speaker 2: through the discomfort to put your name forward, and then 228 00:10:33,480 --> 00:10:34,760 Speaker 2: you got it. But you couldn't have got the job 229 00:10:34,880 --> 00:10:38,400 Speaker 2: unless you went for it. But the most important thing 230 00:10:38,400 --> 00:10:41,679 Speaker 2: that stood out for me in that is how we 231 00:10:41,760 --> 00:10:44,400 Speaker 2: silo ourselves based on the things we can and can't 232 00:10:44,440 --> 00:10:46,560 Speaker 2: do from a very young age, and you not being 233 00:10:46,600 --> 00:10:50,199 Speaker 2: able to visualize things could have easily made you believe 234 00:10:50,240 --> 00:10:50,800 Speaker 2: you weren't clever. 235 00:10:51,000 --> 00:10:53,199 Speaker 1: Absolutely. I genuinely thought I was stupid.
236 00:10:53,480 --> 00:10:56,880 Speaker 2: Yeah, because you couldn't do just one aspect of the 237 00:10:56,920 --> 00:11:01,000 Speaker 2: brain's capacity. But I almost want to cry thinking about, 238 00:11:01,120 --> 00:11:03,439 Speaker 2: without those sliding doors moments, how you might never have 239 00:11:03,520 --> 00:11:06,960 Speaker 2: discovered that you're incredibly clever. It's just a different form of it. 240 00:11:07,400 --> 00:11:11,680 Speaker 1: Yeah, absolutely. And honestly, I don't think... like, thinking back 241 00:11:11,679 --> 00:11:14,080 Speaker 1: to my high school days, thinking back to other students 242 00:11:14,120 --> 00:11:17,760 Speaker 1: similar to myself, like, I would have benefited wholeheartedly doing 243 00:11:17,800 --> 00:11:19,880 Speaker 1: more of, like, a TAFE-style hands-on, because I 244 00:11:20,000 --> 00:11:23,480 Speaker 1: learn by doing. I don't learn by listening. And so 245 00:11:23,880 --> 00:11:26,200 Speaker 1: high school is very much like a "how well can 246 00:11:26,240 --> 00:11:29,160 Speaker 1: you recite what you've learned?" It's not how smart you are. 247 00:11:29,600 --> 00:11:32,440 Speaker 1: And so unfortunately I was at a disadvantage. So most 248 00:11:32,480 --> 00:11:34,320 Speaker 1: of my high school life I just turned up, put 249 00:11:34,320 --> 00:11:36,080 Speaker 1: my name on a test, and then walked away and 250 00:11:36,080 --> 00:11:38,480 Speaker 1: barely passed. But where I am now in my career, 251 00:11:38,520 --> 00:11:40,880 Speaker 1: I never would have thought that I'd be a successful person. 252 00:11:41,360 --> 00:11:43,640 Speaker 1: I never would have seen myself in a role like 253 00:11:43,679 --> 00:11:47,160 Speaker 1: this where I'm able to empower others. So it's so fulfilling.
254 00:11:47,240 --> 00:11:50,000 Speaker 1: Like, I'm so proud, and so proud to work for 255 00:11:50,160 --> 00:11:52,959 Speaker 1: HP, and having people within HP trust me enough to 256 00:11:53,000 --> 00:11:54,360 Speaker 1: come and have a talk to you as well and 257 00:11:54,400 --> 00:11:56,880 Speaker 1: represent the brand is amazing. 258 00:11:57,120 --> 00:11:59,160 Speaker 2: Oh, I love that. And I do think, for anyone 259 00:11:59,240 --> 00:12:02,800 Speaker 2: listening who has tried to fit themselves into the wrong- 260 00:12:02,960 --> 00:12:06,199 Speaker 2: shaped hole and it's made them reflect on their intelligence 261 00:12:06,280 --> 00:12:08,720 Speaker 2: or capabilities in a way that they think, well, I 262 00:12:08,720 --> 00:12:10,520 Speaker 2: can't do that, so I can't do anything: you just 263 00:12:10,559 --> 00:12:13,240 Speaker 2: need to keep trying. You'll find something. Like, someone out 264 00:12:13,240 --> 00:12:14,880 Speaker 2: there is looking for exactly what you have. That's the 265 00:12:14,920 --> 00:12:17,559 Speaker 2: quote I always come back to, and I'm so excited 266 00:12:17,559 --> 00:12:20,160 Speaker 2: that you have found that and are obviously thriving doing it. 267 00:12:20,760 --> 00:12:23,120 Speaker 2: But it is true that another part of this show 268 00:12:23,200 --> 00:12:24,880 Speaker 2: is the idea that I don't also want to ask 269 00:12:24,960 --> 00:12:27,160 Speaker 2: people what they do as the first question. I want 270 00:12:27,200 --> 00:12:29,080 Speaker 2: to ask, what's your yay? What lights you up? I 271 00:12:29,080 --> 00:12:31,800 Speaker 2: don't want it to be our whole identity, but it 272 00:12:31,920 --> 00:12:33,839 Speaker 2: is important to understand what you do so that others 273 00:12:33,880 --> 00:12:36,560 Speaker 2: could aim for this role in the future. How do 274 00:12:36,600 --> 00:12:38,959 Speaker 2: you describe it? Because it is so multifaceted, like.
275 00:12:38,880 --> 00:12:41,120 Speaker 1: What do you do? Look, if I want to tell 276 00:12:41,160 --> 00:12:44,560 Speaker 1: you my formal role is HP's lead trainer and content manager, 277 00:12:44,840 --> 00:12:48,760 Speaker 1: and then that asterisk field team manager Asterisk you know, 278 00:12:49,240 --> 00:12:54,079 Speaker 1: pr person, I guess or hype guy for HP. But honestly, 279 00:12:54,160 --> 00:12:54,480 Speaker 1: I love. 280 00:12:54,400 --> 00:12:56,080 Speaker 2: Dog shirt wearer extraordinare I. 281 00:12:56,160 --> 00:12:58,440 Speaker 1: Honestly the coul shirt guy? That's so fine too. 282 00:12:58,520 --> 00:13:01,760 Speaker 2: Honestly, whatever the official one I found on LinkedIn. 283 00:13:01,440 --> 00:13:05,199 Speaker 1: I thought, I thank you, I'm s It's fine. Please, 284 00:13:05,200 --> 00:13:06,680 Speaker 1: I'll make sure to put that as a skill. Yeah, 285 00:13:06,800 --> 00:13:09,319 Speaker 1: pull shirt where I know, I was tossing up to 286 00:13:09,360 --> 00:13:11,920 Speaker 1: I go dog shirt or caterpillars today, but I thought, 287 00:13:11,960 --> 00:13:12,360 Speaker 1: doc sh. 288 00:13:12,960 --> 00:13:14,960 Speaker 2: I mean it says a lot. I think dog immediately 289 00:13:15,080 --> 00:13:17,520 Speaker 2: ingratiating yourself with me, I'm like, this is good, thank you. 290 00:13:17,480 --> 00:13:19,880 Speaker 1: Thank you so much. Also, I realized I've also done 291 00:13:19,880 --> 00:13:22,000 Speaker 1: a few bit of other works with the same shirts, 292 00:13:22,080 --> 00:13:22,800 Speaker 1: so that's fine. 293 00:13:22,840 --> 00:13:25,080 Speaker 2: I just pretend that to sustainable sustainable fashion. 294 00:13:25,080 --> 00:13:28,240 Speaker 1: Absolutely. Anyway back to your main question though, what is 295 00:13:28,240 --> 00:13:31,000 Speaker 1: my role? So formally, I'm going to call it HP 296 00:13:31,120 --> 00:13:34,080 Speaker 1: hype guy, but also product expert for consumer products. 
So 297 00:13:34,440 --> 00:13:36,800 Speaker 1: whenever you go into a retail store, you go online, 298 00:13:36,840 --> 00:13:39,800 Speaker 1: you see HP products, whether it be our HP laptops 299 00:13:39,800 --> 00:13:44,640 Speaker 1: and premium, our Omen gaming laptops, Poly headsets, HyperX for gaming, 300 00:13:45,040 --> 00:13:47,319 Speaker 1: all of the above, patent pending. No, I'm just... 301 00:13:47,800 --> 00:13:50,320 Speaker 2: Like, where are they? This is literally an accident 302 00:13:50,320 --> 00:13:51,520 Speaker 2: that we both turned up with ours. 303 00:13:51,679 --> 00:13:53,960 Speaker 1: I mean, I personally... I have a wee bit 304 00:13:54,000 --> 00:13:59,000 Speaker 1: of the 'tism, so being able to walk around in 305 00:13:59,040 --> 00:14:01,320 Speaker 1: public and stuff is really hard for me. Sometimes loud 306 00:14:01,320 --> 00:14:04,120 Speaker 1: noises can be really hard, so I actually use these 307 00:14:04,200 --> 00:14:06,120 Speaker 1: to block out the world. I find it really easy 308 00:14:06,160 --> 00:14:08,440 Speaker 1: to walk around with a podcast. That way, I'm dictating 309 00:14:08,440 --> 00:14:11,040 Speaker 1: what is going through my ears rather than loud jarring noises, 310 00:14:11,080 --> 00:14:13,480 Speaker 1: and I find myself coping a lot more with my 311 00:14:13,559 --> 00:14:15,600 Speaker 1: day to day when I have noise-canceling headphones on. 312 00:14:15,840 --> 00:14:18,760 Speaker 1: So that was unintentional, but here we are talking about it. 313 00:14:19,040 --> 00:14:20,600 Speaker 2: I didn't think about that use case. For me, it 314 00:14:20,640 --> 00:14:23,400 Speaker 2: was more like, block out my child screaming. But you know, 315 00:14:23,560 --> 00:14:25,560 Speaker 2: multifaceted either 316 00:14:25,400 --> 00:14:26,600 Speaker 3: way. Same end goal. 317 00:14:27,360 --> 00:14:29,320 Speaker 1: But you know what, I love that.
But yeah, so 318 00:14:29,440 --> 00:14:33,000 Speaker 1: long story short, product hype guy, HP person when it 319 00:14:33,040 --> 00:14:35,440 Speaker 1: comes down to all things consumer product. 320 00:14:35,480 --> 00:14:38,120 Speaker 2: That's amazing. Me and you have already... I mean, that 321 00:14:38,200 --> 00:14:41,600 Speaker 2: is how we met, is through you explaining and unlocking 322 00:14:42,320 --> 00:14:45,040 Speaker 2: so many benefits for me. I mean, everyone knows I 323 00:14:45,040 --> 00:14:47,280 Speaker 2: try to jam too much into my life. I am 324 00:14:47,360 --> 00:14:52,040 Speaker 2: constantly... I call it optimism, really, about time, but I 325 00:14:52,080 --> 00:14:55,360 Speaker 2: mean, often to my detriment. But you have really unlocked... 326 00:14:55,360 --> 00:14:58,080 Speaker 2: I had all these devices but wasn't sort of optimizing 327 00:14:58,120 --> 00:14:59,920 Speaker 2: the way that I use them, and you've unlocked so much. 328 00:15:00,040 --> 00:15:02,240 Speaker 2: I'm so excited to have you explain to so 329 00:15:02,240 --> 00:15:04,800 Speaker 2: many people how they can enhance their workflows. And I 330 00:15:04,800 --> 00:15:07,040 Speaker 2: mean, time is our biggest commodity and the thing that 331 00:15:07,080 --> 00:15:09,760 Speaker 2: we struggle the most with. But before we get to that, 332 00:15:09,880 --> 00:15:12,600 Speaker 2: another thing that I find really important on this show 333 00:15:12,680 --> 00:15:15,240 Speaker 2: is to humanize the person behind the role as well, 334 00:15:15,240 --> 00:15:17,320 Speaker 2: because that's a really big thing. And something that's beautiful 335 00:15:17,320 --> 00:15:20,080 Speaker 2: about HP is that you weren't just introduced to me 336 00:15:20,960 --> 00:15:23,360 Speaker 2: in your role in terms of what you actually do, 337 00:15:23,440 --> 00:15:26,360 Speaker 2: skills-wise.
It was also part of your identity in 338 00:15:26,440 --> 00:15:30,240 Speaker 2: the organization and your advocacy for inclusion in the workplace, 339 00:15:30,320 --> 00:15:32,840 Speaker 2: and also your personal journey that you've been through being 340 00:15:32,880 --> 00:15:35,600 Speaker 2: non-binary. And even, I mean, as everyone's heard 341 00:15:35,640 --> 00:15:39,120 Speaker 2: in the intro, your pronouns are they/them, and that's 342 00:15:39,480 --> 00:15:43,280 Speaker 2: relatively unfamiliar in a lot of workplaces and for a 343 00:15:43,280 --> 00:15:45,520 Speaker 2: lot of, you know, the generations who are coming through now. 344 00:15:45,600 --> 00:15:47,600 Speaker 2: So can you talk to us a little bit about 345 00:15:47,680 --> 00:15:51,400 Speaker 2: that part of your experience and, yeah, how you've navigated 346 00:15:51,480 --> 00:15:53,920 Speaker 2: when you came out and how it's been embraced. 347 00:15:53,920 --> 00:15:57,280 Speaker 1: Yeah, absolutely. So I was really lucky. Like, I still feel 348 00:15:57,320 --> 00:15:59,760 Speaker 1: like I came out quite young in comparison to a 349 00:15:59,800 --> 00:16:02,600 Speaker 1: lot of other people still finding their identity. So I 350 00:16:02,640 --> 00:16:05,160 Speaker 1: came out when I was twenty-one. I'm honestly so 351 00:16:05,200 --> 00:16:07,720 Speaker 1: thankful that it was the right time, right place. Obviously, 352 00:16:07,960 --> 00:16:10,320 Speaker 1: working at JB Hi-Fi definitely helped, because there's a lot 353 00:16:10,320 --> 00:16:13,880 Speaker 1: of fellow queers in the environment, which made me realize, oh, 354 00:16:13,960 --> 00:16:16,240 Speaker 1: I think I'm like them. But I started out thinking 355 00:16:16,240 --> 00:16:19,680 Speaker 1: I was a lesbian, thinking, you know, I'm just queer, 356 00:16:20,560 --> 00:16:23,560 Speaker 1: but still kind of questioning myself.
And then I got 357 00:16:23,560 --> 00:16:25,720 Speaker 1: to about twenty five and I met somebody that was 358 00:16:25,760 --> 00:16:28,720 Speaker 1: non binary, and they explained, you know, I don't feel 359 00:16:28,760 --> 00:16:31,640 Speaker 1: like I'm a male. I don't feel female. I may be 360 00:16:31,720 --> 00:16:33,640 Speaker 1: AFAB, or assigned female at birth, but I just 361 00:16:33,640 --> 00:16:36,320 Speaker 1: don't feel like I fit into that. And so for me, 362 00:16:36,400 --> 00:16:38,240 Speaker 1: I was like, that makes so much sense. 363 00:16:38,680 --> 00:16:39,080 Speaker 2: That's so me. 364 00:16:39,400 --> 00:16:41,520 Speaker 1: I was like, oh, my gosh. Like, again, it 365 00:16:41,560 --> 00:16:44,080 Speaker 1: all comes down to, there's no linear experience. It is 366 00:16:44,360 --> 00:16:47,160 Speaker 1: just, you know, as you experience it, you realize, oh, that's 367 00:16:47,200 --> 00:16:50,040 Speaker 1: an option, my gosh. And so obviously I then started 368 00:16:50,080 --> 00:16:52,720 Speaker 1: researching it to make sure, because again it all comes 369 00:16:52,720 --> 00:16:55,440 Speaker 1: back to imposter syndrome. It's one thing to say, yeah, great, 370 00:16:55,480 --> 00:16:57,760 Speaker 1: I'm that, but you really want to solidify that 371 00:16:57,840 --> 00:17:00,440 Speaker 1: before you start announcing it to people. But for me, 372 00:17:00,680 --> 00:17:03,520 Speaker 1: as a non binary person, I am so genuinely proud 373 00:17:03,520 --> 00:17:05,760 Speaker 1: to live my truth, to live who I am, and 374 00:17:05,800 --> 00:17:09,320 Speaker 1: feel so safe and respected within my own, like, my 375 00:17:09,400 --> 00:17:13,679 Speaker 1: workplace. Although I do get misgendered accidentally, people 376 00:17:13,800 --> 00:17:17,440 Speaker 1: actively try.
And it is still really new, like, to 377 00:17:17,800 --> 00:17:20,439 Speaker 1: understand what non binary means, and it is new to 378 00:17:21,080 --> 00:17:23,600 Speaker 1: consider they them as a pronoun, and a lot of 379 00:17:23,600 --> 00:17:26,280 Speaker 1: people still, with the English language, get really confused that 380 00:17:26,280 --> 00:17:27,359 Speaker 1: that's not multiple people. 381 00:17:28,160 --> 00:17:29,000 Speaker 3: Yeah, exactly. 382 00:17:29,400 --> 00:17:31,520 Speaker 1: But then my question to people is like, okay, great, 383 00:17:31,600 --> 00:17:34,080 Speaker 1: so let's just say a post person just dropped off 384 00:17:34,119 --> 00:17:37,320 Speaker 1: a package at your front door. Did they deliver the 385 00:17:37,359 --> 00:17:39,960 Speaker 1: package, or did he or she? And they're like, well, they 386 00:17:40,000 --> 00:17:43,119 Speaker 1: did, because we don't know. I'm like, job done, yes, 387 00:17:44,040 --> 00:17:44,920 Speaker 1: thank you so much. 388 00:17:45,160 --> 00:17:46,199 Speaker 3: No, I've never put it that way. 389 00:17:46,400 --> 00:17:50,359 Speaker 1: Yeah. And it's just so easy to overcome that plural discussion, 390 00:17:50,400 --> 00:17:51,720 Speaker 1: but people need to be open to it, and I 391 00:17:51,720 --> 00:17:53,920 Speaker 1: think that's the hard part. They need to be open 392 00:17:53,960 --> 00:17:56,720 Speaker 1: to that discussion. And so for me within HP, like, 393 00:17:56,760 --> 00:17:58,359 Speaker 1: it's one thing to be the product type guy, but 394 00:17:58,440 --> 00:18:01,120 Speaker 1: I'm kind of lucky, because I also get to, alongside 395 00:18:01,119 --> 00:18:03,560 Speaker 1: a couple of other people within the business.
I get 396 00:18:03,600 --> 00:18:05,920 Speaker 1: to be the trendsetter, in a way, helping these 397 00:18:05,960 --> 00:18:09,960 Speaker 1: parents understand what non binary is, what being queer means 398 00:18:10,080 --> 00:18:13,040 Speaker 1: to me. Because, my goal within HP... I'm a part 399 00:18:13,080 --> 00:18:15,480 Speaker 1: of a business resource group called the Pride Network, and 400 00:18:15,520 --> 00:18:18,320 Speaker 1: I'm also part of the Women's Impact Network, and so 401 00:18:18,400 --> 00:18:21,280 Speaker 1: my goal within HP is also to help educate, because 402 00:18:21,720 --> 00:18:24,080 Speaker 1: everyone within our business is starting to have kids and 403 00:18:24,119 --> 00:18:26,280 Speaker 1: their kids are starting to grow up, and I want 404 00:18:26,320 --> 00:18:29,080 Speaker 1: them to be able to feel comfortable in having the 405 00:18:29,160 --> 00:18:32,639 Speaker 1: tough conversations so that their children feel safe to come out, 406 00:18:32,960 --> 00:18:35,400 Speaker 1: that they can have those safe conversations, so that they 407 00:18:35,440 --> 00:18:37,840 Speaker 1: don't have the scenario that I kind of went through, 408 00:18:37,880 --> 00:18:40,119 Speaker 1: where I didn't feel safe. I didn't know how my 409 00:18:40,160 --> 00:18:42,639 Speaker 1: parents were going to handle it. But if your parents 410 00:18:42,680 --> 00:18:44,480 Speaker 1: come up to you and say, hey, how are we feeling 411 00:18:44,480 --> 00:18:47,480 Speaker 1: today? What are your pronouns? Can I just get a gauge, 412 00:18:47,560 --> 00:18:50,320 Speaker 1: like, what are you feeling like?
Could you imagine an 413 00:18:50,440 --> 00:18:52,800 Speaker 1: environment where kids feel safe enough to do that, that 414 00:18:52,840 --> 00:18:55,119 Speaker 1: they can just live their truth and not have to 415 00:18:55,240 --> 00:18:57,600 Speaker 1: question or even come out? Like, I think that that 416 00:18:57,640 --> 00:19:00,680 Speaker 1: would just be such a beautiful environment. And HP is 417 00:19:00,720 --> 00:19:04,280 Speaker 1: a really great start, because they are so, what's the word, 418 00:19:04,359 --> 00:19:07,320 Speaker 1: they're so passionate about people living their truth and feeling 419 00:19:07,320 --> 00:19:10,359 Speaker 1: confident and comfortable. So yeah, I'm really, really thankful that 420 00:19:10,600 --> 00:19:11,879 Speaker 1: I have the ability to do that. 421 00:19:12,359 --> 00:19:15,080 Speaker 2: I think it really does come back to that similar 422 00:19:15,160 --> 00:19:20,040 Speaker 2: parallel message of, you can't step into something, or a 423 00:19:20,080 --> 00:19:22,600 Speaker 2: part of your identity, or a role professionally, or a 424 00:19:22,680 --> 00:19:25,800 Speaker 2: label personally, unless you've seen someone else do it, and 425 00:19:25,880 --> 00:19:27,879 Speaker 2: someone has to be the brave person to do it 426 00:19:27,960 --> 00:19:30,320 Speaker 2: first and navigate all the hard bits. There's so many 427 00:19:30,359 --> 00:19:33,159 Speaker 2: hard bits, to be the example to show, you know, 428 00:19:33,359 --> 00:19:36,200 Speaker 2: younger generations, it's okay, this is what you might face, 429 00:19:36,240 --> 00:19:38,119 Speaker 2: but these are the good parts, and, you know, 430 00:19:38,160 --> 00:19:39,919 Speaker 2: these are the ways you can have these conversations.
And 431 00:19:39,960 --> 00:19:43,120 Speaker 2: you're doing just such an incredible job kind of incorporating 432 00:19:43,160 --> 00:19:45,840 Speaker 2: both your personal life and your professional life into this 433 00:19:45,880 --> 00:19:50,360 Speaker 2: one beautiful role modeling. I mean, it's really powerful. It could 434 00:19:50,400 --> 00:19:53,560 Speaker 2: make me cry, honestly. Oh, I hope 435 00:19:53,560 --> 00:19:55,080 Speaker 2: you appreciate it. I really appreciate that. 436 00:19:55,280 --> 00:19:57,560 Speaker 1: I'm so proud. I'm genuinely so proud, especially when my 437 00:19:57,640 --> 00:19:59,119 Speaker 1: mum... I call my mum and I'm like, hey, I did 438 00:19:59,160 --> 00:20:01,200 Speaker 1: this today, and she's like, I'm proud of you. I'm like, thank you, 439 00:20:01,240 --> 00:20:04,160 Speaker 1: I'm putting that in my pot for later. So, 440 00:20:04,160 --> 00:20:07,280 Speaker 1: honestly, it's genuinely so fulfilling in so many different avenues. 441 00:20:07,400 --> 00:20:09,840 Speaker 2: Yeah, and it's interesting, that I think there is such 442 00:20:09,880 --> 00:20:13,600 Speaker 2: a fear around people saying the wrong thing often. I 443 00:20:13,640 --> 00:20:15,920 Speaker 2: mean there is, like, the small minority in society who 444 00:20:16,000 --> 00:20:20,000 Speaker 2: are not, you know, on the same page and will not 445 00:20:20,040 --> 00:20:21,920 Speaker 2: necessarily have the kindest things to say. But I think 446 00:20:21,920 --> 00:20:25,480 Speaker 2: for the most part, people are... if they misstep, it's 447 00:20:25,520 --> 00:20:29,840 Speaker 2: more because they're so scared about offending or not being politically correct.
448 00:20:29,880 --> 00:20:32,280 Speaker 2: And I think it's really lovely that you have such 449 00:20:32,320 --> 00:20:35,320 Speaker 2: patience, to be like, if you're trying, it doesn't 450 00:20:35,320 --> 00:20:37,560 Speaker 2: matter if you misgender me, because you're open 451 00:20:37,600 --> 00:20:40,600 Speaker 2: to me helping you to, you know, do it the 452 00:20:40,640 --> 00:20:42,840 Speaker 2: right way. Then, like, I think it's this fear 453 00:20:42,920 --> 00:20:44,840 Speaker 2: that's creating a lot of difficulty. 454 00:20:45,240 --> 00:20:47,280 Speaker 1: And if I could leave one message for that, like, 455 00:20:47,600 --> 00:20:50,720 Speaker 1: don't hesitate. Like, try, just give it a red 456 00:20:50,760 --> 00:20:52,280 Speaker 1: hot go. It's okay if you mess up. 457 00:20:52,440 --> 00:20:52,920 Speaker 2: Yeah. 458 00:20:52,760 --> 00:20:55,400 Speaker 1: The important thing is, if, for instance, as I mentioned, 459 00:20:55,520 --> 00:20:57,560 Speaker 1: they/them are my pronouns, if you call me she, 460 00:20:57,960 --> 00:21:00,159 Speaker 1: that's okay, just don't harp on it, because if you 461 00:21:00,200 --> 00:21:02,119 Speaker 1: apologize and you harp on it, it makes it awkward 462 00:21:02,119 --> 00:21:02,600 Speaker 1: for all of us. 463 00:21:02,720 --> 00:21:03,920 Speaker 2: Yeah, you just say she. 464 00:21:04,359 --> 00:21:06,720 Speaker 1: I mean, they, then move on. I'm not even gonna say anything. 465 00:21:06,720 --> 00:21:08,080 Speaker 1: If anything, I'll come up to you afterwards and 466 00:21:08,119 --> 00:21:12,160 Speaker 1: say thank you so much for trying. Yeah, like, it's okay. Yeah, genuinely, 467 00:21:12,240 --> 00:21:14,520 Speaker 1: it makes my heart happier to see people trying than 468 00:21:14,600 --> 00:21:15,760 Speaker 1: not even giving it a go. 469 00:21:16,000 --> 00:21:16,200 Speaker 2: Yeah.
470 00:21:16,240 --> 00:21:18,439 Speaker 1: And if you're really anxious and you know somebody whose 471 00:21:18,600 --> 00:21:21,040 Speaker 1: pronouns have changed from what you're used to, just 472 00:21:21,119 --> 00:21:25,080 Speaker 1: use their name. Yeah, it's really not hard. So 473 00:21:25,160 --> 00:21:27,520 Speaker 1: there's so many different ways that you can approach 474 00:21:27,560 --> 00:21:31,160 Speaker 1: these things, but to give somebody the time of day, honestly, 475 00:21:31,200 --> 00:21:34,760 Speaker 1: I promise you that person will go home feeling elated. 476 00:21:34,640 --> 00:21:35,280 Speaker 2: That you've tried. 477 00:21:35,520 --> 00:21:37,679 Speaker 1: Literally. Yeah, it makes my... I will go home to 478 00:21:37,720 --> 00:21:40,760 Speaker 1: my partner, like, I have beautiful souls like Chloe, or 479 00:21:41,320 --> 00:21:44,080 Speaker 1: like someone named Kat within my business, who will literally 480 00:21:44,160 --> 00:21:46,280 Speaker 1: go out of their way and will correct themselves, and 481 00:21:46,280 --> 00:21:48,200 Speaker 1: I'll go home, like, this happened today, when I talk 482 00:21:48,200 --> 00:21:50,439 Speaker 1: to my partner, and she just has, like, the biggest 483 00:21:50,440 --> 00:21:53,080 Speaker 1: smile, because again, we're just so appreciative that there are 484 00:21:53,119 --> 00:21:55,440 Speaker 1: people in our lives that care enough to try. 485 00:21:56,040 --> 00:21:59,280 Speaker 2: Oh, that's beautiful. And I think, yeah, fear is an 486 00:21:59,280 --> 00:22:02,400 Speaker 2: emotion that does hold us back from a lot of things. 487 00:22:02,400 --> 00:22:07,240 Speaker 2: So, in a complete pivot, but also a similar vein 488 00:22:07,320 --> 00:22:10,920 Speaker 2: of that kind of fear.
Another area that I've learned 489 00:22:10,920 --> 00:22:13,520 Speaker 2: a lot from you is AI, and that is the 490 00:22:13,560 --> 00:22:16,359 Speaker 2: topic of our conversation today, because there's so much... I mean, 491 00:22:16,400 --> 00:22:19,840 Speaker 2: I feel like I've gone from hearing about it constantly 492 00:22:19,960 --> 00:22:22,440 Speaker 2: but knowing nothing about how it works, to suddenly using 493 00:22:22,480 --> 00:22:25,440 Speaker 2: it every day, but my understanding of it... I use 494 00:22:25,480 --> 00:22:27,560 Speaker 2: it without fully understanding how it's working. And I think 495 00:22:27,600 --> 00:22:30,760 Speaker 2: all of us have been through that process of, it's 496 00:22:30,800 --> 00:22:34,000 Speaker 2: now in our lives everywhere, but our brains' ability to 497 00:22:34,520 --> 00:22:37,679 Speaker 2: catch up with what it's doing has lagged. And you 498 00:22:37,800 --> 00:22:39,680 Speaker 2: have explained a lot of things to me. You've got 499 00:22:39,680 --> 00:22:43,919 Speaker 2: this incredible ability to distill the hectic science and techy 500 00:22:44,119 --> 00:22:49,120 Speaker 2: explanations into, like, a layperson explanation. But I think 501 00:22:49,119 --> 00:22:51,040 Speaker 2: there's a lot of fear and misinformation still about how 502 00:22:51,040 --> 00:22:53,720 Speaker 2: it's going to take over our lives. So can you 503 00:22:53,840 --> 00:22:58,639 Speaker 2: start by debunking a few of the biggest misconceptions that 504 00:22:58,720 --> 00:23:01,159 Speaker 2: we have? Because I think fear is the first place 505 00:23:01,680 --> 00:23:04,000 Speaker 2: that people go, before they sort of... they have to 506 00:23:04,000 --> 00:23:06,199 Speaker 2: get over that to then go, but look at all 507 00:23:06,200 --> 00:23:08,000 Speaker 2: the amazing things it can do, look at how it 508 00:23:08,040 --> 00:23:10,560 Speaker 2: can change our lives.
So yeah, maybe start with some 509 00:23:10,600 --> 00:23:11,880 Speaker 2: of the things we might have heard in the media, 510 00:23:11,960 --> 00:23:13,520 Speaker 2: like, robots are going to take over our lives 511 00:23:13,520 --> 00:23:14,080 Speaker 2: by tomorrow. 512 00:23:14,160 --> 00:23:16,879 Speaker 1: I know, yeah. What was it? They're going to control our lives, 513 00:23:16,920 --> 00:23:18,520 Speaker 1: they're going to take our jobs, they're going to do 514 00:23:18,560 --> 00:23:20,199 Speaker 1: all of these things. But the problem is, have you 515 00:23:20,280 --> 00:23:23,520 Speaker 1: seen the AI renders of images with people with seven 516 00:23:23,560 --> 00:23:27,080 Speaker 1: fingers? Like, what does that really evoke to you, referring back to 517 00:23:27,119 --> 00:23:30,320 Speaker 1: your husband in that amazing video you did. Again, 518 00:23:30,400 --> 00:23:32,760 Speaker 1: it's never going to be perfect, because AI is only 519 00:23:32,840 --> 00:23:36,920 Speaker 1: as smart as the people creating or prompting it. So ultimately, 520 00:23:37,000 --> 00:23:39,280 Speaker 1: it's not going to take our jobs. It's not smart enough. 521 00:23:39,560 --> 00:23:41,159 Speaker 1: Even if you were to get it to do... like, 522 00:23:41,320 --> 00:23:43,840 Speaker 1: for instance, we give you a machine, say, hey, here's the 523 00:23:43,880 --> 00:23:46,439 Speaker 1: brief for today. You can ask it to read the brief, 524 00:23:46,480 --> 00:23:48,359 Speaker 1: but it's still not going to be accurate without the 525 00:23:48,440 --> 00:23:51,160 Speaker 1: right prompting of asking, give me a brief, I need 526 00:23:51,200 --> 00:23:53,720 Speaker 1: to know, you know, A, B and C, specifics on 527 00:23:53,880 --> 00:23:55,920 Speaker 1: what the actual key takeaways are. What do I need 528 00:23:55,960 --> 00:23:58,000 Speaker 1: from this? If you're
If you're 529 00:23:58,040 --> 00:24:01,720 Speaker 1: not specific in your requests of the AI to analyze. 530 00:24:02,000 --> 00:24:04,240 Speaker 1: It's going to be extremely confused. It's not going to know, 531 00:24:04,400 --> 00:24:06,439 Speaker 1: and it's still yeah, it's using the clouds, so in 532 00:24:06,440 --> 00:24:08,840 Speaker 1: other words, it's up and the never never using everyone 533 00:24:08,840 --> 00:24:11,040 Speaker 1: else's brains to kind of give you a result. But 534 00:24:11,119 --> 00:24:15,360 Speaker 1: it's still using millions and millions of points of data 535 00:24:15,400 --> 00:24:18,600 Speaker 1: to try and assume what you want. So it's only 536 00:24:18,640 --> 00:24:21,560 Speaker 1: so smart. So really it's Yeah, it's some of the 537 00:24:21,600 --> 00:24:24,920 Speaker 1: greatest minds put into one piece, but ultimately it's still 538 00:24:25,000 --> 00:24:26,600 Speaker 1: just as important for you to make sure that you're 539 00:24:26,640 --> 00:24:27,480 Speaker 1: asking the right things. 540 00:24:27,680 --> 00:24:28,040 Speaker 2: Yeah. 541 00:24:28,080 --> 00:24:30,320 Speaker 1: So no, it's not going to take our jobs. Yeah, 542 00:24:30,359 --> 00:24:31,720 Speaker 1: it's going to make our lives a hell of a 543 00:24:31,800 --> 00:24:34,199 Speaker 1: lot easier. And I find it funny because people are 544 00:24:34,280 --> 00:24:36,359 Speaker 1: so scared of AI. But do you use Google Home, 545 00:24:36,440 --> 00:24:40,320 Speaker 1: do you use Siri? Do you use Amazon? Like in 546 00:24:40,359 --> 00:24:42,000 Speaker 1: the morning, I'm like, hey, can you set a timer 547 00:24:42,040 --> 00:24:43,800 Speaker 1: for this please and thank you just in case there 548 00:24:43,800 --> 00:24:46,359 Speaker 1: is an uprise. I want to back myself, but you 549 00:24:46,400 --> 00:24:50,119 Speaker 1: know that's just me seeing a lot of horror movies. 
550 00:24:50,440 --> 00:24:54,320 Speaker 1: But ultimately, I really don't see it being a problem. 551 00:24:54,359 --> 00:24:56,040 Speaker 1: We use it in our day to day lives without 552 00:24:56,080 --> 00:24:58,879 Speaker 1: even realizing. So I think it's really, just use it 553 00:24:58,880 --> 00:25:00,560 Speaker 1: to your advantage. If you don't want to use it, 554 00:25:00,600 --> 00:25:02,600 Speaker 1: and you don't want to adopt AI, that's fine, but 555 00:25:02,640 --> 00:25:04,800 Speaker 1: I promise you it'll actually make you more efficient. It 556 00:25:04,800 --> 00:25:07,000 Speaker 1: will make your life so much easier if you give 557 00:25:07,040 --> 00:25:09,720 Speaker 1: it a chance. But you've got to be willing, again, 558 00:25:09,840 --> 00:25:12,320 Speaker 1: to try. You've got to be willing to actually 559 00:25:12,400 --> 00:25:14,159 Speaker 1: research what it is that you want out of this 560 00:25:14,240 --> 00:25:16,320 Speaker 1: AI and how it is going to make your life easier. 561 00:25:16,680 --> 00:25:20,800 Speaker 1: Which is a fantastic tool, especially within our HP products, 562 00:25:21,000 --> 00:25:23,439 Speaker 1: because obviously I'm here to talk about HP, I'm passionate 563 00:25:23,480 --> 00:25:27,240 Speaker 1: about HP. We have our own HP AI Companion where you 564 00:25:27,280 --> 00:25:29,520 Speaker 1: can actually ask all these questions and make your life 565 00:25:29,560 --> 00:25:32,720 Speaker 1: easier and make yourself more efficient. Because you mentioned earlier, 566 00:25:32,760 --> 00:25:36,640 Speaker 1: like, you are somebody that is very precious with your time. 567 00:25:36,680 --> 00:25:38,919 Speaker 1: However, you probably say yes to a thousand things, like, 568 00:25:38,960 --> 00:25:41,520 Speaker 1: I've seen your podcast, I've seen your social media. Where 569 00:25:41,560 --> 00:25:42,800 Speaker 1: you get the time is beyond me.
570 00:25:43,200 --> 00:25:44,320 Speaker 3: We as people. 571 00:25:44,760 --> 00:25:47,479 Speaker 1: There's this misconception back in the day, when you're working, 572 00:25:47,520 --> 00:25:50,000 Speaker 1: that if you put in extra hours you're valued, you're 573 00:25:50,040 --> 00:25:54,200 Speaker 1: going to be irreplaceable. But ultimately that's not the case. 574 00:25:54,280 --> 00:25:56,240 Speaker 1: You need to value your time. You spend more time 575 00:25:56,280 --> 00:25:58,639 Speaker 1: with your work colleagues than you do with the people 576 00:25:58,680 --> 00:26:01,480 Speaker 1: you choose to love, your partners, your kids, whatever it 577 00:26:01,480 --> 00:26:03,520 Speaker 1: may be. So if you can get time back using 578 00:26:03,560 --> 00:26:05,600 Speaker 1: AI so you can spend it with the people you love, 579 00:26:05,840 --> 00:26:06,600 Speaker 1: why wouldn't you? 580 00:26:06,680 --> 00:26:09,359 Speaker 2: One hundred percent. And I think, yeah, if I 581 00:26:09,440 --> 00:26:11,879 Speaker 2: had, not so much fear, but if I had, like, 582 00:26:11,920 --> 00:26:15,280 Speaker 2: confusion or any hesitation, as soon as I saw that... 583 00:26:15,800 --> 00:26:19,320 Speaker 2: time is my most precious commodity over any other value 584 00:26:19,320 --> 00:26:21,520 Speaker 2: in my life. If it helps me get more of 585 00:26:21,560 --> 00:26:24,320 Speaker 2: that and do everything I do the same but more efficiently, 586 00:26:25,080 --> 00:26:25,840 Speaker 2: like, why wouldn't you? 587 00:26:26,080 --> 00:26:28,600 Speaker 1: Exactly, money comes and goes; time's gone. 588 00:26:28,440 --> 00:26:30,280 Speaker 2: You can never make that back. Yeah, but you can 589 00:26:30,320 --> 00:26:33,520 Speaker 2: optimize it. And I think that's really important.
The kind 590 00:26:33,520 --> 00:26:36,080 Speaker 2: of connected fear with it taking over our lives is 591 00:26:36,119 --> 00:26:39,640 Speaker 2: whether or not it can. I think there are two 592 00:26:39,640 --> 00:26:41,680 Speaker 2: concepts that are kind of similar: can it make the 593 00:26:41,720 --> 00:26:46,200 Speaker 2: same decisions as a human brain, and can it be autonomous? 594 00:26:46,240 --> 00:26:48,760 Speaker 2: And I think the big fear around saying please and 595 00:26:48,760 --> 00:26:50,640 Speaker 2: thank you, that's been really big. I want to ask 596 00:26:50,680 --> 00:26:53,479 Speaker 2: you that; that's been a big thing in the news recently, 597 00:26:53,640 --> 00:26:56,040 Speaker 2: should you say please and thank you? Because, can AI 598 00:26:56,359 --> 00:26:59,639 Speaker 2: make decisions without you prompting it? And I think the 599 00:26:59,680 --> 00:27:02,679 Speaker 2: answer is no, we still need human prompts. But I 600 00:27:02,720 --> 00:27:06,320 Speaker 2: think the fear is that they're bubbling away, your AI 601 00:27:06,400 --> 00:27:08,560 Speaker 2: is bubbling away, like, judging you and getting ready to 602 00:27:08,600 --> 00:27:11,040 Speaker 2: take over your life. Explain to us why that's not 603 00:27:11,080 --> 00:27:14,240 Speaker 2: a concern, but why you might still say please and 604 00:27:14,359 --> 00:27:14,800 Speaker 2: thank you. 605 00:27:15,080 --> 00:27:19,120 Speaker 1: I mean, let's be real. AI is still in its 606 00:27:19,119 --> 00:27:22,480 Speaker 1: own state, kind of like a machine. It's not something 607 00:27:22,520 --> 00:27:24,880 Speaker 1: that has a brain that thinks. So even saying please 608 00:27:24,880 --> 00:27:27,080 Speaker 1: and thank you, that's just a word to AI. It 609 00:27:27,119 --> 00:27:30,719 Speaker 1: doesn't actually have the emotion behind it. So that's just 610 00:27:30,880 --> 00:27:32,320 Speaker 1: something I do just in case.
611 00:27:32,400 --> 00:27:35,240 Speaker 2: But realistically, I don't. 612 00:27:35,560 --> 00:27:37,920 Speaker 1: Look, this is my own personal opinion, but I really 613 00:27:37,920 --> 00:27:40,720 Speaker 1: don't think it's gonna uprise and come for me. But 614 00:27:40,800 --> 00:27:45,280 Speaker 1: I'm still backing myself. Yeah, but from, like, a standpoint 615 00:27:45,320 --> 00:27:48,960 Speaker 1: of uprising and all of the fear mongering that we have, look, 616 00:27:49,320 --> 00:27:51,960 Speaker 1: people are the ones that are creating these AI bots, 617 00:27:52,000 --> 00:27:53,880 Speaker 1: they're the ones that are teaching it, they're the ones 618 00:27:53,920 --> 00:27:56,840 Speaker 1: that are giving it reference points. So if there are 619 00:27:56,920 --> 00:28:00,199 Speaker 1: any issues regarding that, these people can switch them off 620 00:28:00,240 --> 00:28:02,800 Speaker 1: in a heartbeat. Like, there is still some rigorous testing, 621 00:28:02,800 --> 00:28:05,040 Speaker 1: because there are people that are so curious. Like, as people, 622 00:28:05,080 --> 00:28:07,840 Speaker 1: we are naturally curious to see the what if. So 623 00:28:07,880 --> 00:28:10,960 Speaker 1: there's probably people out there trying to build something, but 624 00:28:11,119 --> 00:28:13,919 Speaker 1: they're also still in control. They're still controlling, that they 625 00:28:13,920 --> 00:28:16,720 Speaker 1: can switch it off, because AI needs power. If you 626 00:28:16,720 --> 00:28:18,520 Speaker 1: don't give it that, if you just flick the switch off, 627 00:28:18,520 --> 00:28:24,320 Speaker 1: well, it's ultimately gone, it can't keep going. Yeah, again, that's 628 00:28:24,359 --> 00:28:26,320 Speaker 1: my own personal opinion. Like, maybe if they're, I 629 00:28:26,359 --> 00:28:29,640 Speaker 1: don't know, hooking it up to sustainable energy like solar power, 630 00:28:29,680 --> 00:28:32,399 Speaker 1: who knows what's going to happen.
But ultimately I don't 631 00:28:32,560 --> 00:28:36,239 Speaker 1: personally see that issue coming along, at least not in 632 00:28:36,240 --> 00:28:38,920 Speaker 1: my generation. Who knows what the future holds, because people 633 00:28:38,920 --> 00:28:42,120 Speaker 1: are curious. We've all seen horror movies. You just... you 634 00:28:42,200 --> 00:28:44,800 Speaker 1: don't know. But I think people need to really move 635 00:28:44,840 --> 00:28:46,920 Speaker 1: on from that fear mongering and that scare tactic and 636 00:28:47,040 --> 00:28:50,400 Speaker 1: just take advantage of trying to get their time back. Yeah, 637 00:28:50,440 --> 00:28:52,680 Speaker 1: that is what we need to value more, because once 638 00:28:52,720 --> 00:28:56,160 Speaker 1: it's gone, it's gone. So, like you yourself, you have 639 00:28:56,200 --> 00:28:58,240 Speaker 1: such a young family, why would you not want to 640 00:28:58,240 --> 00:29:00,320 Speaker 1: go spend time with your kids and watch them grow 641 00:29:00,440 --> 00:29:01,960 Speaker 1: up and not miss those milestones? 642 00:29:02,400 --> 00:29:04,680 Speaker 2: Yeah. Absolutely, and that's, I mean, that is my next 643 00:29:04,760 --> 00:29:07,680 Speaker 2: question for you, is what are some of the biggest ways, 644 00:29:07,760 --> 00:29:12,080 Speaker 2: particularly for people who haven't yet kind of had that, 645 00:29:12,240 --> 00:29:15,440 Speaker 2: like, aha moment where it's clicked over of, this literally 646 00:29:15,560 --> 00:29:18,480 Speaker 2: is the way. Like, tangible examples of things you use 647 00:29:18,560 --> 00:29:21,440 Speaker 2: that help you work smarter, so our listeners can adopt 648 00:29:21,760 --> 00:29:25,040 Speaker 2: them without needing to understand AI. You can use HP AI 649 00:29:25,080 --> 00:29:27,960 Speaker 2: Companion without needing to know how it works.
You literally 650 00:29:28,040 --> 00:29:30,560 Speaker 2: open the window and type in a question and it'll 651 00:29:30,600 --> 00:29:33,240 Speaker 2: answer it for you. Literally, like, what are some tangible 652 00:29:33,240 --> 00:29:35,480 Speaker 2: examples of things people can introduce into their lives? 653 00:29:35,520 --> 00:29:37,920 Speaker 1: So obviously, as I mentioned, I have aphantasia. I 654 00:29:37,960 --> 00:29:41,080 Speaker 1: can't visualize anything, but I can type things, and I 655 00:29:41,080 --> 00:29:43,080 Speaker 1: can ask questions, and I can ask it to generate 656 00:29:43,120 --> 00:29:45,360 Speaker 1: something for me. So now I have my own version 657 00:29:45,560 --> 00:29:48,480 Speaker 1: of what you visualize. AI is such a good tool 658 00:29:48,600 --> 00:29:52,840 Speaker 1: from a disability and even just, like, an advocacy standpoint, 659 00:29:52,880 --> 00:29:55,040 Speaker 1: because if there's stuff that's missing that you can't 660 00:29:55,040 --> 00:29:56,680 Speaker 1: have, it can fill in some of the gaps for you. 661 00:29:56,760 --> 00:29:58,960 Speaker 2: I didn't even think of it. Yeah, you can't picture it, 662 00:29:59,000 --> 00:30:01,640 Speaker 2: so you type in the description and a picture comes 663 00:30:01,680 --> 00:30:02,200 Speaker 2: out. And I'm... 664 00:30:02,040 --> 00:30:04,120 Speaker 1: Like, yeah, can you please create this?
So at the moment, 665 00:30:04,240 --> 00:30:07,480 Speaker 1: for my role, I do a lot of conferences 666 00:30:07,520 --> 00:30:09,720 Speaker 1: and I like to create booths, and I'm really creative 667 00:30:09,800 --> 00:30:11,160 Speaker 1: in what I want to do, but I can never 668 00:30:11,240 --> 00:30:13,400 Speaker 1: picture it, and then trying to explain it to my 669 00:30:13,440 --> 00:30:15,800 Speaker 1: colleagues, to say this is what I want... like, my 670 00:30:15,800 --> 00:30:17,560 Speaker 1: poor colleague Chear has to deal with a lot 671 00:30:17,560 --> 00:30:20,160 Speaker 1: of my crap, and a lot of the times she's like, 672 00:30:20,240 --> 00:30:22,080 Speaker 1: I don't know what you're saying. So I use 673 00:30:22,120 --> 00:30:25,200 Speaker 1: AI to kind of create a visualization, so that I 674 00:30:25,200 --> 00:30:28,240 Speaker 1: can actually present that back, because my descriptions aren't great. 675 00:30:28,440 --> 00:30:30,360 Speaker 1: And I know that even in my house at the moment, 676 00:30:30,400 --> 00:30:32,880 Speaker 1: my partner is sick of my stuff, my ideas, 677 00:30:32,960 --> 00:30:35,560 Speaker 1: because I can't picture it. But I'm like, you know what, 678 00:30:35,640 --> 00:30:37,520 Speaker 1: that TV cabinet looks really good, and she's like, that 679 00:30:37,560 --> 00:30:39,200 Speaker 1: will clash with the colors of the house. But I 680 00:30:39,240 --> 00:30:41,680 Speaker 1: can't picture it. I think it looks great, but I 681 00:30:41,720 --> 00:30:45,480 Speaker 1: see it as a single item, not multiple.
But we 682 00:30:45,560 --> 00:30:48,520 Speaker 1: actually used AI where we were able to incorporate the 683 00:30:48,520 --> 00:30:50,960 Speaker 1: photo image in a three D created a three D 684 00:30:51,040 --> 00:30:54,280 Speaker 1: rendering of our home, use the three D rendering of 685 00:30:54,280 --> 00:30:56,440 Speaker 1: that particular product put in the laund room, only for 686 00:30:56,440 --> 00:31:01,080 Speaker 1: me to realize, yeah, I was wrong, and I'm very 687 00:31:01,560 --> 00:31:03,640 Speaker 1: very aware that I am not going to be correct 688 00:31:03,640 --> 00:31:04,280 Speaker 1: all of the time. 689 00:31:04,600 --> 00:31:07,240 Speaker 2: But for instance, like what is that like? I don't 690 00:31:07,280 --> 00:31:09,440 Speaker 2: know what that's like because I'm not right all the time. 691 00:31:09,680 --> 00:31:13,160 Speaker 1: I can't relate. Unless it's work related, then I am 692 00:31:13,160 --> 00:31:17,680 Speaker 1: a genius if anyone else. No one questioned me, But yeah, 693 00:31:17,800 --> 00:31:19,800 Speaker 1: I think when it comes down to it, like drawing 694 00:31:19,840 --> 00:31:21,800 Speaker 1: it back to product and stuff. So for me in 695 00:31:21,800 --> 00:31:23,800 Speaker 1: my personal life, that's one way. But obviously from a 696 00:31:23,840 --> 00:31:27,320 Speaker 1: work perspective, asking my AI bot on my laptop so 697 00:31:27,440 --> 00:31:30,200 Speaker 1: AI companion to actually create those renderings for me, I 698 00:31:30,240 --> 00:31:33,040 Speaker 1: can also utilize the AI companion within the PC to 699 00:31:33,040 --> 00:31:36,440 Speaker 1: actually optimize my computer. 
So then the role of having 700 00:31:36,440 --> 00:31:38,640 Speaker 1: to worry about technicians kind of goes out the door 701 00:31:38,680 --> 00:31:42,520 Speaker 1: because the PC can actually adapt to how you use your 702 00:31:42,520 --> 00:31:45,640 Speaker 1: PC for yourself, so you don't have to worry constantly 703 00:31:45,680 --> 00:31:48,239 Speaker 1: about going into the BIOS or breaking the laptop. The 704 00:31:48,240 --> 00:31:50,120 Speaker 1: software does it all for you. Oh my god, that's 705 00:31:50,160 --> 00:31:52,640 Speaker 1: so so cool, right, And then you've got the efficiency piece. 706 00:31:52,680 --> 00:31:54,800 Speaker 1: Like everybody wants a laptop that's going to last days. 707 00:31:55,120 --> 00:31:56,760 Speaker 1: What if I told you that actually exists? Like the 708 00:31:56,800 --> 00:31:58,880 Speaker 1: laptops that we're currently using, you're looking at like twenty 709 00:31:58,920 --> 00:32:02,480 Speaker 1: something hours of battery life, because AI Companion is able 710 00:32:02,520 --> 00:32:05,640 Speaker 1: to actually switch off specific things. It's able to look 711 00:32:05,640 --> 00:32:08,080 Speaker 1: at what the RAM is being utilized for, shrink it down, 712 00:32:08,320 --> 00:32:10,240 Speaker 1: lower the demand so that it can actually tailor to 713 00:32:10,280 --> 00:32:12,120 Speaker 1: what you're actually using it for, to give you that 714 00:32:12,160 --> 00:32:15,560 Speaker 1: all day battery life. You close the lid, it automatically 715 00:32:15,600 --> 00:32:17,760 Speaker 1: adapts again because you have HP Smart Sense. So you 716 00:32:17,800 --> 00:32:20,080 Speaker 1: close the lid and you put it in your backpack. Four 717 00:32:20,160 --> 00:32:21,880 Speaker 1: days later, you realize you forgot to turn it off 718 00:32:21,920 --> 00:32:24,120 Speaker 1: and it's only dropped by ten percent.
Like, there's so 719 00:32:24,240 --> 00:32:26,720 Speaker 1: many different things that you can utilize without even needing 720 00:32:26,760 --> 00:32:28,960 Speaker 1: to do anything. And I think that that, within itself, 721 00:32:29,040 --> 00:32:31,280 Speaker 1: is one tickbox. You click the button, done and dusted. 722 00:32:31,320 --> 00:32:33,320 Speaker 1: You don't have to think about it anymore. And a 723 00:32:33,320 --> 00:32:34,800 Speaker 1: lot of the times people are just scared that they're 724 00:32:34,800 --> 00:32:35,400 Speaker 1: going to break it. 725 00:32:35,640 --> 00:32:38,680 Speaker 2: Okay, Well, that actually highlights another thing that confuses a 726 00:32:38,680 --> 00:32:41,480 Speaker 2: lot of people about AI, and that confused me before 727 00:32:41,600 --> 00:32:44,520 Speaker 2: you explained it recently. So I think most people, when 728 00:32:44,520 --> 00:32:47,240 Speaker 2: they think of AI, are thinking of the AI bot 729 00:32:47,400 --> 00:32:49,760 Speaker 2: or the best friend in a search engine that you 730 00:32:49,840 --> 00:32:52,360 Speaker 2: can pretty much ask anything. You type a question in 731 00:32:52,400 --> 00:32:55,600 Speaker 2: a search box and it answers your questions or creates 732 00:32:55,600 --> 00:32:57,600 Speaker 2: whatever you want it to create. A lot of listeners 733 00:32:57,640 --> 00:33:00,360 Speaker 2: will have used ChatGPT for that. We both have 734 00:33:00,520 --> 00:33:02,720 Speaker 2: our HP OmniBooks in front of us, and they have 735 00:33:02,800 --> 00:33:06,920 Speaker 2: an inbuilt version called HP AI Companion that we've been talking about. 736 00:33:07,360 --> 00:33:10,760 Speaker 2: I use this daily for things like asking it to 737 00:33:10,760 --> 00:33:13,600 Speaker 2: write the podcast intros for me.
So I'll literally type 738 00:33:13,600 --> 00:33:16,520 Speaker 2: in write a brief introduction and bio for Lauren Dennett, 739 00:33:16,520 --> 00:33:18,680 Speaker 2: and it will search the entire Internet and spit out 740 00:33:18,720 --> 00:33:21,320 Speaker 2: within seconds the text for me to then tweak and 741 00:33:21,360 --> 00:33:23,920 Speaker 2: put into my own words. But that saves so much time. 742 00:33:24,520 --> 00:33:27,239 Speaker 2: Or I'll say, list the main misconceptions people might have 743 00:33:27,280 --> 00:33:29,360 Speaker 2: about AI, and that's how I build the questions for 744 00:33:29,440 --> 00:33:31,760 Speaker 2: this episode instead of, you know, spending time thinking it 745 00:33:31,840 --> 00:33:34,960 Speaker 2: up in my own brain first. But then you've been 746 00:33:35,000 --> 00:33:39,640 Speaker 2: talking about optimizing battery life and AI working like an 747 00:33:39,640 --> 00:33:42,880 Speaker 2: inbuilt technician and switching things on and off without you 748 00:33:42,960 --> 00:33:46,160 Speaker 2: even asking it to. So when you talk about that 749 00:33:46,360 --> 00:33:49,840 Speaker 2: kind of function, like AI optimizing our devices or our 750 00:33:49,920 --> 00:33:53,120 Speaker 2: apps without us prompting it, what does that mean? Like, 751 00:33:53,200 --> 00:33:56,000 Speaker 2: what is the difference between those functions? 752 00:33:56,160 --> 00:33:57,880 Speaker 1: A lot of the time you say AI and people 753 00:33:57,880 --> 00:34:01,760 Speaker 1: automatically go straight to ChatGPT, and Microsoft have obviously 754 00:34:01,760 --> 00:34:05,040 Speaker 1: started bringing in Copilot and they're starting to be that 755 00:34:05,160 --> 00:34:08,800 Speaker 1: next in demand. But for us at HP, we have HP AI Companion, 756 00:34:08,840 --> 00:34:11,239 Speaker 1: which is only on our units.
But to draw back 757 00:34:11,239 --> 00:34:13,560 Speaker 1: to your question around like what is the difference. So 758 00:34:14,120 --> 00:34:16,600 Speaker 1: for an overarching thing, you've got to look at how 759 00:34:16,640 --> 00:34:20,960 Speaker 1: PCs are built, right? So ordinarily we're all familiar with CPUs, GPUs. 760 00:34:21,480 --> 00:34:23,480 Speaker 1: Now with AI, we have a thing called the NPU, 761 00:34:23,520 --> 00:34:26,880 Speaker 1: which is a neural processing unit. So it's extra jargon 762 00:34:26,960 --> 00:34:28,839 Speaker 1: that no one ever wants to know about. So if 763 00:34:28,840 --> 00:34:30,720 Speaker 1: I were to bring it to you in a really 764 00:34:30,800 --> 00:34:33,920 Speaker 1: simplistic way that most people will understand: you've got 765 00:34:33,960 --> 00:34:36,120 Speaker 1: your CPU, which is the brain of the PC. You've 766 00:34:36,120 --> 00:34:39,360 Speaker 1: got your GPU, which is your muscles. Now, my own interpretation, 767 00:34:39,440 --> 00:34:41,799 Speaker 1: so again I'm not holding HP accountable, by the way, 768 00:34:41,840 --> 00:34:43,400 Speaker 1: for any of this, but this is my own way 769 00:34:43,440 --> 00:34:46,120 Speaker 1: of explaining it to make it make sense, 770 00:34:46,560 --> 00:34:49,920 Speaker 1: is by referring to the NPU as a therapist. Personally, 771 00:34:50,040 --> 00:34:52,759 Speaker 1: I love therapy. I love going to talk to my 772 00:34:52,840 --> 00:34:55,799 Speaker 1: psychologist after I've had a ripping week, or if I'm 773 00:34:55,800 --> 00:34:58,359 Speaker 1: exhausted and I just need to just absolutely brain dump 774 00:34:58,400 --> 00:35:02,120 Speaker 1: everything that's happening, then go to them. Basically, get rid 775 00:35:02,200 --> 00:35:04,720 Speaker 1: of everything, go home and feel refreshed, feel more efficient, 776 00:35:04,880 --> 00:35:08,360 Speaker 1: sleep better, not feel exhausted constantly.
That is basically what 777 00:35:08,400 --> 00:35:10,160 Speaker 1: the NPU is doing for the PC when it comes 778 00:35:10,200 --> 00:35:14,880 Speaker 1: to AI driven tasks. So there are localized AI tasks, 779 00:35:15,040 --> 00:35:19,680 Speaker 1: like basically using some parts of AI Companion, using particular 780 00:35:19,800 --> 00:35:22,760 Speaker 1: camera tools, so you know, when you open your calls 781 00:35:22,800 --> 00:35:25,239 Speaker 1: and you've got background blur on, or you might want 782 00:35:25,239 --> 00:35:27,279 Speaker 1: it to track your face because you've got ADHD and 783 00:35:27,280 --> 00:35:29,160 Speaker 1: you want the camera to just keep an eye on you. 784 00:35:29,719 --> 00:35:31,760 Speaker 1: Or you've got the creepy tool where you can basically 785 00:35:31,880 --> 00:35:34,400 Speaker 1: keep the camera on and if you blink or if 786 00:35:34,440 --> 00:35:37,480 Speaker 1: you look away, it still tracks your eyes. Freaks me out, 787 00:35:37,520 --> 00:35:39,560 Speaker 1: by the way, don't turn it on. But that is 788 00:35:39,640 --> 00:35:42,719 Speaker 1: using the NPU because it's an AI driven task. So 789 00:35:42,840 --> 00:35:45,840 Speaker 1: again it makes the PC more efficient instead of requiring 790 00:35:45,840 --> 00:35:48,280 Speaker 1: lots of power that is then reserved for the CPU 791 00:35:48,360 --> 00:35:50,200 Speaker 1: and GPU to do what they need to do. So: 792 00:35:50,680 --> 00:35:54,799 Speaker 1: brain, muscles, therapist. Wow, that is my breakdown. But in 793 00:35:54,880 --> 00:35:58,400 Speaker 1: terms of like categorizing AI and how it can be utilized, look, 794 00:35:58,600 --> 00:36:01,000 Speaker 1: you've got your cloud based one, which is where most 795 00:36:01,040 --> 00:36:03,600 Speaker 1: people are kind of concerned, because obviously they're thinking, where's 796 00:36:03,600 --> 00:36:06,239 Speaker 1: my information going, where are my questions going?
Where's all 797 00:36:06,239 --> 00:36:08,680 Speaker 1: this going? And yeah, it's sitting up here. I can't 798 00:36:08,680 --> 00:36:10,799 Speaker 1: tell you where it's going because it depends on what 799 00:36:10,920 --> 00:36:15,279 Speaker 1: you're using where it's going. Look, ultimately, no one's going 800 00:36:15,360 --> 00:36:18,080 Speaker 1: to really care where your twenty twenty one photos from 801 00:36:18,120 --> 00:36:21,680 Speaker 1: Bali are going, to be honest. But I 802 00:36:21,600 --> 00:36:23,359 Speaker 2: think about that all the time. It's so weird, and 803 00:36:23,360 --> 00:36:25,480 Speaker 2: I'm like, it's just all your searches about seventy five 804 00:36:25,480 --> 00:36:28,680 Speaker 2: different zucchini slice recipes. Like, no one cares. 805 00:36:29,600 --> 00:36:31,359 Speaker 1: Or your mind's just like, am I allergic to this? 806 00:36:32,640 --> 00:36:33,800 Speaker 3: Doctor Google, please help me. 807 00:36:33,800 --> 00:36:35,799 Speaker 2: Man, what does this mean? 808 00:36:36,840 --> 00:36:40,680 Speaker 1: Yeah. So look, people post, like, all of their 809 00:36:40,680 --> 00:36:43,640 Speaker 1: social media photos constantly. What makes you think that's any 810 00:36:43,640 --> 00:36:46,120 Speaker 1: different to what AI is doing with that data? So 811 00:36:46,239 --> 00:36:48,200 Speaker 1: I think, really, it took us a while to 812 00:36:48,239 --> 00:36:50,479 Speaker 1: adopt social media, and I have no doubt that AI 813 00:36:50,560 --> 00:36:53,960 Speaker 1: is at a similar standpoint, so I think we'll adopt it. Inevitably, 814 00:36:54,000 --> 00:36:55,759 Speaker 1: people will start to feel more and more comfortable once 815 00:36:55,800 --> 00:36:58,960 Speaker 1: they realize the benefits of the efficiencies that are offered. 816 00:36:59,360 --> 00:37:01,480 Speaker 1: But like I said, drawing back to it: you've got cloud, 817 00:37:01,520 --> 00:37:04,840 Speaker 1: you've got a localized version.
So at the moment with AI Companion, 818 00:37:04,880 --> 00:37:07,800 Speaker 1: it's currently all cloud based, but it can use localized 819 00:37:07,840 --> 00:37:10,000 Speaker 1: information on the PC. So when you ask it questions, 820 00:37:10,040 --> 00:37:11,839 Speaker 1: it draws up, goes up here, it comes back down, 821 00:37:11,880 --> 00:37:14,359 Speaker 1: gives you the response. But there are parts of it. 822 00:37:14,440 --> 00:37:17,080 Speaker 1: So you've got three key points to AI Companion. You've 823 00:37:17,120 --> 00:37:19,799 Speaker 1: got analyzing, you've got ask it questions, and then you've 824 00:37:19,840 --> 00:37:22,839 Speaker 1: got performance. So for you specifically, you've done a lot 825 00:37:22,880 --> 00:37:25,040 Speaker 1: of like the asking, but you've also done a bit 826 00:37:25,080 --> 00:37:27,960 Speaker 1: of performance without realizing. And more importantly, you've got the 827 00:37:28,040 --> 00:37:31,439 Speaker 1: analyzed section where it allows you to have a briefing doc. 828 00:37:31,719 --> 00:37:34,279 Speaker 1: You can upload it directly to AI Companion and ask 829 00:37:34,400 --> 00:37:37,640 Speaker 1: questions of that reference point. So it's your information. It's 830 00:37:37,719 --> 00:37:40,160 Speaker 1: specifically for you. No one else is going to have 831 00:37:40,239 --> 00:37:42,920 Speaker 1: that particular file. That's your file, so you can actually 832 00:37:42,920 --> 00:37:45,680 Speaker 1: have a conversation with the AI on that file. So again, 833 00:37:45,800 --> 00:37:48,600 Speaker 1: I think it's really great from an inclusion standpoint because 834 00:37:48,640 --> 00:37:51,120 Speaker 1: AI is going to make life easier for those that 835 00:37:51,239 --> 00:37:54,879 Speaker 1: may not have the same communication abilities. They may have 836 00:37:54,960 --> 00:37:57,880 Speaker 1: ADHD and may not be able to read specific parts. 
837 00:37:57,880 --> 00:38:00,000 Speaker 1: But if you can copy that and get the AI 838 00:38:00,040 --> 00:38:01,640 Speaker 1: to actually read it out loud and treat it like 839 00:38:01,640 --> 00:38:05,239 Speaker 1: a conversation or a podcast. I retain information so much 840 00:38:05,280 --> 00:38:08,920 Speaker 1: better by listening to podcasts like what you create. But 841 00:38:09,160 --> 00:38:11,759 Speaker 1: if you give me the same thing as a transcript 842 00:38:11,800 --> 00:38:13,400 Speaker 1: on a page, you better believe I'm not going to 843 00:38:13,400 --> 00:38:17,120 Speaker 1: get through that. I couldn't think of anything worse. So if, yeah, 844 00:38:17,239 --> 00:38:20,800 Speaker 1: you use tools like AI to create conversations that ordinarily 845 00:38:20,800 --> 00:38:21,040 Speaker 1: are on 846 00:38:21,040 --> 00:38:22,600 Speaker 2: a piece of paper, why wouldn't you want to do that? 847 00:38:23,120 --> 00:38:25,680 Speaker 2: So yeah, yeah, one hundred percent. That's one of the 848 00:38:25,719 --> 00:38:27,759 Speaker 2: things that I didn't know it could do. Like I 849 00:38:27,800 --> 00:38:30,200 Speaker 2: was always just using it more like a search engine function, 850 00:38:30,360 --> 00:38:34,560 Speaker 2: and that was with AI Companion on my device, about 851 00:38:34,560 --> 00:38:36,480 Speaker 2: the device, but about anything. I mean, when I created 852 00:38:36,520 --> 00:38:38,560 Speaker 2: a video, I was like, make a video of my 853 00:38:38,640 --> 00:38:42,520 Speaker 2: husband and I as bears. Like, it can literally do anything. Yeah, 854 00:38:42,560 --> 00:38:45,920 Speaker 2: and the same with the more, yeah, cloud based ones like ChatGPT, 855 00:38:46,000 --> 00:38:48,000 Speaker 2: I would do the same thing, I'd ask it to 856 00:38:48,040 --> 00:38:50,080 Speaker 2: write my intros. Like, that's so time saving because I 857 00:38:50,080 --> 00:38:51,799 Speaker 2: spent so long on the wording.
And even though, of 857 00:38:51,840 --> 00:38:53,680 Speaker 2: course, you tailor it, you make edits 858 00:38:53,719 --> 00:38:56,640 Speaker 2: so it's in your own voice, starting with a block 859 00:38:56,680 --> 00:38:58,560 Speaker 2: of text is so much more useful. But what I 860 00:38:58,600 --> 00:39:01,040 Speaker 2: had never used it for until you explained analyze to 861 00:39:01,080 --> 00:39:04,320 Speaker 2: me is, yeah, uploading a document and then getting AI 862 00:39:04,400 --> 00:39:06,799 Speaker 2: to analyze it for you. So for anyone who runs 863 00:39:06,800 --> 00:39:10,720 Speaker 2: a business and you need to do time consuming tasks 864 00:39:10,760 --> 00:39:15,200 Speaker 2: like uploading bank statements, where you're like, please 865 00:39:15,239 --> 00:39:17,319 Speaker 2: tell me how much I spent on Uber Eats, or 866 00:39:17,320 --> 00:39:22,120 Speaker 2: like, highlight, don't do that, I swear. But you 867 00:39:22,120 --> 00:39:24,279 Speaker 2: can literally take something that you would otherwise go through line 868 00:39:24,320 --> 00:39:26,560 Speaker 2: by line yourself and add it all up. You can 869 00:39:26,760 --> 00:39:29,960 Speaker 2: upload a document and say what is the key thesis 870 00:39:29,960 --> 00:39:33,279 Speaker 2: of this article, or what is the main expense that 871 00:39:33,320 --> 00:39:35,120 Speaker 2: you see coming up over and over again? Like, you 872 00:39:35,160 --> 00:39:37,880 Speaker 2: can ask it to read it quickly for you and 873 00:39:37,920 --> 00:39:42,360 Speaker 2: then spit out answers. It is revolutionary for the time 874 00:39:42,400 --> 00:39:46,200 Speaker 2: consuming tasks. And in editing, it also does things like, 875 00:39:46,719 --> 00:39:49,200 Speaker 2: the way AI has really helped with podcast and video 876 00:39:49,280 --> 00:39:51,799 Speaker 2: editing is there are lots of pauses, like I have 877 00:39:52,040 --> 00:39:53,800 Speaker 2: so many, yes.
879 00:39:53,800 --> 00:39:56,480 Speaker 1: Just classic silences where you're like, my brain's thinking, what 880 00:39:56,480 --> 00:39:58,479 Speaker 1: do I need to say here? I need to see 881 00:39:58,520 --> 00:40:00,360 Speaker 1: your brain ticking over. 882 00:40:01,160 --> 00:40:04,040 Speaker 2: Ums and ahs, and instead of, I used to manually 883 00:40:04,080 --> 00:40:05,080 Speaker 2: go through and edit. 884 00:40:05,680 --> 00:40:06,920 Speaker 1: How many hours is that going to take you? 885 00:40:07,000 --> 00:40:09,520 Speaker 2: Like hours, hours. Because you re-listen to the episode 886 00:40:09,520 --> 00:40:12,040 Speaker 2: at like half speed to cut, you know, basically half 887 00:40:12,040 --> 00:40:12,839 Speaker 2: to give. 888 00:40:12,760 --> 00:40:14,640 Speaker 3: Your own voice, are you? Oh my god. And the 889 00:40:14,680 --> 00:40:15,879 Speaker 3: guest flows for the time. 890 00:40:15,880 --> 00:40:17,799 Speaker 2: It's like killed my buzz by the end of the 891 00:40:17,800 --> 00:40:19,640 Speaker 2: episode because all I've heard is the ums and not 892 00:40:19,680 --> 00:40:22,239 Speaker 2: the actual content. But now, like, AI can do, there 893 00:40:22,280 --> 00:40:24,960 Speaker 2: are settings in your apps where it's like cut out 894 00:40:24,960 --> 00:40:27,439 Speaker 2: all the ums, and it will just process the entire 895 00:40:27,480 --> 00:40:29,879 Speaker 2: file in like a minute and all the ums are gone. 896 00:40:29,880 --> 00:40:32,839 Speaker 2: And I'm like, why have we not all been using 897 00:40:32,880 --> 00:40:34,080 Speaker 2: these tools? Game changing. 898 00:40:34,080 --> 00:40:36,000 Speaker 1: And do you know what? It's kind of like, the 899 00:40:36,120 --> 00:40:39,839 Speaker 1: slightly bitter part of me is mad because students get 900 00:40:39,840 --> 00:40:40,640 Speaker 1: to benefit from this. 901 00:40:40,960 --> 00:40:43,320 Speaker 3: Yes, the Gen Zs, the gen whatever.
902 00:40:43,719 --> 00:40:46,759 Speaker 1: Oh god, now I'm really showing my age. Yeah, young 903 00:40:47,320 --> 00:40:51,040 Speaker 1: whipper snappers, honestly. 904 00:40:52,480 --> 00:40:56,239 Speaker 2: Using the Facebook and the Google. I am hip with 905 00:40:56,320 --> 00:40:58,239 Speaker 2: the trends, yes I am. 906 00:41:00,280 --> 00:41:03,719 Speaker 1: Just reminds me of the guy from Grown Ups where 907 00:41:03,719 --> 00:41:04,920 Speaker 1: he turns up with the skateboard. 908 00:41:04,960 --> 00:41:08,000 Speaker 2: Oh yeah, exactly that. That's us, but the whipper snappers 909 00:41:08,040 --> 00:41:09,839 Speaker 2: and so. 910 00:41:10,480 --> 00:41:12,400 Speaker 1: But yeah, drawing back to that, like, I think of 911 00:41:12,480 --> 00:41:14,200 Speaker 1: all the people that are going back to study, uni, 912 00:41:14,400 --> 00:41:17,720 Speaker 1: high schools. Unis are actually starting to pay for subscriptions 913 00:41:17,800 --> 00:41:20,840 Speaker 1: so that people can kind of get ahead and 914 00:41:20,920 --> 00:41:23,360 Speaker 1: know what is being used, because obviously if the unis 915 00:41:23,360 --> 00:41:25,239 Speaker 1: pay for the subscription, they get to see what people 916 00:41:25,239 --> 00:41:27,279 Speaker 1: are actually utilizing the AI for, which I think is 917 00:41:27,280 --> 00:41:30,240 Speaker 1: really smart because they're going to do it anyway. But honestly, 918 00:41:30,280 --> 00:41:32,400 Speaker 1: the hardest part of starting a lot of assignments and 919 00:41:32,440 --> 00:41:34,719 Speaker 1: things is literally that first question you mentioned, like how 920 00:41:34,719 --> 00:41:37,000 Speaker 1: do you start the brief, how do you start the podcast? 921 00:41:37,000 --> 00:41:39,560 Speaker 1: How are you doing the welcoming?
If you can get 922 00:41:39,600 --> 00:41:42,520 Speaker 1: it to do that, like, imagine how much time you save 923 00:41:42,600 --> 00:41:45,279 Speaker 1: when that block no longer exists, and I think that that is 924 00:41:45,320 --> 00:41:47,319 Speaker 1: really cool. So again, yeah, the bitter person in me 925 00:41:47,400 --> 00:41:49,840 Speaker 1: is like, that's rubbish. Maybe I would have actually succeeded 926 00:41:50,480 --> 00:41:53,360 Speaker 1: faster if I'd had this benefit, but ultimately 927 00:41:53,480 --> 00:41:56,680 Speaker 1: it's a really cool thing to have, and to see 928 00:41:56,680 --> 00:42:00,160 Speaker 1: someone creative like yourself actually utilizing it for the benefits 929 00:42:00,200 --> 00:42:03,480 Speaker 1: of, like, your podcast. Your creativity is great, but the 930 00:42:03,600 --> 00:42:06,640 Speaker 1: mundane stuff of, you know, tax time's coming up, I 931 00:42:06,680 --> 00:42:08,600 Speaker 1: can't think of anything bloody worse. And I will mention, 932 00:42:08,680 --> 00:42:10,080 Speaker 1: I do not want to see how many times I've 933 00:42:10,120 --> 00:42:12,359 Speaker 1: used Uber Eats because I do not need that shame 934 00:42:12,440 --> 00:42:12,839 Speaker 1: right now. 935 00:42:13,000 --> 00:42:15,120 Speaker 2: I'm really actually upset that I did that exercise. I'm 936 00:42:15,120 --> 00:42:18,799 Speaker 2: not gonna lie. I thought you actually did it to be informative, 937 00:42:18,960 --> 00:42:21,640 Speaker 2: like to shock me into behavior change, and I just 938 00:42:21,680 --> 00:42:22,239 Speaker 2: spiraled to. 939 00:42:22,600 --> 00:42:24,080 Speaker 1: You then order Uber Eats to feel better. 940 00:42:24,880 --> 00:42:26,440 Speaker 2: I was like, I need to emotional eat my way 941 00:42:26,440 --> 00:42:26,759 Speaker 2: out of. 942 00:42:26,680 --> 00:42:29,480 Speaker 3: This hole, feeling terrible about myself.
943 00:42:29,560 --> 00:42:31,640 Speaker 1: I might just get that, you know, that burger. 944 00:42:33,280 --> 00:42:35,680 Speaker 1: But I think the other thing that's really cool, 945 00:42:35,680 --> 00:42:38,319 Speaker 1: even just from like a mundane task perspective, is like 946 00:42:38,440 --> 00:42:42,760 Speaker 1: house contracts, not having to pay somebody tons of money 947 00:42:42,840 --> 00:42:45,480 Speaker 1: to analyze something that you could probably get AI to 948 00:42:45,560 --> 00:42:48,120 Speaker 1: highlight some of the key things, key words. Like, even 949 00:42:48,160 --> 00:42:51,080 Speaker 1: that's really cool. And I actually used that recently myself 950 00:42:51,080 --> 00:42:52,920 Speaker 1: when I was selling a car, like just trying to 951 00:42:52,960 --> 00:42:56,040 Speaker 1: go through that and try and understand it, like the time, 952 00:42:56,200 --> 00:42:58,520 Speaker 1: even just to know, okay, cool. Do I actually trust 953 00:42:58,520 --> 00:43:00,560 Speaker 1: that other person in what they're saying? I can actually 954 00:43:00,560 --> 00:43:02,840 Speaker 1: analyze it myself. It just kind of reduces like the 955 00:43:03,520 --> 00:43:06,920 Speaker 1: remorse or the regret of making that decision if you 956 00:43:06,960 --> 00:43:09,000 Speaker 1: can sort of back yourself. It's really cool. 957 00:43:09,360 --> 00:43:12,439 Speaker 2: Okay, that's actually a great question. In the mundane, day 958 00:43:12,480 --> 00:43:17,480 Speaker 2: to day realm of people's lives, what are a couple 959 00:43:17,520 --> 00:43:21,399 Speaker 2: of major ways, if you were like a total AI 960 00:43:21,440 --> 00:43:23,480 Speaker 2: noob, if you met them and were like, hey, I 961 00:43:23,560 --> 00:43:26,480 Speaker 2: want you to like integrate these three things into these 962 00:43:26,600 --> 00:43:30,600 Speaker 2: mundane, like tax or like editing or like whatever.
What 963 00:43:30,680 --> 00:43:33,160 Speaker 2: would be the three or four use cases that you're like, 964 00:43:33,280 --> 00:43:35,360 Speaker 2: AI is amazing for these things. 965 00:43:35,719 --> 00:43:40,480 Speaker 1: Groceries. What? Yeah, so you can actually 966 00:43:40,640 --> 00:43:44,680 Speaker 1: use AI, you can create like regular lists of groceries 967 00:43:44,719 --> 00:43:47,360 Speaker 1: if you're willing to do tracking, especially for someone like myself. 968 00:43:47,400 --> 00:43:50,920 Speaker 1: I'm neurodiverse. I've got ADHD pretty bad. I can't focus 969 00:43:50,960 --> 00:43:53,120 Speaker 1: on things, so I make lists for everything. Then I 970 00:43:53,160 --> 00:43:56,319 Speaker 1: lose the lists. Nice. So then, obviously, 971 00:43:56,360 --> 00:43:58,080 Speaker 1: I do it again. I'm like, I've got this genius idea. 972 00:43:58,120 --> 00:43:59,759 Speaker 1: My partner's like, we did that last month. Remember how 973 00:43:59,800 --> 00:44:03,200 Speaker 1: well that went? So using AI for a mundane task 974 00:44:03,239 --> 00:44:06,799 Speaker 1: like groceries, or even bill management where money anxiety is there, 975 00:44:06,960 --> 00:44:09,000 Speaker 1: to be able to reduce that stress by being able 976 00:44:09,080 --> 00:44:12,360 Speaker 1: to track where you're at utilizing AI, even AI Companion, 977 00:44:12,400 --> 00:44:14,520 Speaker 1: because the way that it works on our PCs, 978 00:44:14,520 --> 00:44:17,360 Speaker 1: because it's yours, it actually can recall; you can create 979 00:44:17,440 --> 00:44:19,440 Speaker 1: libraries so you can actually save it.
But from an 980 00:44:19,440 --> 00:44:23,239 Speaker 1: AI perspective, outside of even just the AI Companion, having 981 00:44:23,320 --> 00:44:26,799 Speaker 1: the ability to track your usage on an app for 982 00:44:26,880 --> 00:44:30,560 Speaker 1: your groceries, being able to tick a box, because again, ADHD, 983 00:44:30,600 --> 00:44:32,479 Speaker 1: it feels really nice to tick something and say yeah, 984 00:44:32,480 --> 00:44:34,920 Speaker 1: good, because then it feels like you've accomplished something. So 985 00:44:34,960 --> 00:44:36,719 Speaker 1: to be able to do that is really really cool. 986 00:44:36,800 --> 00:44:39,319 Speaker 1: So groceries, huge. Lists, to be able to track what's 987 00:44:39,320 --> 00:44:40,960 Speaker 1: in your fridge and know what you need to order, 988 00:44:41,040 --> 00:44:43,920 Speaker 1: because unless it's up front for me, I don't know it exists. 989 00:44:44,280 --> 00:44:46,880 Speaker 1: So unfortunately within my fridge, you better believe there's probably 990 00:44:47,200 --> 00:44:50,440 Speaker 1: like a whole bloody cob of corn that's just. 991 00:44:50,560 --> 00:44:53,319 Speaker 2: Died really badly a long time ago. 992 00:44:53,760 --> 00:44:55,800 Speaker 1: So to be able to track that is so so handy. 993 00:44:55,840 --> 00:44:58,640 Speaker 1: But from a calendar perspective as well, I use AI 994 00:44:58,760 --> 00:45:01,840 Speaker 1: with my partner to actually understand what is happening in 995 00:45:01,880 --> 00:45:05,760 Speaker 1: her schedule. She's a midwife, and so she's doing really 996 00:45:06,040 --> 00:45:08,839 Speaker 1: really crazy hours. And then I travel a lot for work, 997 00:45:08,920 --> 00:45:11,320 Speaker 1: so I never know when we're home.
So we utilize 998 00:45:11,320 --> 00:45:14,240 Speaker 1: it to be able to understand our schedules and, more importantly, 999 00:45:14,239 --> 00:45:17,880 Speaker 1: create date nights, because again, I really value my personal time, 1000 00:45:18,000 --> 00:45:20,880 Speaker 1: but she's my favorite person in the world. Unfortunately, our schedules 1001 00:45:20,920 --> 00:45:23,640 Speaker 1: clash a lot. She's also doing uni and that also 1002 00:45:23,719 --> 00:45:24,279 Speaker 1: makes, like. 1003 00:45:24,200 --> 00:45:25,560 Speaker 2: Two and a half minutes for date night. 1004 00:45:25,840 --> 00:45:29,319 Speaker 1: Literally, but it's okay. We're okay, we're working really well, 1005 00:45:29,320 --> 00:45:31,359 Speaker 1: and we utilize AI to be able to figure out 1006 00:45:31,400 --> 00:45:33,479 Speaker 1: where we're going for dinner, because the fatigue of making 1007 00:45:33,520 --> 00:45:36,080 Speaker 1: decisions is so bloody hard as well. 1008 00:45:36,160 --> 00:45:39,680 Speaker 2: Right. Oh my god. So what would you put in as a 1009 00:45:39,760 --> 00:45:42,080 Speaker 2: prompt? Would you be like, where should we go 1010 00:45:42,160 --> 00:45:45,560 Speaker 2: for date night, out of Japanese, Mexican and whatever? Like, 1011 00:45:45,640 --> 00:45:48,120 Speaker 2: you literally just give it the situation? 1012 00:45:48,120 --> 00:45:51,560 Speaker 1: I go: location, give me like four to five star rated 1013 00:45:51,600 --> 00:45:54,480 Speaker 1: restaurants, and this is our budget, and it will send recommendations. 1014 00:45:54,640 --> 00:45:57,560 Speaker 2: Oh my god, I've never used it. I've never. That 1015 00:45:57,680 --> 00:46:00,040 Speaker 2: is like the biggest pain point in our relationship. What 1016 00:46:00,080 --> 00:46:00,839 Speaker 2: are we going to have for dinner? 1017 00:46:01,239 --> 00:46:03,000 Speaker 3: Oh my god, so obviously my life is over. 1018 00:46:03,120 --> 00:46:05,520 Speaker 1: AI is so handy.
But if you really want to 1019 00:46:05,560 --> 00:46:07,760 Speaker 1: like manipulate your partner a little bit, the best suggestion 1020 00:46:07,880 --> 00:46:10,040 Speaker 1: I'll ever give you is go up to your partner 1021 00:46:10,040 --> 00:46:11,759 Speaker 1: and be like, oh my gosh, you'll never guess where 1022 00:46:11,760 --> 00:46:13,759 Speaker 1: we're going out for dinner, because usually the first place 1023 00:46:13,800 --> 00:46:17,680 Speaker 1: is where they want to go. One hundred percent, yeah, 1024 00:46:17,680 --> 00:46:19,880 Speaker 1: but sometimes that fatigue is still there. So I definitely 1025 00:46:19,920 --> 00:46:23,080 Speaker 1: still do recommend, like if you want help picking particular places, 1026 00:46:23,120 --> 00:46:25,640 Speaker 1: because my partner and I tend to have a habit 1027 00:46:25,680 --> 00:46:27,319 Speaker 1: of just picking the same place every time. It's either 1028 00:46:27,320 --> 00:46:30,560 Speaker 1: Thai food or potentially charcoal chicken because. 1029 00:46:30,320 --> 00:46:32,960 Speaker 2: I love charcoal chicken. Same. Mine is Japanese or charcoal chicken. 1030 00:46:33,040 --> 00:46:34,960 Speaker 1: Yeah, literally, but sometimes it's nice to just get out 1031 00:46:34,960 --> 00:46:36,920 Speaker 1: of your comfort zone. Like, I live in Gippsland. I 1032 00:46:36,960 --> 00:46:40,440 Speaker 1: live in regional Victoria, in Gippsland. Yeah, we're in Gippy. 1033 00:46:40,520 --> 00:46:41,640 Speaker 1: I'm in Warragul. 1034 00:46:41,840 --> 00:46:45,239 Speaker 3: Oh my god, I'm a Warragul girl. Stop it. 1035 00:46:46,440 --> 00:46:47,960 Speaker 1: I mean, like, proud of you for getting out, by 1036 00:46:48,000 --> 00:46:48,680 Speaker 1: the way, right, like. 1037 00:46:49,080 --> 00:46:51,160 Speaker 2: I know, I mean, I was eighteen months old.
I 1038 00:46:51,200 --> 00:46:53,600 Speaker 2: can't say that I engineered that decision, but like, my 1039 00:46:53,760 --> 00:46:57,160 Speaker 2: first postcode, yeah. Horrible. Yeah. I went back to speak 1040 00:46:57,200 --> 00:46:59,120 Speaker 2: for women in Gippsland recently and they were like, oh, 1041 00:46:59,120 --> 00:47:02,120 Speaker 2: you're a Warragul girl, and I kind of was like, you 1042 00:47:02,120 --> 00:47:03,319 Speaker 2: know it, but like. 1043 00:47:03,280 --> 00:47:05,719 Speaker 3: I definitely left my mark on that place. 1044 00:47:05,560 --> 00:47:07,440 Speaker 1: Absolutely, like, I'll take it. 1045 00:47:07,560 --> 00:47:08,680 Speaker 3: That's yeah. 1046 00:47:08,680 --> 00:47:11,440 Speaker 1: But yeah, so I'm from Gippsland. But as 1047 00:47:11,480 --> 00:47:15,800 Speaker 1: you know, there's probably thirty bakeries, sixteen pizza shops. 1048 00:47:15,880 --> 00:47:18,640 Speaker 3: Yeah, yeah, a lot of fish and chips, and the rest is 1049 00:47:18,680 --> 00:47:21,000 Speaker 3: just hairdressers. So that's true. 1050 00:47:21,080 --> 00:47:22,719 Speaker 1: You've got to be willing to drive like an hour 1051 00:47:22,840 --> 00:47:25,640 Speaker 1: for something good, so you want it to be good, exactly. 1052 00:47:25,640 --> 00:47:28,360 Speaker 1: So using AI to answer the questions, the biggest 1053 00:47:28,400 --> 00:47:30,719 Speaker 1: drawcard is, again, back to prompting: making sure you're 1054 00:47:30,719 --> 00:47:32,360 Speaker 1: asking the right question.
So a lot of the time 1055 00:47:32,440 --> 00:47:34,719 Speaker 1: it is: we want to do this, here's the purpose of 1056 00:47:34,719 --> 00:47:37,120 Speaker 1: what it is, here's what we're looking for, here are your 1057 00:47:37,160 --> 00:47:40,000 Speaker 1: reference points, and then ask me five questions back to 1058 00:47:40,120 --> 00:47:42,840 Speaker 1: ensure that you know or understand what it is that 1059 00:47:42,880 --> 00:47:45,400 Speaker 1: I'm requesting. That way it'll say, okay, great, I 1060 00:47:45,480 --> 00:47:48,359 Speaker 1: understand that you're vibing, you know, Mexican. Are there any 1061 00:47:48,400 --> 00:47:50,640 Speaker 1: allergies that we need to be concerned about? Or what's 1062 00:47:50,680 --> 00:47:53,040 Speaker 1: the budget, or whatever it is. So having that proper 1063 00:47:53,080 --> 00:47:55,440 Speaker 1: conversation with the AI to really get the response 1064 00:47:55,440 --> 00:47:57,359 Speaker 1: and the answer as a final result is so much 1065 00:47:57,400 --> 00:48:02,160 Speaker 1: more important than just being like, give me food recommendations, please. 1066 00:48:01,360 --> 00:48:02,640 Speaker 2: Like, that's not going to help at all. 1067 00:48:02,719 --> 00:48:04,920 Speaker 1: That's not going to give you your answer. So drawing 1068 00:48:05,000 --> 00:48:07,360 Speaker 1: back to AI: it's only as smart as the person 1069 00:48:07,440 --> 00:48:09,719 Speaker 1: willing to give the answers in the prompt. Yeah, and 1070 00:48:09,760 --> 00:48:12,040 Speaker 1: I'm not saying people are stupid. I'm just saying that 1071 00:48:12,480 --> 00:48:15,839 Speaker 1: taking a half-arsed approach to prompting is not 1072 00:48:15,960 --> 00:48:17,520 Speaker 1: going to give you what you need.
It's not going 1073 00:48:17,560 --> 00:48:19,279 Speaker 1: to give you the results, and it's actually going to 1074 00:48:19,280 --> 00:48:21,080 Speaker 1: give you a bad experience, and therefore you will not 1075 00:48:21,120 --> 00:48:23,560 Speaker 1: continue to use it. Give it the chance to succeed 1076 00:48:23,640 --> 00:48:25,600 Speaker 1: so that you can actually utilize it to its fullest. 1077 00:48:25,880 --> 00:48:29,720 Speaker 2: The best tip that you gave me was ask AI 1078 00:48:29,880 --> 00:48:32,319 Speaker 2: to ask you questions back, and I was like, what 1079 00:48:32,360 --> 00:48:35,000 Speaker 2: do you even mean? Like, what are you even saying? 1080 00:48:35,719 --> 00:48:38,200 Speaker 2: And when I was making the video, you said, 1081 00:48:38,520 --> 00:48:41,399 Speaker 2: give it your story, because it was a Valentine's Day video. Yep, 1082 00:48:41,719 --> 00:48:44,280 Speaker 2: tell it your story and then get it to ask 1083 00:48:44,360 --> 00:48:47,239 Speaker 2: you back some questions that you think it would need 1084 00:48:47,280 --> 00:48:50,280 Speaker 2: to know, like, do you want it to be more emotional? 1085 00:48:50,320 --> 00:48:52,080 Speaker 2: You know, how long we've been together, like, whatever. Just 1086 00:48:52,120 --> 00:48:53,680 Speaker 2: ask it to ask you questions back. And I was like, 1087 00:48:53,760 --> 00:48:57,080 Speaker 2: that is so, so clever, because I didn't know the 1088 00:48:57,160 --> 00:48:59,839 Speaker 2: questions I needed to ask, and it asked me questions back. 1089 00:49:00,000 --> 00:49:02,800 Speaker 2: I was like, oh my god, this is so smart, exactly. 1090 00:49:02,480 --> 00:49:04,600 Speaker 1: So smart, exactly. So thank you for giving it a 1091 00:49:04,680 --> 00:49:06,520 Speaker 1: chance to succeed and give you what you need, because 1092 00:49:06,520 --> 00:49:08,520 Speaker 1: we've got that final result. And I loved every minute.
1093 00:49:08,600 --> 00:49:10,600 Speaker 2: It was obviously the best thing I've ever read. 1094 00:49:10,680 --> 00:49:12,800 Speaker 2: Where is my Oscar nomination? I don't want to, I 1095 00:49:12,840 --> 00:49:15,440 Speaker 2: don't want to, like, jump the gun, but let me call 1096 00:49:15,640 --> 00:49:16,800 Speaker 2: the Oscars really quickly. 1097 00:49:16,920 --> 00:49:18,760 Speaker 3: Yeah. 1098 00:49:18,800 --> 00:49:22,120 Speaker 2: Well, I've obviously, yeah, fully embraced not just AI generally 1099 00:49:22,200 --> 00:49:25,520 Speaker 2: and lots of cloud based platforms, but the OmniBook specifically 1100 00:49:25,520 --> 00:49:27,040 Speaker 2: has so much built in, and, as you said, the 1101 00:49:27,040 --> 00:49:30,319 Speaker 2: battery life. It optimizes it when I'm editing versus when 1102 00:49:30,360 --> 00:49:33,600 Speaker 2: I'm word processing versus when I'm making videos like 1103 00:49:33,640 --> 00:49:36,400 Speaker 2: this, when I'm listening passively, or scrolling. 1104 00:49:36,440 --> 00:49:39,080 Speaker 2: Like, there are so many different things that it is always 1105 00:49:39,120 --> 00:49:41,680 Speaker 2: working out how to best suit my life. As a 1106 00:49:42,120 --> 00:49:45,239 Speaker 2: very light to very heavy processing task person, I'm 1107 00:49:45,280 --> 00:49:47,399 Speaker 2: kind of doing the full spectrum. So I think 1108 00:49:48,120 --> 00:49:51,040 Speaker 2: really anyone who's doing a lot of different tasks, and 1109 00:49:51,040 --> 00:49:52,600 Speaker 2: who needs battery life and is on the go all 1110 00:49:52,600 --> 00:49:54,360 Speaker 2: the time, is such a good candidate for this laptop, 1111 00:49:54,400 --> 00:49:57,759 Speaker 2: because it is specifically AI optimized. Who else do you 1112 00:49:57,800 --> 00:49:59,160 Speaker 2: think would really benefit?
1113 00:49:59,640 --> 00:50:02,160 Speaker 1: I think the people that are less techy but willing 1114 00:50:02,200 --> 00:50:04,560 Speaker 1: to give it a go. You've got early adopters, you've 1115 00:50:04,600 --> 00:50:07,239 Speaker 1: got all these different people that love tech but don't 1116 00:50:07,280 --> 00:50:10,640 Speaker 1: really know it. I think that's really handy, because, like, from 1117 00:50:10,719 --> 00:50:12,680 Speaker 1: my retail days, we used to get customers coming in and 1118 00:50:12,680 --> 00:50:14,920 Speaker 1: they're like, I need this spec, this spec, this spec. 1119 00:50:15,000 --> 00:50:16,480 Speaker 1: And I'm like, oh, where did you get that information? 1120 00:50:16,640 --> 00:50:18,520 Speaker 1: Like, my grandson, they told me I need it. I'm like, 1121 00:50:18,520 --> 00:50:22,600 Speaker 1: what are you using it for? And they're like, Google. Great. 1122 00:50:22,880 --> 00:50:24,000 Speaker 2: Microsoft Word. Yeah. 1123 00:50:24,000 --> 00:50:24,400 Speaker 1: Literally. 1124 00:50:24,520 --> 00:50:25,960 Speaker 3: So I think there's text. 1125 00:50:25,840 --> 00:50:27,600 Speaker 2: What's that text one that has, like, no 1126 00:50:27,680 --> 00:50:32,080 Speaker 2: formatting? Notepad? Notes? I don't know, whatever the one 1127 00:50:32,080 --> 00:50:33,320 Speaker 2: is where it's just text. 1128 00:50:33,480 --> 00:50:36,279 Speaker 1: But you know what, even, like, even from, like, a 1129 00:50:36,320 --> 00:50:38,239 Speaker 1: photo standpoint, they're like, I use Paint. 1130 00:50:38,360 --> 00:50:40,320 Speaker 2: Oh my god, stop it. 1131 00:50:40,880 --> 00:50:43,960 Speaker 1: Yeah, I love Paint, but you're going to draw, like, 1132 00:50:44,080 --> 00:50:47,520 Speaker 1: the circles and squiggles and then you color it in.
Anyway, 1133 00:50:47,560 --> 00:50:50,080 Speaker 1: back to the actual question at hand: who I 1134 00:50:50,120 --> 00:50:52,640 Speaker 1: think, genuinely, like, the people that are willing to give 1135 00:50:52,719 --> 00:50:54,640 Speaker 1: tech a go but don't fully understand it. It's a 1136 00:50:54,640 --> 00:50:56,759 Speaker 1: really great tool, because they may not understand where the 1137 00:50:56,760 --> 00:50:59,440 Speaker 1: settings are. They might want to deep dive and have 1138 00:50:59,480 --> 00:51:02,080 Speaker 1: a conversation with their PC, and they can. And I 1139 00:51:02,120 --> 00:51:05,000 Speaker 1: think that that's something that we as users truly take 1140 00:51:05,000 --> 00:51:07,440 Speaker 1: for granted. Like, for me, I turn my laptop on, 1141 00:51:07,640 --> 00:51:10,279 Speaker 1: I mindlessly do things, and I'm just on autopilot. 1142 00:51:10,360 --> 00:51:11,680 Speaker 1: I know what I need to do, I know the 1143 00:51:11,760 --> 00:51:13,319 Speaker 1: questions I need to ask. But there are so many 1144 00:51:13,360 --> 00:51:16,280 Speaker 1: people out there that still are so scared and nervous 1145 00:51:16,360 --> 00:51:19,040 Speaker 1: to buy expensive devices, because they don't think that they're 1146 00:51:19,040 --> 00:51:20,880 Speaker 1: going to take full advantage of them. But what if 1147 00:51:20,880 --> 00:51:23,600 Speaker 1: you could actually ask the PC to do specific settings 1148 00:51:23,719 --> 00:51:26,480 Speaker 1: or tailor itself to them, and it can do that 1149 00:51:26,600 --> 00:51:29,640 Speaker 1: job? Like, great. There are so many amazing people 1150 00:51:29,680 --> 00:51:33,200 Speaker 1: out there that really deserve the chance to fully adopt 1151 00:51:33,239 --> 00:51:35,800 Speaker 1: all this new technology but just don't know where to start. 1152 00:51:36,040 --> 00:51:37,200 Speaker 1: It's a really great way to go.
1153 00:51:37,840 --> 00:51:39,960 Speaker 2: One hundred percent. Well, oh my gosh, Lauren, I could 1154 00:51:39,960 --> 00:51:44,319 Speaker 2: pick your brain forever, but this has been so educational. 1155 00:51:45,200 --> 00:51:48,480 Speaker 2: And even, yeah, I just hadn't even thought about your 1156 00:51:48,600 --> 00:51:52,160 Speaker 2: challenges with visualizing things, and how, from a disability standpoint, 1157 00:51:52,280 --> 00:51:53,920 Speaker 2: like, I just hadn't thought of that use case 1158 00:51:53,920 --> 00:51:55,839 Speaker 2: at all. I think that is so powerful as well, 1159 00:51:55,880 --> 00:51:58,200 Speaker 2: because, yeah, that is a huge block to your ability 1160 00:51:58,239 --> 00:52:00,359 Speaker 2: to do the things that you want to do. And 1161 00:52:00,440 --> 00:52:04,200 Speaker 2: like, this is an unrelated but related example: even 1162 00:52:04,320 --> 00:52:07,680 Speaker 2: like renovating. I have used it so much to put 1163 00:52:07,680 --> 00:52:09,759 Speaker 2: in a picture of a room and say, put this 1164 00:52:09,880 --> 00:52:11,680 Speaker 2: couch in that room and show me what it would 1165 00:52:11,680 --> 00:52:13,680 Speaker 2: look like, move it to the other wall. Like, that 1166 00:52:13,719 --> 00:52:16,440 Speaker 2: stuff is simple day to day stuff, but from 1167 00:52:16,480 --> 00:52:19,520 Speaker 2: a visualization perspective, you couldn't do that before without getting 1168 00:52:19,520 --> 00:52:23,359 Speaker 2: a draftsperson to, like, sketch it for you, or 1169 00:52:23,400 --> 00:52:26,080 Speaker 2: trying to draw it yourself, or in a magazine. Yeah, 1170 00:52:26,080 --> 00:52:27,960 Speaker 2: you print and cut it out. Like, it's a new world. 1171 00:52:28,080 --> 00:52:29,799 Speaker 2: And it's so exciting that there are so many ways 1172 00:52:29,840 --> 00:52:32,200 Speaker 2: to enhance our lives.
So thank you so much for 1173 00:52:32,239 --> 00:52:33,880 Speaker 2: sharing. My pleasure. 1174 00:52:33,960 --> 00:52:36,279 Speaker 1: Thank you. And I'm really glad that you actually can 1175 00:52:36,360 --> 00:52:39,439 Speaker 1: visualize, because that made my heart happy, because it proved 1176 00:52:39,440 --> 00:52:41,439 Speaker 1: my point that I clearly am missing out. 1177 00:52:41,880 --> 00:52:44,760 Speaker 2: So you're taking not now because I'm not, because AI, 1178 00:52:45,239 --> 00:52:45,720 Speaker 2: I complain. 1179 00:52:46,360 --> 00:52:48,000 Speaker 1: So if someone asked me that question, like, give me 1180 00:52:48,040 --> 00:52:49,960 Speaker 1: one minute, let me just quickly, like, draw something up 1181 00:52:50,000 --> 00:52:52,560 Speaker 1: really quickly on my AI instead, because clearly. 1182 00:52:53,880 --> 00:52:57,120 Speaker 2: Well, speaking of, I hear that we finish every episode. 1183 00:52:57,160 --> 00:52:59,320 Speaker 2: I've actually forgotten to do this the last few episodes, 1184 00:52:59,360 --> 00:53:02,680 Speaker 2: but we finish most episodes with a quote, and I 1185 00:53:02,760 --> 00:53:06,640 Speaker 2: hear that you didn't have one. But I also find 1186 00:53:06,680 --> 00:53:09,200 Speaker 2: it hilarious that, in the context of this conversation, you 1187 00:53:09,200 --> 00:53:12,359 Speaker 2: didn't go ask your AI companion for a quote that would 1188 00:53:12,440 --> 00:53:16,800 Speaker 2: fit the Seize the Ya podcast. That specific. Guys, you can literally say, 1189 00:53:16,920 --> 00:53:19,840 Speaker 2: I am going on the Seize the Ya podcast, find me a 1190 00:53:19,920 --> 00:53:21,960 Speaker 2: quote that fits in with the vibe of this podcast, 1191 00:53:21,960 --> 00:53:24,040 Speaker 2: and it will do it for you. It is so crazy.
1192 00:53:23,840 --> 00:53:25,640 Speaker 1: Literally. And you know what, I think that I'm going 1193 00:53:25,719 --> 00:53:28,400 Speaker 1: to steal that quote. So, yeah, what Sarah said. But genuinely, 1194 00:53:28,400 --> 00:53:31,279 Speaker 1: like, I'm kind of ashamed, considering the whole purpose of 1195 00:53:31,280 --> 00:53:32,160 Speaker 1: why I'm here today. 1196 00:53:32,600 --> 00:53:34,080 Speaker 3: And I didn't even think of that because. 1197 00:53:33,840 --> 00:53:35,480 Speaker 1: I was like, maybe I should just, like, google some 1198 00:53:35,560 --> 00:53:37,640 Speaker 1: quotes and then just take something with me. But I 1199 00:53:37,680 --> 00:53:39,879 Speaker 1: don't remember quotes. I don't remember that. But I do 1200 00:53:39,920 --> 00:53:42,080 Speaker 1: remember when people have passion. If I read a 1201 00:53:42,160 --> 00:53:44,359 Speaker 1: quote off a piece of paper, there's no emotion to that, 1202 00:53:44,400 --> 00:53:46,839 Speaker 1: so it doesn't mean much to me. So I think, 1203 00:53:47,040 --> 00:53:49,799 Speaker 1: just in general, drawing back to everything that we've said: 1204 00:53:49,840 --> 00:53:52,719 Speaker 1: just be an adopter. Give it a red hot go. 1205 00:53:52,880 --> 00:53:54,640 Speaker 1: If you were willing to give social media a go, 1206 00:53:55,200 --> 00:53:57,760 Speaker 1: then you can try AI, because ultimately, if you're scared 1207 00:53:57,800 --> 00:53:59,800 Speaker 1: about where all your information is going, what makes you 1208 00:53:59,800 --> 00:54:02,160 Speaker 1: think that your photos aren't being sent into the abyss? 1209 00:54:02,480 --> 00:54:05,160 Speaker 1: Give it a chance, give it a chance to succeed, 1210 00:54:05,360 --> 00:54:07,799 Speaker 1: get your time back, do something for you, so that 1211 00:54:07,840 --> 00:54:09,719 Speaker 1: you can spend more time with the people that you love. 1212 00:54:10,440 --> 00:54:12,879 Speaker 2: I love it.
Thank you so much, Lauren. Sure, thank 1213 00:54:12,880 --> 00:54:13,160 Speaker 2: you for 1214 00:54:13,120 --> 00:54:16,400 Speaker 3: having me here. Love it. 1215 00:54:16,400 --> 00:54:17,040 Speaker 1: It's been great. 1216 00:54:18,400 --> 00:54:21,919 Speaker 2: My brain is still exploding in all directions from this one, 1217 00:54:21,960 --> 00:54:25,880 Speaker 2: and every time I re-listened when editing, new questions emerged. 1218 00:54:26,239 --> 00:54:28,719 Speaker 2: If you do have any follow up questions, feel free 1219 00:54:28,760 --> 00:54:30,680 Speaker 2: to drop us a DM, as always, that we can 1220 00:54:30,760 --> 00:54:33,359 Speaker 2: pass on to Lauren, or any feedback for that matter, 1221 00:54:33,360 --> 00:54:36,040 Speaker 2: as I'm sure you found Lauren's tips as invaluable as 1222 00:54:36,080 --> 00:54:39,760 Speaker 2: I did. If you enjoyed it, as always, please do share 1223 00:54:39,800 --> 00:54:43,000 Speaker 2: the episode with your thoughts and takeaways. It means the 1224 00:54:43,080 --> 00:54:46,200 Speaker 2: world to grow the neighborhood as far and wide as possible, 1225 00:54:46,560 --> 00:54:49,279 Speaker 2: and I will include links to the HP OmniBook and 1226 00:54:49,360 --> 00:54:51,600 Speaker 2: other resources we mentioned in the show notes, so that 1227 00:54:51,640 --> 00:54:54,120 Speaker 2: you can revolutionize your work setup the way I 1228 00:54:54,239 --> 00:54:57,600 Speaker 2: have thanks to HP. In the meantime, I hope you're 1229 00:54:57,600 --> 00:55:06,520 Speaker 2: having a wonderful week and are seizing your yay.