Speaker 1: I'm trying to kill myself in sixty days. That's how Georgie Holt describes her latest experiment with AI. Not literally, of course, but her mission was to see if she could hand over the operational side of her CEO role to AI and free herself up completely. Georgie is the co-founder of Flight Story alongside Steven Bartlett, who you might know from Diary of a CEO, and by the end of this Quick Win episode you will hear how Georgie built a tool that slashed twenty to twenty-five hours from her week and changed the way she shows up as a leader.

Speaker 1: Welcome to How I Work, a show about habits, rituals, and strategies for optimizing your day. I'm your host, Dr Amantha Imber.

Speaker 1: I love that example. I want to dig into how you built that. Like, is this something that anyone can build, you know, with a tool like Replit, which is a vibe coding tool for mere mortals who are not programmers? Or is this something where you built a GPT in ChatGPT? Or did you have a programmer help you? What did that look like?
Speaker 2: I built it in a GPT. I sat down and mapped out the process end to end. I work with GPT a lot, so it has an understanding of who I am, how I work, and, you know, the kind of leader and sort of character I am, and so it sort of understands some of my watch points when I'm interviewing somebody. And then I sat down with our data scientists. I said, look, I think I've got something here, and I would like to spend some time just qualifying whether you think I do. And we asked the GPT to build it. So we challenged it to build it for us, and I gave it everything that I wanted to achieve. I spoke to it a lot, so it wasn't just one prompt; I kind of had a conversation with it, explained my pain points, explained what I wanted to create, what I thought I wanted it to show, what it needed to do for me, and we went back and forth like a team. We brainstormed with the GPT, and eventually it built itself.

Speaker 1: That is amazing. I would love to hear a couple of other examples of where AI has completely changed your workflows.
Speaker 2: Absolutely. I have this deep feeling around AI: that it is one of our greatest tools, but my fear is that it will sometimes take away the most important thing that we have, which is our ability to summarize intellectually really important problems and challenges. Because I think one of the biggest skills that a leader can learn over time is the ability to assimilate and summarize information and then take that information and distribute it into an organization, distribute it into a team, understanding how the sales team might want to receive it, understanding how a data science team wants to receive it. Because your job as a leader is to take a team on a journey with you.

Speaker 2: So we took inspiration from the great writing machines of the world. I looked to Hollywood, I looked to the Hollywood studios, and I looked at the great narrative creators in the world, and I thought, well, what do they have? They have a writer's room. So quite often, when they're building out narrative or building out scripting, they will have multiple writers attacking a challenge or a storyline from all different perspectives, and they'll all come together to debate and discuss those different points of view. And I thought, well, what an interesting thing to try and build. Because I don't want ChatGPT to write for me, but I want it to challenge what I'm writing. So I asked it whether it thought it could build out a writer's room for me, and we would have different personas as each writer: anyone from the format and function expert, to the disruptor, to the emotional story arc writer, to the person who maybe wrote with a bit more jeopardy. And so I basically constructed a writer's room. I said, look, what are the eight archetypes of a writer's room that you might find in Hollywood? Who are the great scriptwriters? Do you think we could build out a writer's room where all of those personas exist? And if I write something, I can call on you; I can literally say, please bring the writer's room to this piece of communication and challenge it. So I now have a writer's room.
So everything that 80 00:04:22,839 --> 00:04:25,880 Speaker 2: I create, which I think I have got to a 81 00:04:25,920 --> 00:04:28,320 Speaker 2: good enough place, I will then put it into the 82 00:04:28,320 --> 00:04:30,880 Speaker 2: writer's room and they'll challenge it and say the emotional 83 00:04:30,960 --> 00:04:33,840 Speaker 2: art isn't strong enough, or this could be format about that. 84 00:04:33,839 --> 00:04:35,719 Speaker 2: Actually the disrupture will come in. And so you could 85 00:04:35,800 --> 00:04:40,000 Speaker 2: create such a more compelling narrative if you put in 86 00:04:40,000 --> 00:04:42,680 Speaker 2: a little bit of jeopardy, So they don't write for 87 00:04:42,880 --> 00:04:47,120 Speaker 2: me because I am extremely conscious and I think it's 88 00:04:47,160 --> 00:04:49,520 Speaker 2: one of our greatest challenges that we are going to 89 00:04:50,440 --> 00:04:54,800 Speaker 2: lose some of the things that make humans extremely special, 90 00:04:55,360 --> 00:04:57,880 Speaker 2: and that is storytelling, and that is the ability to 91 00:04:57,920 --> 00:05:02,400 Speaker 2: summarize and distill information to the world in a way 92 00:05:02,440 --> 00:05:04,920 Speaker 2: that is well understood. And I think about Google Maps. 93 00:05:04,920 --> 00:05:06,680 Speaker 2: It's like, when was the last time you really remember 94 00:05:06,760 --> 00:05:09,480 Speaker 2: the direction to somewhere unless it was somewhere within sort 95 00:05:09,480 --> 00:05:13,120 Speaker 2: of twenty miles of your home. 
So I am concerned 96 00:05:13,240 --> 00:05:16,680 Speaker 2: that we're going to lose our way, and we're going 97 00:05:16,720 --> 00:05:19,159 Speaker 2: to lose how to remember how to do things and 98 00:05:19,200 --> 00:05:20,920 Speaker 2: do things well in the same way we just rely 99 00:05:21,000 --> 00:05:22,920 Speaker 2: on Google Maps, we leave the house and we hand 100 00:05:22,960 --> 00:05:26,000 Speaker 2: over the direction of our journey to somebody else. I 101 00:05:26,000 --> 00:05:28,600 Speaker 2: would hate to think we were handing over our creativity 102 00:05:28,640 --> 00:05:30,200 Speaker 2: and our consciousness to something. 103 00:05:30,600 --> 00:05:33,680 Speaker 1: What stayed with me from Georgie's story is how AI 104 00:05:34,080 --> 00:05:38,159 Speaker 1: gave her the most valuable resource back time, not to 105 00:05:38,200 --> 00:05:41,880 Speaker 1: cram in more tasks, but to be fully present with people. 106 00:05:42,400 --> 00:05:44,640 Speaker 1: So the next time you're heading into an interview or 107 00:05:44,680 --> 00:05:48,120 Speaker 1: a high stakes meeting, think how could you strip away 108 00:05:48,120 --> 00:05:51,240 Speaker 1: the admin so that you can show up with curiosity 109 00:05:51,440 --> 00:05:54,760 Speaker 1: instead of distraction. And of course, if you want to 110 00:05:54,800 --> 00:05:58,120 Speaker 1: go and listen to my full chat with Georgie, there 111 00:05:58,160 --> 00:06:00,839 Speaker 1: is a link in the show notes. If you like 112 00:06:00,880 --> 00:06:04,120 Speaker 1: today's show, make sure you hit follow on your podcast 113 00:06:04,160 --> 00:06:07,560 Speaker 1: app to be alerted when new episodes drop. How I 114 00:06:07,640 --> 00:06:10,799 Speaker 1: Work was recorded on the traditional land of the Warrangery people, 115 00:06:10,920 --> 00:06:11,960 Speaker 1: part of the Kulan nation.