Speaker 1: Welcome to Fear and Greed Q and A, where we ask and answer questions about business, investing, economics, politics and more. I'm Sean Aylmer. Australian businesses are under pressure: rising overheads, shrinking margins and a need to embrace AI and technology or risk being left behind. But maybe leaders are coming at the challenge from the wrong perspective, focusing on roles, not tasks. Jarah Borman is the Director of Strategy at Outrun Global, a great supporter of Fear and Greed. Jarah, welcome to Fear and Greed Q and A.

Speaker 2: Thank you, it's a pleasure to be here. Nice to meet you, Sean.

Speaker 1: Now, business under pressure, you know, looking for customers. You've got costs, overheads, that type of thing, competition, a growing gap between effort and output. We all kind of know that story. You're saying this is mostly about visibility rather than capability. Just explain that to me.

Speaker 2: Yeah, so look, we get it. Businesses are struggling. There's a lot of squeeze, a lot of pressure on margins, increased costs. You've covered a few different aspects there that businesses are struggling with.
We're also going through a transition from the information age to the intelligence age at the same time, and that's creating a lot of upheaval as well. So from our point of view, there's not a capability problem within businesses; there's just a visibility problem around what work is actually being done. Once you start to understand that, you realize a lot of businesses are, sometimes unintentionally, tanking their own capacity within their organization, because they're looking at things in that traditional view, which is cost centers and roles. And roles themselves are very broad. They tend to be a collection of tasks and activities that have evolved over years until they don't necessarily represent what the core focus of that role is. So the question we think most businesses should be asking is not what their team needs to be doing, but what they shouldn't be doing. And to understand that, you need to break the role down into those tasks and activities.
And you'll find that there are sometimes dozens, sometimes even hundreds, of individual activities within a role, and each of those has a different cost profile and different automation opportunities within it. And so that's kind of the trick: once you identify those activities within a role, you can actually do them at a lower cost, maybe do them more efficiently or effectively with a specialist, that sort of thing. And so actually getting an administrator to come in and pull a lot of the work and tasks that the team members are doing can be really, really effective. But you don't really see that day to day when you're just looking at things from a role perspective.

Speaker 1: Okay, so let me break that down a bit. So it's almost like process mapping everything that goes on in your organization and working out what the tasks are.

Speaker 2: Yes, so it's actually sometimes a little bit more granular than that. So if we use some terminology from a process classification framework, the task is the most granular piece of work.
Then an activity is a collection of tasks, and a process is a collection of activities, right? And so if you actually start to look down at that activity and task level, you can find where there's a lot of overpaying for the delivery of those activities and tasks. You might have really highly paid local team members delivering tasks when they actually shouldn't be, and you could deliver them at a much lower cost. An example could be a one-hundred-and-fifty-k-a-year marketing manager who's spending four or five hours a week pulling data together for a report, when they could be focused on much bigger, more strategic activities, and that work could easily be done at a lower cost.

Speaker 1: How much of this is an AI issue? It's kind of come to the fore because it is AI. I mean, AI is better at tasks, I presume, than roles. How much of it is AI and how much of it is just the way organizations have morphed?

Speaker 2: Well, I think it's a really good point. It's kind of both, right? There's a structural issue that we've got here, and if we address that, even without AI, we could be doing a heck of a lot better. So if you even just got a local administrator pulling a lot of these tasks and activities off your more expensive local team, then you could save money and cut costs and be more efficient just from that alone. Then if you look at the rise in access to global labor, you can leverage that and reduce the costs even further. And then when it comes to AI, it's very good at doing specific tasks, usually with some human oversight, and so that also gives you a bit of a roadmap towards being able to look effectively at AI adoption. So it's like a lens that you need regardless. But in this environment, when you've got global labor and AI just going through the roof, you really can leverage those effectively.

Speaker 1: So what's the size of the prize here for organizations? I mean, not all organizations are going to win as much as others, presumably.
Speaker 2: Yeah, so every business that we've looked at, even very small businesses, can take advantage of this sort of thinking and this sort of lens to optimize their processes, or optimize their costs, I should say, by working out where tasks should be allocated. But the bigger the business, and often the longer it's been running, the more process and task accumulation has happened around roles, and the more opportunity there is to re-engineer the workforce a little bit and free up that strategic thinking or that revenue-generating activity that your team members should be focused on, by unburdening them, if you will.

Speaker 1: So if you think of these admin, coordination, manual processes, I'm sure there's a lot of duplicated effort out there. I saw one of the figures there: sixty to seventy percent of the operations of a business is involved in that sort of thing. A lot of that is the sort of thing where you can actually reduce costs, or if you don't want to reduce costs, just keep the people but actually add value.
Speaker 2: Yeah, so that's a really interesting point. There's a piece of research that came out last year that's really telling when it comes to AI adoption, where Asana looked at thousands of organizations and found that employees only spend about twenty-seven percent of their time on the skilled work they're actually hired to do. Right? That's staggering. It means you're only getting about thirty percent of the output of your staff, on average, using the traditional role-based model. So switching this around and looking at unburdening them and getting the right support in place for them, whether that's through offshore staff or AI or a combination of both, you can quite easily flip that figure around and have two-thirds output. You're not going to get rid of admin completely, right, but you can at least really drive into those opportunities and maybe double productivity quite easily, just by looking at things this way and getting a bit smarter about where you allocate your resources.

Speaker 1: So how can you start, Jarah?
What you're saying all makes sense, and I think of my own business and, you know, what we could do. But where do we start?

Speaker 2: Well, often the first place to start is looking at repeatable tasks and activities that can be turned into a standard operating process. And that really defines what we would call a remote-capable task or remote-capable activity. And so if you're able to identify those within the organization, things that someone without any prior training or knowledge could pick up a guide and follow step by step, they become things that you can then look at resourcing, and either offshoring or getting AI to help you with. So that's where you kind of start: going through and analyzing those activities. And it's not specific to any individual role or team. Those activities could be consistent across a wide variety of different roles, and just getting a specialist who owns that process, who owns that activity, can have a really big impact in and of itself. So that's where data entry, report creation, administration, those sorts of things come in.
Even when it comes to marketing teams, you can look at campaign implementation specialists who do all the grinding work associated with setting up campaigns and running them, monitoring and reporting, or even some of the draft commentary with the support of AI. Those sorts of things are all very, very powerful when you identify them, pull them off the local team, so to speak, and free them up to really express their skills properly within the organization. It's what we think of as an opportunity cost. The term's been used many times over the years, but it's a cost that businesses are paying right now that they don't need to be.

Speaker 1: What's interesting, Jarah, is you've talked about outsourcing as well and, you know, using people better. AI isn't a silver bullet. It's not like, hey, let's get AI and things will be great. It's actually just part of a much bigger puzzle.

Speaker 2: Absolutely. AI is, well, it's amazing what's happening at the moment. There's so much hype, there's so much talk, it's really daunting. But I listened to the podcast you had a couple of weeks ago with Kevin Hubner, who was talking about AI companies and how much cash they're burning building these models, and how far away they are from true accuracy, and therefore profitability. But the problem is that you're getting the same thing with businesses trying to adopt AI. They're taking that same lens as they try to adopt AI, and about ninety-five percent of generative AI implementations are actually failing to achieve production impact. That was a study that came out from MIT last year. That's a huge figure, right? So when we look at that from our point of view, we're asking, well, what does proper AI adoption look like and how can you do it successfully? Because it's a big part of what we do, alongside access to global labor. And the best way to think about it is that there are four different stages of AI adoption. Right? There's no adoption at all.
So you've got a bunch of businesses in the space that have either not started because they don't know where to start, or they've tried and they've failed and they've kind of stepped away from it: "I don't know what to do here." Then you've got stage two. And first of all, you probably don't want to stay in stage one for too long, because things are moving so quickly and there are definitely opportunities there. Stage two is human-led with AI support, and that is where you augment a person: you give them access to general AI tools, which allows them to be more efficient, more productive, more accurate. This is a really, really good space to be using AI, where the most successful adoption is right now, and where we can see it being quite successful for some time. Then you've got stage three. That's where the AI leads and human oversight is required, and you see that with a lot of testing. And then stage four is where you've got a full-blown AI agent looking after things.
What's really interesting is there's a lot of hype around stage four, but it doesn't really play out like that. It often falls back to stage three, with that human oversight, because we don't actually have full AGI at this stage, right? A good example of that is a news item this morning on Waymo in San Francisco. The city is actually having to classify call-outs to deal with Waymo cars that have just stopped in traffic and are causing all sorts of problems, and so they have to send out emergency services to move the cars on. So that's a level of human oversight, and a cost associated with that type of adoption, that just doesn't really make a lot of sense, right? But that's the reality of those sorts of projects. So where we see it, the best and safest and most effective way to take advantage of AI right now is actually in an AI-augmented space, where it's human-led, where people's skills and so on get enhanced by the use of these generative AI tools. It's a really safe place to play, and to deploy, and to get really effective outcomes from.
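[Editor's note: the marketing-manager example from earlier in the episode can be put into rough numbers. This is a back-of-envelope sketch only; the salary and weekly hours on reporting come from the conversation, while the 38-hour week and 46 working weeks per year are illustrative assumptions, not figures from the interview.]

```python
# Rough annual cost of the reporting work described in the episode:
# a $150k-a-year marketing manager spending four or five hours a week
# pulling data together for a report. The 38-hour week and 46 working
# weeks are assumptions; only the salary and hours come from the episode.
salary = 150_000                      # annual salary, from the example
hours_per_week = 38                   # assumed full-time week
working_weeks = 46                    # assumed, after leave and holidays

hourly_rate = salary / (hours_per_week * working_weeks)
report_hours = 4.5 * working_weeks    # midpoint of "four or five hours a week"
annual_report_cost = hourly_rate * report_hours

print(f"~${annual_report_cost:,.0f} a year spent on pulling report data")
```

Note that the number of working weeks cancels out of the calculation, so the result is driven only by the salary and the share of the week spent on reporting: roughly $17,000-18,000 a year for this one task.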
Speaker 1: Jarah, thank you very much for talking to Fear and Greed. That was Jarah Borman, Director of Strategy at Outrun Global, a great supporter of this podcast. If you're keen to explore the opportunities behind what Jarah talked about, Outrun have put together a new guide. It's called the Top Ten Tasks to Optimize for the Biggest Impact: the top ten tasks to optimize for the biggest impact, plus the cost savings you can unlock. You can grab it at outrun dot global slash fearandgreed. That's outrun dot global slash fearandgreed. We'll put the link in the show notes too. I'm Sean Aylmer, and this is Fear and Greed Q and A.
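[Editor's note: the Asana utilisation figures quoted in the episode imply a simple uplift calculation. The sketch below only restates the arithmetic behind "maybe double productivity": the 27 percent and two-thirds figures come from the conversation; everything else is illustrative.]

```python
# The episode cites Asana research finding staff spend ~27% of their time
# on the skilled work they were hired for. Lifting that share to the
# two-thirds the speaker suggests implies roughly 2.5x the skilled output
# per person, which is consistent with "maybe double productivity".
baseline_share = 0.27   # share of time on skilled work (figure cited in episode)
improved_share = 2 / 3  # "flip that figure around and have two-thirds output"

uplift = improved_share / baseline_share
print(f"Skilled output multiplier: {uplift:.2f}x")
```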