Peter: This week on the Business of Tech, powered by 2degrees, I'm coming to you from Microsoft's headquarters in Redmond, near Seattle, sitting down with the Kiwi who now helps decide which AI models millions of people around the world get to use. Steve Sweetman grew up studying architecture in New Zealand, fell in love with technology as computer-assisted design disrupted the drawing board, and rode that wave all the way from Wang Computer, Telecom and Spark to Microsoft New Zealand, and then all the way to Redmond. Over nearly twenty-five years at Microsoft, Steve's worked at the sharp end of new technologies, from HoloLens and early chatbots to helping set up the Office of Responsible AI and building the tools that let data scientists test models for fairness, safety and explainability. Today, Steve leads the AI models team behind Foundry, the Microsoft platform that brings together more than ten thousand of the world's leading AI models, from OpenAI, Anthropic, Meta, DeepSeek and Microsoft's own labs, and makes them available in one trusted environment for businesses everywhere. We talk about how Microsoft went from backing ChatGPT in its earliest enterprise days to running a vast AI factory in the cloud, where the real value isn't in the model alone but in the data, governance and agent frameworks wrapped around it, and how companies like Alaska Airlines, CVS Health and even tiny startups are already building on top of it. How does Microsoft decide which third-party models make the cut? We get into that, why some popular models are rejected on safety grounds, and how Microsoft is trying to balance the pressure to move fast with the responsibility not to break things at a global scale. And because this is the Business of Tech, we zoom in on what all of this means for smaller markets like New Zealand, where the AI talent base is thin but the opportunity to leapfrog by putting these tools to use is massive. That's all coming up with Microsoft's Steve Sweetman, right after this.
Peter: Steve, welcome to the Business of Tech. How are you doing?

Steve: I am wonderful, thank you. Thank you for having me.

Peter: We're sitting in the north part of the campus at Microsoft's headquarters in Redmond, beautiful new buildings here that have popped up since the last time I've been here. So great to be here. Tell us about your pathway into Microsoft. You've been, what, eighteen years now at Microsoft. You did a stint at Spark, or what was Telecom previous to that. Give us your sort of origin story and how you got into tech, and then Microsoft.

Steve: For sure. So yeah, Kiwi kid despite the accent. Grew up all over the world, but went to university in New Zealand, so repatriated to New Zealand at that time, and actually studied architecture. But during that study I fell in love with technology. It was really during the transition from manual drawing to CAD, and I went, wow, if this can transform how architects are imagining buildings, imagine how we can transform every industry. And so out of university I pursued technology. I was lucky enough to actually start at Wang, and then Telecom before it became Spark, which was at the time one of the biggest Microsoft partner companies, and so from there that really set me up from a technology industry perspective. And then starting with Microsoft in New Zealand was super special as well, because you're part of this large corporation and yet you're in New Zealand, and so you can see across the floor, quite literally, of a building and see every function of the business. And so it was a great foundation to be able to both learn about our customers in New Zealand but also learn about how a business like Microsoft operates, from consulting and marketing and legal and sales and product, and be able to actually get that foundation. So I'm super grateful for that experience.
And yeah, started in New Zealand in two thousand and two, had a brief stint out of the company, back into the company, and then moved up to Redmond here in two thousand...

Peter: And it's a pathway a number of Kiwis in Microsoft have taken. I'm thinking of Angus Norton and people like that, who've gone on to rise up through the ranks here at Redmond.

Steve: Yeah, and we've got a pretty cool Kiwi culture up here too, so a number of folks are up here. One of my first mentors in New Zealand was Dee Templeton, now deputy CTO of Microsoft, and we actually moved up about the same time. Knew Angus as well, and his journey has been fantastic to watch. And I think part of that is when you get to work in an organization like Microsoft, you have access to so much opportunity for impact, and that's really an incredible platform that I cherish, and it's a big part of the reason I've been here for over twenty years.
Peter: Yeah, and you've covered a number of different roles, including, really interestingly, the Office of Responsible AI. Keen to touch on that as well. A bit of software development, product, all that sort of thing. Where you are now is really, I think, at the heart of Microsoft's AI offerings, which is Azure Foundry, where all of these best-in-class AI models are offered, your own ones and third parties' as well. Talk about the transition, because you also worked on Azure OpenAI, this partnership Microsoft did with OpenAI when ChatGPT was just new and businesses were going, how do we incorporate that into a safe environment for us to use, and use with our own data? What was it like? Because that's been such a rapid transition from that fundamental partnership offering, which included a lot of customers in New Zealand using it, to then suddenly eleven thousand models available via Foundry. What was that transition like?

Steve: Maybe I'll go back a little bit in the story as well. So I joined here in Redmond as part of a CTO office at that time, and really what we were trying to figure out is, as technology transforms, how do customers get value from that? Right? You know, one of the mistakes we often make in the tech industry is we pursue the technology, not the customer problem. And so I spent close to eight years focusing on that intersection between new technologies, Internet of Things, unified communications, and social, and what that means to corporate businesses, even things like holographic computing with HoloLens and understanding how that fits. And that actually became my gateway into AI. It was actually twenty sixteen when we were working on language understanding, the early days of chatbots. Right, you can pick up a segment of a few words, translate those words into intent, and from intent you can help solve customer problems quickly. Well, we quickly realized as a company that these technologies are incredibly powerful and we needed safeguards. And so it's actually from that experience of working at the intersection between humans and technology that I then went and helped to establish the Office of Responsible AI in our legal team. Spent a year there establishing the frameworks and the policies for the organization. Incredible time. But then I spotted a different gap, which was, okay, we can have policies, but if you don't have technology mapped to them, the people building the tools aren't going to be able to implement the policy. And so that's actually what led me into CoreAI.
It was AI Platform back then, where I started to work on what are the foundational tools that data scientists need to understand the impact of AI, to understand the fairness of the training data sets, to understand how they can explain the outcomes of these tools. And yes, then I've been able to see that rapid evolution from responsible AI tools to then Azure OpenAI and our incredible partnership there, which again shifted, because now data scientists weren't necessarily using training data, they were just feeding in the context of documents, and these amazing generative models, these foundational models, which have amazing context in and of themselves, could quickly answer questions. So that partnership really gave Microsoft, to answer your question directly, a leadership position in the industry. It really helped set us up as a company that was at the forefront of this generative AI transformation, where technology went from a few scientists who knew how to train models to being able to unlock every individual in the world with the power of being able to harness their data to produce amazing outcomes, right? And so that was really up until, I would say, about a year ago, and then something else changed. And the next change was that we started to see a proliferation of amazing labs building amazing models. So OpenAI's leadership and our partnership with them remains incredibly important, and we saw the rise of Anthropic, who we just added as a partner back in November, and the rise of open-source providers like DeepSeek, whose models we also offer on our platform.

Peter: Llama from Meta.

Steve: Llama from Meta, Grok from xAI. And so yes, we have this platform now of over ten thousand models from the world's premier providers. And you can think of it like a factory, right? It's the place that has all of the tools in one location to build whatever you need to build.
And so that is really what Microsoft Foundry is: we are the platform that allows every customer to build with these amazing generative AI tools, whatever it is that they can imagine, to impact and make the world of their customers better.

Peter: Yeah, it's sort of like a marketplace for these models as well. You also offer your own models. Microsoft has done a lot of work out of Microsoft Research on foundational models as well.

Steve: Right. Yeah, so we have really kind of two lanes there. One is the Foundry Labs program with Microsoft Research, which continues to work on the frontiers of innovation in places like medicine and geospatial mapping, and how we think about solving some of the intractable problems through research. And then, of course, we've got Microsoft AI, the division internally where we are building world-class models of our own, from image models and audio models to amazing models that are powering our own Copilots but are also being made available through our platform.

Peter: Microsoft has made it clear over the last couple of days, as I've been talking to people around the campus here: an AI model on its own is powerful, but it's not really that useful. It's all the stuff that wraps around it. It's the data, and how clean and high quality that data is. It's the actual AI tools you use to interact with that model. It's the governance you put around it as well. And I guess that's the core of what Foundry is. You've got all of these great models where you've done licensing deals, you've got your own models there, but you've also got aspects of that platform like Fabric, which is all about the data, getting the data in good order, so that when it goes into that model and is used to fine-tune or train it, you're going to get the outcomes that you want. That's the real value proposition.
Steve: Absolutely. And again, building on the factory analogy, you can think of it as like a supply chain. In the factory we have the machines, those are the models, and they're ready to produce whatever it is you want to build. But of course you need the supplies, you need the raw materials, and in this case, for most companies, the raw material is data, right? And so how do we actually connect to the data sources? Some of that could be in Fabric, like you said, some of that could be in applications that are running in Azure and Microsoft 365. And so that's really what Foundry allows you to do. Yes, you've got the engines, you've got the machines, and you've got the connections to all of the data, and then on the other side of the factory we also have the supply chain for delivering that on Azure as an application. So now that you've built this application, how do you make that available to your customers as well? And that application, as an example, could be Alaska's application. Being based here in Seattle, Alaska is one of our carriers out of the Seattle area that we travel a lot with, and they use generative AI to actually start to understand what it is that customers are looking for when it comes to travel destinations. As Kiwis who travel the world, we know that moving or traveling to another country is a lot more than just booking a flight. It's about what are the experiences I want to see, what is the destination I want to reach, what's the best way to get there, what kind of hotels do I prefer to stay in? And so using generative AI, using the foundation models that are part of Azure OpenAI on Foundry, Alaska Airlines has been able to actually build applications which personalize those recommendations to you, instead of just publishing generic specials or sales that are available.
Peter: Yeah, great use case. So, given you did that setting up of the Office of Responsible AI, the decision-making you go through to decide what models can come into this trusted space, Foundry: what did you learn in that experience of setting up that office that informs those decisions? Especially when it's moving so quickly. I'm thinking of OpenClaw, which just in January exploded out of nowhere. One guy, Peter Steinberger, created this thing and then it went viral. There might be pressure from customers: I want to access that in a trusted environment. How do you balance that need to be competitive and move fast, but to tick all of those responsible AI boxes as well?

Steve: Yeah, it's a fantastic question, and let me go into it with kind of two things. One is, yeah, as a company we have established a clear set of principles and we take those principles very seriously. And as I said, since I joined this organization, and now I get to partner with amazing leaders like Sarah Bird, what we've been able to do is turn those principles into tools and technologies that we use to apply them. And so, as an example, we work with all of the major labs and we actually evaluate all of the models before we release them on our platform, and we assess them to ensure that they uphold our principles and our safety standards. And we've actually rejected models which are very popular in the marketplace, where we've said it hasn't met our bar for the safety and responsible use standards that we apply. Another example is when there are powerful tools that could be used or misused, we put gating controls in place to actually ensure that customers that are using them align to our safety standards as well. We actually have customers attest to those safety standards, and so it's an important part of trust in our platform. It doesn't just mean having a set of principles, it means applying those at velocity and at pace.
And we are constantly learning, and we're constantly looking across the industry and evaluating how we improve our tools, and also how we partner with some of our closest partners as well. And so we're part of several partnership forums, together with OpenAI, together with Anthropic and Gemini and many industry leaders, to actually shape the way that these models are being developed, so that responsible use is part of their DNA as well.

Peter: Yeah. If you think of a market like New Zealand, it's a big Microsoft shop. Essentially every government department runs it, and a lot of our big enterprises as well. A lot of them have been using maybe AI tools on the Azure platform, then they've adopted Copilot in Microsoft 365, they're big users of that, so a lot of their employees are using that. Now we're seeing this transition into: what models can I use? And I'm keen for you to explain this serverless model you have as well, this serverless model-as-a-service concept. But how are customers getting into that next stage, where they're actually looking at these models and how they can employ them to do that really cool stuff like Alaska Airlines is doing?

Steve: Yeah, and maybe I'll start with a customer example and work back from that. So one of the customers that we have worked closely with is CVS Health. One of the areas they work in very deeply is cancer research and cancer drug creation, helping some of the millions of people around the world who are suffering with cancer to both identify what the right course of treatment is for them, and also helping researchers, with patients opting in with data, to actually unlock new treatments and be able to assess them quickly. And so all of that is built on our platform.
Now, as you build those types of solutions, a company like CVS did want to know: how do we ensure that the models we're using are aligned to our objectives as well? So as part of our platform, to come back to it, we have a set of models which are called Azure Direct models. These are models that we run all the safety checks on. They are also models which we offer with a set of content filters and abuse detection services included as part of the offering, to ensure that they remain aligned while they're being used. And then we've built tools on top of that to actually help our customers evaluate these models. And so the evaluation tools, like the ones CVS use, can evaluate definitely for quality, definitely for performance, but we've also taken all the safety evals that we run and packaged them into a safety evaluation agent. So customers can also evaluate the models for safety and ask themselves, for the scenario that I'm using, whether it's the Alaska Airlines scenario of vacation recommendations or the CVS Health scenario of looking for better research to solve cancer problems, which probably have quite different safety profiles, which model is best fit for their purpose, both in terms of quality and in terms of safety.
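For developers wondering what that evaluation step can look like in practice, here is a minimal sketch assuming the azure-ai-evaluation Python package. Every endpoint, key, deployment and project value below is a placeholder, and class names and signatures vary between SDK versions, so treat this as the quality-plus-safety pattern Steve describes rather than the exact tooling CVS uses.

```python
# Sketch: scoring one model response for quality and safety, assuming the
# azure-ai-evaluation package (pip install azure-ai-evaluation).
# All endpoints, keys and project values are placeholders.
from azure.ai.evaluation import RelevanceEvaluator, ContentSafetyEvaluator
from azure.identity import DefaultAzureCredential

# A judge model for the quality metric (hypothetical deployment values).
model_config = {
    "azure_endpoint": "https://my-foundry-resource.openai.azure.com",
    "azure_deployment": "gpt-4o",
    "api_key": "<key>",
}

# The Foundry project backing the safety evaluations (placeholders).
azure_ai_project = {
    "subscription_id": "<subscription-id>",
    "resource_group_name": "<resource-group>",
    "project_name": "<project>",
}

query = "Which destinations suit a short family holiday from Seattle?"
response = "Consider Vancouver or the Oregon coast; both are short trips."

# Quality: does the answer actually address the question?
relevance = RelevanceEvaluator(model_config)
print(relevance(query=query, response=response))

# Safety: run the packaged content-safety checks on the same pair.
safety = ContentSafetyEvaluator(
    credential=DefaultAzureCredential(), azure_ai_project=azure_ai_project
)
print(safety(query=query, response=response))
```

Run over a batch of scenario-specific prompts, the same pattern yields the per-use-case quality and safety profile Steve is talking about.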
Peter: Yeah, and what are you typically finding? Is most of the use clustered around a couple of models, or are customers mixing the use of models?

Steve: Yeah, that's a great question. We have really seen the market, and our platform use, shifting into, I would say, two categories. One category is certainly the majority: you can think of it as seventy-plus percent of overall customer use is what we would consider the hero models. So today those would be things like Anthropic's Claude Sonnet 4.6 or Opus 4.6, or OpenAI's GPT-5.3 or 5.4 models, the latest models, which are just phenomenal across just about every category that you can imagine. And then we've seen this rising star of customers who have been using those models for some time and now are looking at, how can I train an open-source model to be just as good as these larger models, but at a very specific task? And so we're seeing a rise in the open-source market, not of customers that are just taking models off the shelf and applying them like the hero lab models, but that are tuning them and training them to be fit specifically for their purpose, and that allows them to run at a lower cost, at times with higher performance than the larger models.

Peter: Yeah. For instance, you've got DeepSeek on there, another phenomenon that came along about a year ago or something like that, but very rapidly you evaluated that and added it to the lineup as well.

Steve: Yeah. Most recently we added the Kimi models as well, which are just fantastic open-source coding models, to that fleet too. And so we're continuing to look at the market and evaluate those options, and we're seeing customers start to use them in very unique and compelling ways.

Peter: Yeah, and you have this... so you've got Foundry, you've got all the models there in a secure environment, it's interrogating your own data, however you put it in there. You've also got this AI Studio on there as well, right? So you can actually use tools to develop applications based on all of those models.

Steve: Yeah. You know, I've been around the industry for a while, and I saw the transition of the web from Web 1.0 to Web 2.0.
And for average consumers, what that looked like was the world going from information just being published online, so my product catalog is now visible to you and you can call me and order something, to e-commerce, where two systems could transact and actually fulfill an order and a supply. And that's the world we live in today. In the same way with generative AI, the pattern a year ago was like Web 1.0: upload all of my information and give me rich context, let me talk to it, let me actually ask questions about unstructured data, and it can create phenomenal answers. The classic ChatGPT example, right? But really, the evolution we're now in is synonymous with Web 2.0: it's the agent evolution. And this actually allows us to create value chains, allows us to connect information to fulfill an order, allows us to connect to actions to actually create an outcome. The biggest place we're seeing this is actually in software development itself. You know, I was chatting to my wife the other day and she asked me, hey, how can I build a website for this nonprofit organization that I work for, do you have any hints? And so I pointed her at a couple of AI tools. She's someone who's not a technologist, who hasn't worked in the technology field apart from living in the same house as me for over twenty-five years, and she was able to stand up a website in ten minutes, because these tools can connect intent to outcome and create amazing things. And so that's really what I'm excited about. And coming back to Foundry, we give companies the platform to create those connections. We have this agent framework that actually allows customers and companies to take that intent, turn it into data, and take that data and turn it into an outcome.
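To make that intent-to-data-to-outcome loop concrete, here is a toy, self-contained Python sketch of the agent pattern. Every name in it is invented for illustration, and the keyword stub stands in for the model reasoning step that a real agent framework would provide.

```python
# Toy agent loop: intent -> tool choice -> data -> outcome.
# Everything here is illustrative; a real agent would let a model pick
# the tool and extract its arguments instead of the keyword stub below.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Tool:
    name: str
    description: str
    run: Callable[[str], str]

def search_flights(request: str) -> str:
    # Stand-in for a real data source, e.g. an airline API.
    return f"3 flights found for '{request}', from $129"

def book_meeting(request: str) -> str:
    # Stand-in for a calendar action.
    return f"Blocked 2 hours this week for '{request}'"

TOOLS = [
    Tool("search_flights", "find flights to a destination", search_flights),
    Tool("book_meeting", "reserve time on the calendar", book_meeting),
]

def choose_tool(intent: str) -> Tool:
    # Stub for the model's reasoning step: map intent to a tool.
    return TOOLS[0] if "flight" in intent.lower() else TOOLS[1]

def agent(intent: str) -> str:
    tool = choose_tool(intent)      # 1. turn intent into a plan
    data = tool.run(intent)         # 2. use the tool to get data / act
    return f"[{tool.name}] {data}"  # 3. return the outcome

print(agent("Find me a flight to Queenstown"))
print(agent("I need time to learn agentic tools"))
```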
Peter: That's incredible, and I think that's the advantage that Microsoft has. You might have SharePoint or wherever you've got all of this data, and it's in a safe place, so it's sort of like a sandbox where you can experiment with AI before unleashing agents more broadly across the organization.

Steve: Yeah. Something that is very important to us is our data promises to customers. We realize customers have a choice of platforms today, there are a number of very good providers out there, so why would they choose Azure, why would they choose Microsoft Foundry? And the reasons we recognize as most important do come down to the privacy, the security, the reliability of the data. You know, we make this very clear promise that we do not train on customer data. We do not look at customer data. Even though we have our own labs environment building models, we never use the data that is on Microsoft Foundry, and we give customers control over that data. The data remains in the region that you choose: if you're a customer based in Australia and you upload your data to an Australian data center, the data will remain at rest in Australia. It is under your control as a customer when you're using our service. And we recognize that's really, really important for customers across every industry, to ensure that while they can harness the power of these models, they remain in control of their data and their information, and they're not leaking information outside of their organizations.

Peter: Yeah, and related to that, talk about that serverless idea, where, you know, for a long time you would buy sort of capacity, a tenancy in the public cloud. It wasn't dedicated space, it was sort of renting time on GPUs or CPUs on a server. So now you don't really need any dedicated space, maybe, for your data, but the model is rented to you as a service.
Steve: Yeah, we really aim to deliver time-to-value to our customers, and what that means is you can get started with something as simple as paying per request. The request is in the form of the token, right? The token is now the value exchange of money for the outcome, the characters or words from these models. And so our service offerings, across our hero models, now over sixty-five models that we offer as services on our platform, are as simple as: create an endpoint, start sending your traffic, and just pay for the tokens that you need. Now, as customers climb up into more and more critical, business-critical and mission-critical use cases, they can start to transition to a capacity-backed offering, where we have guarantees around latency and performance for that capacity. So one of the examples for that is Harvey AI. They are a legal AI organization that we partner very closely with, which uses our service to support organizations' legal departments when it comes to legal document retention, document discovery and information. Obviously, they want to ensure that the performance of that service is guaranteed, and so they use the provisioned offering as part of their service to actually guarantee that performance back to their customers. And so with both of our service offerings, customers don't have to buy computers, they don't have to buy GPUs. They just pay for how they use it, whether they pay on a per-request basis or for a capacity-backed guarantee, based on where their service is.
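As a concrete sketch of that pay-per-token pattern, assuming the azure-ai-inference Python package: the endpoint URL, the environment variable names and the model name below are all placeholders, and the tokens that billing is based on come back on the response.

```python
# Sketch: pay-per-token call to a serverless Foundry model endpoint,
# assuming the azure-ai-inference package (pip install azure-ai-inference).
# Endpoint, key variable and model name are placeholders.
import os
from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import SystemMessage, UserMessage
from azure.core.credentials import AzureKeyCredential

client = ChatCompletionsClient(
    # e.g. https://<resource>.services.ai.azure.com/models
    endpoint=os.environ["FOUNDRY_ENDPOINT"],
    credential=AzureKeyCredential(os.environ["FOUNDRY_API_KEY"]),
)

response = client.complete(
    model="my-deployed-model",  # hypothetical deployment name
    messages=[
        SystemMessage(content="You are a concise travel assistant."),
        UserMessage(content="Suggest a long-weekend trip from Auckland."),
    ],
)

print(response.choices[0].message.content)
# Per-request ("serverless") billing is based on these token counts.
print("tokens billed:", response.usage.total_tokens)
```

Moving up to the provisioned, capacity-backed tier Steve mentions generally changes how the deployment behind the endpoint is purchased, not calling code like this.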
Peter: I think that's an easy way in, because a lot of the research from our part of the world, Australia and New Zealand, shows we are lagging a little bit in AI adoption, and some of the barriers are just capability within organizations, both at a governance level and in IT capability. We don't have that many AI experts in New Zealand. So are you finding that it's allowing companies to sort of dip their toe into AI in a way that's a lot easier than maybe it would have been five or ten years ago, when you literally had to have data scientists in the organization?

Steve: Yeah, absolutely, it's a lot easier, and it's something, as a Kiwi, I'm actually really excited about. And if there's a call to action to any Kiwis that are listening to this, I would say that the past, in some sense, doesn't matter. We're at a reset moment, and the opportunity is in front of us. And as Kiwis, we have this ingenuity, right? We have the number eight wire mentality of being able to figure it out and fix it myself. And I'm bringing that analogy forward because what I'm seeing is that the most exciting things are happening in small startups. I'm seeing one- or two-person startups creating amazing breakthroughs with generative AI, because the speed that they can move at in creating new products is phenomenal. One example is Lyrebird Health out of Australia. I know it's not New Zealand, but this is an example that's close to home. They've built a service that uses a lot of the speech capabilities in our Microsoft Foundry to actually close the gap between patient and provider, especially across multiple different languages, using the translation capabilities, ensuring doctors have that context when patients call them, and ensuring that patients get the help that they need. That was started by a couple of young kids, I would say sub-twenty-five-year-olds, who were able to create this amazing startup that is earning tens and tens of millions of dollars a year now. So small companies can excel at a global level with these capabilities. Now, for large companies like Microsoft, that's also an interesting challenge: how do we transform? How do we actually change our culture?
And so a big part of Microsoft Foundry being in CoreAI is Microsoft's own experiment to say, how do we run like a startup within a large enterprise organization? How do we actually rewire our own culture to be fast, to be flat, to actually move quickly? So that's the challenge, I think, for enterprises in New Zealand too: how do they think about rewiring their culture to move quickly, to keep up with the pace of the technology and the opportunity?

Peter: Yeah. We're probably never going to develop our own foundational models in New Zealand, the cost of that is massive, but we can become leaders at the adoption of AI and applying it to various problems. And as you say, that could be a one or two man band that does that.

Steve: Yeah, absolutely. And the number of examples I'm seeing around the world of that is really, really exciting, and the reach of their impact is exciting too. And so I would encourage people, if they haven't, to go to ai.azure.com, our Microsoft Foundry website, sign up, get an endpoint and just start to see what they can do, or go to their favorite CLI, something like GitHub Copilot. Just the other day I downloaded the latest version of the GitHub Copilot CLI, and what I decided to do is actually have Copilot find some free time in my calendar. And so I wired Copilot up with Work IQ, and I asked Work IQ, hey, I need eight hours to learn about some of the new agentic tools that are coming out, can you find that for me? And it went through my calendar and it found me eight hours in my week that I didn't think I had, and blocked that time for me. Then the next thing I was able to do, at the end of that week, was ask Copilot, okay, grade me: how well did I do at using that time that you found me?
Now, in full transparency, I failed. I failed the grade. I decided to fill up the time working on other things. But my point is, that was from someone who, remember, I'm an architect. I didn't go to computer science grad school. I am not a developer by nature. I started drawing buildings and learning CAD, and in less than twenty-four hours I was able to wire these tools together and create things that are giving me time back in my day and helping me learn about the tools. And so I think that's the exciting opportunity here.

Peter: Yeah, that is the opportunity. And Steve, just finally, you've witnessed firsthand at Microsoft that whole transition from AI things like computer vision, which Microsoft was really big on, then the generative AI revolution, to agentic AI. Where's it all going? I mean, we've seen even in the last six months the narrative go from, oh, spinning up a couple of agents, to literally thousands of agents, agents for everything, agents talking to each other. How do you navigate that complexity that is coming?

Steve: Yeah. So there's two parts to that. One is navigating the complexity, and the other is where is it going. So let me start with where it's going and then maybe come back to navigating the complexity. In terms of where it's going, I think the next frontier that everyone's excited about is context, memory, IQ. So we've got Foundry IQ, we've got Fabric IQ, we've got Work IQ. We've got these IQs growing inside Microsoft and across the industry, which is about, how do I bring that context? I gave the Alaska Airlines example of a company that's doing that today: taking your personal flight behavior, your personal vacation plans, and using that as context to personalize the outreach to you, in a way that you can opt into as a consumer. You can decide, do I want to do this or not. But it brings more context, which brings better answers, and it brings better insights.
And so that's the next frontier: not just agents that are doing what I tell them to, but agents that learn my behavior. The agent that could go in and automatically book those eight hours for me because it knows I need it, or at least recommend that so I can take an action on it. That's where it's going, and that's what I'm excited about. So then, how do you govern that? Right, there's this other side of, okay, I've got all these agents, I've got all these MCP servers in my organization, I've got the equivalent of a USB port for software that can plug and play any two systems together. How do I control that? And so one of the key things in Foundry, as well, is the Foundry control plane. This is our governance layer. It identifies all of the agentic services that you have created or registered with Foundry and begins to apply governance to that: first of all, letting you see who's creating it and who's using it, and then giving you controls over what data it can access and how it can access that. And so companies do need to be thoughtful about their governance structures, as well as embracing the opportunity of opening that up.
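For the curious, the "USB port for software" Steve refers to is the Model Context Protocol. Here is a minimal sketch of an MCP server, assuming the official mcp Python package; the calendar tool is hypothetical and returns canned data. It also shows why a governance layer wants a registry of these: any agent allowed to connect can discover and call the tool.

```python
# Minimal MCP server sketch, assuming the official `mcp` Python package
# (pip install mcp). One hypothetical tool with canned data; any
# MCP-capable agent that is permitted to connect can discover and call
# it, which is exactly what a governance layer needs to track.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("calendar-demo")

@mcp.tool()
def find_free_hours(hours_needed: int) -> str:
    """Return free slots in the week that add up to the requested hours."""
    # Stand-in for a real calendar lookup.
    return f"Found {hours_needed} free hours: Tue 9-11am, Thu 1-3pm"

if __name__ == "__main__":
    mcp.run()  # serves over stdio by default
```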
Peter: Yeah, it's going to be a wild ride. It has been already, just since ChatGPT exploded onto the market; the transition has been incredible. So we'll keep a close eye on it, and hopefully we'll see you back in New Zealand evangelizing this, because I think our businesses really do need to jump on board or they will be left behind. But thanks so much, Steve, for your time, and thanks for coming on the Business of Tech.

Steve: Yeah, thank you for having me.

Peter: I really appreciate it. So thanks to Steve Sweetman for coming on. Just after I recorded that interview with Steve, OpenAI, which Microsoft has a multi-billion-dollar investment in, released GPT-5.4, its latest model. It calls it its thinking model. Now, GPT-5.4 is interesting. It will provide you with an upfront plan of its thinking, so essentially you can change course mid-response while it's working and arrive at a final output that's closer to what you really want, which is pretty cool, and ideal for professional work. It went straight up onto Foundry after evaluation by Steve's team. The clear message I took away from the trip to Microsoft on the AI front is just the acceleration in model development and release, and how keeping on top of that is absolutely crucial for the credibility of a big company like Microsoft that is offering access to all of these models. All of the hyperscalers are doing it, and they've got to get it right. That's it for this episode of the Business of Tech with me, Peter Griffin. Thanks to Microsoft for inviting me to Redmond and covering my travel and accommodation. I also visited Microsoft's quantum computing labs, had a look at how they create their own high-performance silicon for their Azure data centers, and caught up with Microsoft President Brad Smith, so check out my LinkedIn page for updates on all of that. Thanks to our sponsor, 2degrees. If you found this useful, follow or subscribe in your favorite podcast app, and share it with someone in your organization who needs a pragmatic take on where AI is really heading. Thanks for listening, and I'll catch you next week.