1 00:00:03,040 --> 00:00:14,360 Speaker 1: Bloomberg Audio Studios, Podcasts, Radio, News. 2 00:00:18,040 --> 00:00:21,520 Speaker 2: Hello and welcome to another episode of the Odd Lots podcast. 3 00:00:21,600 --> 00:00:24,639 Speaker 3: I'm Joe Weisenthal and I'm Tracy Alloway. Tracy. 4 00:00:24,680 --> 00:00:29,120 Speaker 2: We're recording this February eleventh, and IGV, the software ETF, 5 00:00:29,160 --> 00:00:31,280 Speaker 2: down another three percent today. 6 00:00:31,600 --> 00:00:35,720 Speaker 4: It has been ugly in software. Everyone's throwing around the 7 00:00:35,760 --> 00:00:38,560 Speaker 4: term SaaS apocalypse. I mean, the great thing about SaaS 8 00:00:38,680 --> 00:00:40,160 Speaker 4: is there are a lot of things that like rhyme 9 00:00:40,200 --> 00:00:42,120 Speaker 4: with it, a lot of homonyms, so you can you 10 00:00:42,159 --> 00:00:46,240 Speaker 4: can make all those puns. Yeah, exactly, SaaS is trash, whatever. 11 00:00:46,600 --> 00:00:50,000 Speaker 4: But I'm looking at the share price of Salesforce, oh yeah, 12 00:00:50,000 --> 00:00:52,800 Speaker 4: in particular, because I always think of Salesforce as sort 13 00:00:52,840 --> 00:00:56,720 Speaker 4: of like the emblematic poster child. Yeah, poster child of like 14 00:00:56,760 --> 00:01:00,520 Speaker 4: a software company that I'm not really sure what they do, 15 00:01:01,080 --> 00:01:02,800 Speaker 4: but yeah, it's it's just ugly. 16 00:01:03,000 --> 00:01:05,160 Speaker 2: It's basically been cut in half, hasn't it, since it 17 00:01:05,200 --> 00:01:08,080 Speaker 2: peaked like in early twenty twenty five. Right now it's 18 00:01:08,120 --> 00:01:09,560 Speaker 2: one eighty four eighty 19 00:01:09,319 --> 00:01:12,039 Speaker 4: four, and it's all your fault, Joe. 20 00:01:12,040 --> 00:01:12,920 Speaker 3: It's all my fault.
21 00:01:13,000 --> 00:01:16,119 Speaker 2: That's right, because earlier in the year, after we got 22 00:01:16,160 --> 00:01:19,520 Speaker 2: back from Christmas vacation or Christmas break, you know, around 23 00:01:19,520 --> 00:01:22,920 Speaker 2: that, I'd seen everyone playing around with Claude Code, and 24 00:01:22,959 --> 00:01:25,080 Speaker 2: then I had to do it. We did an episode, 25 00:01:25,120 --> 00:01:27,360 Speaker 2: and so people were like, oh, if Joe Weisenthal can 26 00:01:27,600 --> 00:01:30,160 Speaker 2: like figure out Claude Code, then there must not be 27 00:01:30,200 --> 00:01:33,000 Speaker 2: any value to any of these companies at all. 28 00:01:33,080 --> 00:01:33,360 Speaker 5: There. 29 00:01:33,360 --> 00:01:36,680 Speaker 2: You know, you mentioned Salesforce. That's far from the ugliest one. 30 00:01:36,720 --> 00:01:38,959 Speaker 2: I'm looking at Atlassian, which makes a lot of like 31 00:01:39,000 --> 00:01:43,039 Speaker 2: workforce productivity products, like some Slack competitors and stuff. That 32 00:01:43,120 --> 00:01:44,880 Speaker 2: was a four hundred and fifty dollars stock back in 33 00:01:44,920 --> 00:01:48,040 Speaker 2: twenty twenty one. That's an eighty six dollars stock. So like, yeah, 34 00:01:48,040 --> 00:01:50,559 Speaker 2: it's ugly, and yeah, as you said, everyone is realizing 35 00:01:50,680 --> 00:01:54,720 Speaker 2: that if any old fool can write software, maybe these companies, yeah, 36 00:01:54,800 --> 00:01:55,720 Speaker 2: don't have much value. 37 00:01:55,840 --> 00:01:58,080 Speaker 4: I mean, I will just say it's not just software 38 00:01:58,360 --> 00:02:02,080 Speaker 4: right now. So we're seeing sort of a rolling series of 39 00:02:02,200 --> 00:02:06,920 Speaker 4: concerns where like every time AI does something or creates 40 00:02:07,040 --> 00:02:10,000 Speaker 4: some new product, it hits a particular industry.
So on 41 00:02:10,080 --> 00:02:14,240 Speaker 4: Monday it was the insurance industry, insurance brokers, and you 42 00:02:14,240 --> 00:02:17,800 Speaker 4: know today Wednesday, February eleventh, I think it's some of 43 00:02:17,840 --> 00:02:20,120 Speaker 4: the stockbroker firms. 44 00:02:19,840 --> 00:02:21,480 Speaker 2: And yeah, all you have to do is just say 45 00:02:21,520 --> 00:02:24,320 Speaker 2: AI and an industry and there's a, you know, it's really, there's 46 00:02:24,320 --> 00:02:26,360 Speaker 2: a lot of anxiety. But there's something that doesn't make 47 00:02:26,400 --> 00:02:28,200 Speaker 2: any sense to me about this, or the thing that 48 00:02:28,200 --> 00:02:29,200 Speaker 2: I'm wrapping my head around. 49 00:02:29,240 --> 00:02:30,520 Speaker 3: It's like, sure, any of 50 00:02:30,520 --> 00:02:34,079 Speaker 2: us could easily like write some software, but like writing 51 00:02:34,120 --> 00:02:36,160 Speaker 2: software is a cost center for these companies. 52 00:02:36,360 --> 00:02:36,520 Speaker 1: Right. 53 00:02:36,720 --> 00:02:40,440 Speaker 2: If you're Salesforce and you can trivially reduce the cost 54 00:02:40,480 --> 00:02:43,600 Speaker 2: of building software, that's also a benefit for you. And 55 00:02:43,639 --> 00:02:45,680 Speaker 2: there's a lot more to a software company than just 56 00:02:45,760 --> 00:02:49,080 Speaker 2: code generation, because there's all kinds of, you know, network 57 00:02:49,080 --> 00:02:52,079 Speaker 2: effects and links into this. It's like a software company 58 00:02:52,120 --> 00:02:55,000 Speaker 2: is clearly more than just code, and so the fact 59 00:02:55,040 --> 00:02:57,959 Speaker 2: that maybe code can be generated a lot cheaper does 60 00:02:58,040 --> 00:03:00,400 Speaker 2: not scream to me, like, oh, these companies are worth 61 00:03:00,480 --> 00:03:01,040 Speaker 2: less, in my view. 62 00:03:01,360 --> 00:03:04,639 Speaker 4: Sure, but at the same time they've been pricing.
Their 63 00:03:04,680 --> 00:03:08,040 Speaker 4: pricing is based on that assumption, right, like that there 64 00:03:08,120 --> 00:03:10,480 Speaker 4: is no competitor for what they're doing, and suddenly you 65 00:03:10,560 --> 00:03:12,000 Speaker 4: might have an in house competitor. 66 00:03:12,880 --> 00:03:15,360 Speaker 2: Absolutely. But you know, it's like network effects, and do 67 00:03:15,440 --> 00:03:18,280 Speaker 2: companies want to start like building their own like payroll 68 00:03:18,480 --> 00:03:21,760 Speaker 2: software? Anyway, I have a lot of questions about this 69 00:03:21,880 --> 00:03:23,200 Speaker 2: sell off, and to your point. 70 00:03:23,120 --> 00:03:27,000 Speaker 4: No, no, no, this is you doing like penance first, 71 00:03:27,240 --> 00:03:29,720 Speaker 4: for causing the sell off. 72 00:03:29,760 --> 00:03:31,640 Speaker 2: All right, let's talk to someone who actually might be 73 00:03:31,639 --> 00:03:35,120 Speaker 2: able to answer some of these questions for us. We're 74 00:03:35,160 --> 00:03:37,240 Speaker 2: gonna be speaking to someone who's been in the software space, 75 00:03:37,240 --> 00:03:39,160 Speaker 2: an investor in the software space for a long time, 76 00:03:39,200 --> 00:03:43,880 Speaker 2: recently put out a great deck really diving into sort 77 00:03:43,880 --> 00:03:46,800 Speaker 2: of the SaaS apocalypse and what kinds of companies are 78 00:03:46,840 --> 00:03:49,640 Speaker 2: thriving and what kinds of companies were struggling even before 79 00:03:50,000 --> 00:03:52,840 Speaker 2: everyone started talking about AI code generation and all that. 80 00:03:53,080 --> 00:03:55,160 Speaker 2: We're gonna be speaking with Jared Sleeper. He is a 81 00:03:55,160 --> 00:03:58,320 Speaker 2: partner at Avenue, which does growth investing in private companies. 82 00:03:58,440 --> 00:03:59,920 Speaker 3: So, Jared, thanks for coming on Odd Lots.
83 00:04:00,240 --> 00:04:01,720 Speaker 5: Yeah, my pleasure. Excited to be here. 84 00:04:01,720 --> 00:04:03,440 Speaker 2: Why are we talking to you? Just, you know, for 85 00:04:03,480 --> 00:04:06,400 Speaker 2: our listeners. Apparently your first time on a podcast, which 86 00:04:06,440 --> 00:04:07,840 Speaker 2: is crazy. But why are we talking to you? Give us 87 00:04:07,840 --> 00:04:08,720 Speaker 2: a little bit about 88 00:04:08,480 --> 00:04:11,520 Speaker 3: your background investing in software and understanding the space. 89 00:04:11,600 --> 00:04:14,880 Speaker 6: Yeah, my pleasure. So I think one thing that makes 90 00:04:14,920 --> 00:04:17,400 Speaker 6: me a little bit different in the investor world is 91 00:04:17,400 --> 00:04:21,040 Speaker 6: that I've spent time investing in early stage startups, public companies, 92 00:04:21,080 --> 00:04:22,719 Speaker 6: and everything in between. So I spent a chunk of 93 00:04:22,720 --> 00:04:24,920 Speaker 6: my career at an early stage venture fund in Boston 94 00:04:24,960 --> 00:04:28,039 Speaker 6: called Matrix Partners, working with an OG SaaS investor named 95 00:04:28,080 --> 00:04:30,279 Speaker 6: David Skok, and then was also at Coatue, where 96 00:04:30,279 --> 00:04:32,800 Speaker 6: I ran public software. And so I kind of have 97 00:04:32,920 --> 00:04:36,560 Speaker 6: this like experience across a spectrum from ground floor startups 98 00:04:36,560 --> 00:04:38,280 Speaker 6: to looking at the big public companies, which I've done 99 00:04:38,279 --> 00:04:39,080 Speaker 6: for the last ten years. 100 00:04:39,080 --> 00:04:40,480 Speaker 3: Perfect guest. Perfect guest. 101 00:04:40,560 --> 00:04:43,480 Speaker 4: So give us some color on the mood in software 102 00:04:43,560 --> 00:04:46,120 Speaker 4: at the moment. Are people, like, I don't know, hunkering 103 00:04:46,160 --> 00:04:48,000 Speaker 4: down in their bunkers? How bad is it?
104 00:04:48,279 --> 00:04:48,520 Speaker 5: Yeah? 105 00:04:48,560 --> 00:04:51,800 Speaker 6: I get texted constantly from folks on the buy side, just, 106 00:04:52,240 --> 00:04:55,760 Speaker 6: you know, retrenching. I can't believe this is happening. It can't 107 00:04:55,920 --> 00:04:58,120 Speaker 6: go lower. I keep saying that. It's the one hundredth time 108 00:04:58,160 --> 00:05:00,600 Speaker 6: I bought the dip. You used the SaaS apocalypse; like, 109 00:05:00,680 --> 00:05:05,560 Speaker 6: SaaS-tastrophe is mine. It's definitely one of those moments. 110 00:05:05,560 --> 00:05:07,239 Speaker 6: And we were talking about this a little bit earlier 111 00:05:07,240 --> 00:05:09,720 Speaker 6: before starting. But one of the things about software that's 112 00:05:09,760 --> 00:05:12,240 Speaker 6: really fascinating is there's very few folks, even on the 113 00:05:12,240 --> 00:05:15,960 Speaker 6: buy side, who really understand how software works. It's one 114 00:05:15,960 --> 00:05:19,040 Speaker 6: of those Rorschach test kind of sectors where 115 00:05:19,040 --> 00:05:21,880 Speaker 6: almost no one's logged into Salesforce and clicked around, much 116 00:05:21,920 --> 00:05:24,720 Speaker 6: less been a Salesforce admin and understood the full complexity. 117 00:05:25,080 --> 00:05:27,839 Speaker 6: And so when there's panic, there's not a lot of 118 00:05:27,880 --> 00:05:30,719 Speaker 6: support for the stocks, and people, you know, get scared 119 00:05:30,800 --> 00:05:31,280 Speaker 6: very easily. 120 00:05:31,560 --> 00:05:32,760 Speaker 3: Well, explain what this means.
121 00:05:32,800 --> 00:05:37,279 Speaker 2: So, for example, in a lot of companies, it's like 122 00:05:37,560 --> 00:05:40,360 Speaker 2: you're saying that the people who invest or trade these stocks, 123 00:05:40,720 --> 00:05:45,120 Speaker 2: they just know them as financial tables basically, and they 124 00:05:45,120 --> 00:05:47,280 Speaker 2: have some idea of their financials and some idea of 125 00:05:47,279 --> 00:05:49,599 Speaker 2: their customer base, et cetera. But they don't have like 126 00:05:49,640 --> 00:05:53,760 Speaker 2: a great intuition for the product, unlike, say, you know, 127 00:05:54,120 --> 00:05:56,360 Speaker 2: people who use Instagram and therefore might have a feel 128 00:05:56,400 --> 00:05:57,320 Speaker 2: about Meta, for example. 129 00:05:57,360 --> 00:05:59,440 Speaker 6: Yeah, if you're an investor in Lululemon, you have 130 00:05:59,480 --> 00:06:01,680 Speaker 6: a pretty solid conception of what that business is. You 131 00:06:01,680 --> 00:06:04,360 Speaker 6: can go into the store. Yoga pants, exactly. You can buy 132 00:06:04,360 --> 00:06:06,400 Speaker 6: the product, ship it to yourself. If you're an 133 00:06:06,400 --> 00:06:10,040 Speaker 6: investor in Veeva, which makes CRM software for pharmaceutical reps, 134 00:06:10,279 --> 00:06:12,760 Speaker 6: I bet you there's almost no investors in Veeva who 135 00:06:12,800 --> 00:06:15,600 Speaker 6: have ever been inside the product even once, much less 136 00:06:15,720 --> 00:06:17,480 Speaker 6: used it on a day to day basis and understood 137 00:06:17,480 --> 00:06:17,960 Speaker 6: how it works.
138 00:06:18,720 --> 00:06:21,320 Speaker 4: So I'm going to go way back in time and start, 139 00:06:21,480 --> 00:06:23,760 Speaker 4: I guess, at the very beginning. But why is it that 140 00:06:23,920 --> 00:06:28,239 Speaker 4: software like this, you know, payment management systems, whatever, why 141 00:06:28,279 --> 00:06:31,440 Speaker 4: were they historically not developed in house? 142 00:06:31,640 --> 00:06:31,760 Speaker 3: Like? 143 00:06:31,800 --> 00:06:33,920 Speaker 4: How did we get this model where we have these 144 00:06:34,000 --> 00:06:37,680 Speaker 4: huge software companies that are really, you know, to date 145 00:06:37,920 --> 00:06:40,320 Speaker 4: have been really integral to a lot of businesses? 146 00:06:40,520 --> 00:06:42,760 Speaker 6: Yeah, it's a great question. You know, back in the 147 00:06:42,880 --> 00:06:44,880 Speaker 6: very early days of software, like back in the seventies 148 00:06:44,960 --> 00:06:46,960 Speaker 6: or eighties, there was a lot done in house, and 149 00:06:46,960 --> 00:06:49,800 Speaker 6: we've seen a very clear mix shift over time towards using 150 00:06:49,839 --> 00:06:53,000 Speaker 6: third party software. And what it comes down to is 151 00:06:53,080 --> 00:06:56,720 Speaker 6: the software was expensive to build and maintain, and there's 152 00:06:56,720 --> 00:07:00,200 Speaker 6: this need for an ecosystem of integrations around it, which 153 00:07:00,200 --> 00:07:02,640 Speaker 6: are also expensive to build and maintain.
And so if 154 00:07:02,640 --> 00:07:04,440 Speaker 6: you look at a software company, it can afford to 155 00:07:04,440 --> 00:07:08,880 Speaker 6: have one, two, three thousand engineers plus partnership teams, et cetera, 156 00:07:09,080 --> 00:07:11,960 Speaker 6: all working to build the perfect piece of software for 157 00:07:12,040 --> 00:07:15,320 Speaker 6: a given application. And then, what's striking, and this will 158 00:07:15,320 --> 00:07:17,240 Speaker 6: come up a lot more in this conversation, is not 159 00:07:17,360 --> 00:07:18,640 Speaker 6: selling it for that much money. 160 00:07:18,840 --> 00:07:19,040 Speaker 5: Right. 161 00:07:19,160 --> 00:07:21,760 Speaker 6: A lot of software companies report a stat, which is 162 00:07:21,840 --> 00:07:23,560 Speaker 6: the share of our customers that pay us more than 163 00:07:23,600 --> 00:07:26,239 Speaker 6: one hundred thousand dollars a year. And one hundred thousand 164 00:07:26,280 --> 00:07:28,360 Speaker 6: dollars a year is less than half of the fully 165 00:07:28,400 --> 00:07:31,560 Speaker 6: loaded cost of a software engineer, right. And so the 166 00:07:31,640 --> 00:07:34,520 Speaker 6: software model was: build a product that can be applied 167 00:07:34,560 --> 00:07:36,920 Speaker 6: to thousands of customers and it's the same product for 168 00:07:37,000 --> 00:07:39,680 Speaker 6: every customer, and then sell it to them for way 169 00:07:39,760 --> 00:07:42,160 Speaker 6: cheaper than they could ever hope to build it themselves, 170 00:07:42,200 --> 00:07:43,680 Speaker 6: even less than the cost of one employee. 171 00:07:44,080 --> 00:07:46,880 Speaker 2: Okay, I'd love to just talk long term software history, 172 00:07:46,920 --> 00:07:49,480 Speaker 2: even before, you know, we think a lot about SaaS 173 00:07:49,520 --> 00:07:51,760 Speaker 2: and these startups and stuff like that.
But like a 174 00:07:51,800 --> 00:07:54,320 Speaker 2: lot of the big companies that we think of in software, 175 00:07:54,520 --> 00:08:00,440 Speaker 2: especially like pre Salesforce, whether it's like SAP, Oracle, Microsoft obviously, 176 00:08:00,600 --> 00:08:03,480 Speaker 2: aren't there a bunch of third party companies whose job 177 00:08:03,600 --> 00:08:05,680 Speaker 2: is to just like help install it for you? Yeah, 178 00:08:05,840 --> 00:08:08,640 Speaker 2: like an SAP install. And that'll be a totally separate company, 179 00:08:08,640 --> 00:08:11,840 Speaker 2: because it's so big and it's so unwieldy and complex 180 00:08:12,040 --> 00:08:14,480 Speaker 2: that you actually you can't just like install it yourself, 181 00:08:14,560 --> 00:08:16,280 Speaker 2: or it has to be customized or whatever. 182 00:08:17,000 --> 00:08:19,120 Speaker 6: And there's two parts to that which I think are important. 183 00:08:19,360 --> 00:08:23,080 Speaker 6: One is the integrations into your existing systems. Right, a 184 00:08:23,120 --> 00:08:26,960 Speaker 6: lot of big old companies have old databases, old applications, 185 00:08:26,960 --> 00:08:29,520 Speaker 6: and it's important for everything to be stitched together. So 186 00:08:29,560 --> 00:08:32,640 Speaker 6: you need software engineers and, you know, consultants to go 187 00:08:32,679 --> 00:08:35,200 Speaker 6: in and understand those existing systems and kind of get 188 00:08:35,240 --> 00:08:38,000 Speaker 6: them linked up to the new systems. But the other one, 189 00:08:38,000 --> 00:08:41,960 Speaker 6: which is probably bigger, is just people management and change management. 190 00:08:42,200 --> 00:08:44,720 Speaker 6: You know, any software system is the combination of the 191 00:08:44,760 --> 00:08:48,040 Speaker 6: code and all of the individual users who have learned 192 00:08:48,040 --> 00:08:49,640 Speaker 6: how to use it.
If you're trying to change out 193 00:08:49,679 --> 00:08:52,880 Speaker 6: your CRM at a company, that means training every single 194 00:08:52,960 --> 00:08:55,840 Speaker 6: sales rep on how to use the new CRM and 195 00:08:55,880 --> 00:08:58,120 Speaker 6: getting it right. And if they get it wrong, then 196 00:08:58,160 --> 00:09:01,040 Speaker 6: you lose deals that quarter. And so, you know, one 197 00:09:01,080 --> 00:09:03,439 Speaker 6: of the kind of tropes in investing is if you 198 00:09:03,480 --> 00:09:06,400 Speaker 6: see a company that's doing an ERP transition. ERP stands 199 00:09:06,400 --> 00:09:09,360 Speaker 6: for Enterprise Resource Planning. It's the kind of core software, 200 00:09:09,480 --> 00:09:13,000 Speaker 6: accounting, you know, supply chain, et cetera. That company's probably 201 00:09:13,000 --> 00:09:14,719 Speaker 6: going to miss its earnings over the next one or 202 00:09:14,720 --> 00:09:18,080 Speaker 6: two quarters, because those transitions are so painful. And so, yes, 203 00:09:18,120 --> 00:09:21,280 Speaker 6: there's a big consulting complex around it that does its 204 00:09:21,320 --> 00:09:23,520 Speaker 6: best to come in and parachute in the talent that's 205 00:09:23,559 --> 00:09:26,200 Speaker 6: required to make those transitions smooth. And that tells you 206 00:09:26,240 --> 00:09:28,800 Speaker 6: something about what makes software so sticky, or at least 207 00:09:28,800 --> 00:09:29,560 Speaker 6: has historically. 208 00:09:30,400 --> 00:09:32,720 Speaker 4: It's third party agents all the way down, I feel. 209 00:09:32,880 --> 00:09:35,840 Speaker 4: But actually, on this note, so we hear the integration 210 00:09:36,000 --> 00:09:38,719 Speaker 4: point brought up a lot, and I think the very 211 00:09:38,760 --> 00:09:41,000 Speaker 4: first episode we did on Claude Code we talked a 212 00:09:41,000 --> 00:09:43,240 Speaker 4: little bit about it as well.
But like, if you 213 00:09:43,320 --> 00:09:45,800 Speaker 4: have something like Claude Code, where you can just give 214 00:09:45,800 --> 00:09:49,520 Speaker 4: it permissions to make changes to your computer, does some 215 00:09:49,600 --> 00:09:53,760 Speaker 4: of that integration expertise actually start to go away? Because 216 00:09:53,840 --> 00:09:57,760 Speaker 4: presumably we are going to get AI, I would assume 217 00:09:57,840 --> 00:10:00,960 Speaker 4: at some point, given the rate that it's developed and improving, 218 00:10:01,440 --> 00:10:03,600 Speaker 4: that we'll be able to do this, like plug itself 219 00:10:03,640 --> 00:10:04,720 Speaker 4: into various systems. 220 00:10:04,840 --> 00:10:07,840 Speaker 6: Yeah, one hundred percent. I think the challenge of writing 221 00:10:07,920 --> 00:10:12,560 Speaker 6: the code for the integrations is going away. That's not 222 00:10:12,800 --> 00:10:15,560 Speaker 6: the bulk of the challenge for a majority of integrations. 223 00:10:15,559 --> 00:10:19,480 Speaker 6: It's about really deeply understanding the prior system and how 224 00:10:19,480 --> 00:10:21,880 Speaker 6: it maps to the new system. And the reality is, 225 00:10:21,880 --> 00:10:25,760 Speaker 6: within most organizations that's a human problem. It's, hey, this 226 00:10:25,880 --> 00:10:30,640 Speaker 6: column says status two thousand and four, what does that mean? Like, 227 00:10:30,679 --> 00:10:32,720 Speaker 6: how does that map to the new system that we're building? 228 00:10:32,720 --> 00:10:34,199 Speaker 6: So you have to go talk to someone and understand it.
229 00:10:34,280 --> 00:10:37,280 Speaker 6: And so there's certain types of integrations where I think 230 00:10:37,320 --> 00:10:40,040 Speaker 6: they're effectively solved problems now, because you can write a 231 00:10:40,120 --> 00:10:42,319 Speaker 6: quick prompt into ChatGPT or Claude Code and get 232 00:10:42,360 --> 00:10:44,880 Speaker 6: a perfectly written piece of software to make it happen. 233 00:10:45,120 --> 00:10:47,600 Speaker 6: And then there's others that are just fundamentally human problems, 234 00:10:47,640 --> 00:10:49,960 Speaker 6: because the data doesn't exist in digital space. 235 00:10:50,480 --> 00:10:52,720 Speaker 2: Let's talk more about that, because really it is pretty 236 00:10:52,720 --> 00:10:57,280 Speaker 2: extraordinary the degree to which, I don't know, it's the 237 00:10:57,640 --> 00:10:59,840 Speaker 2: working code. I don't know if it's high quality code, 238 00:11:00,160 --> 00:11:04,240 Speaker 2: but certainly these models can generate working code, and it's 239 00:11:04,280 --> 00:11:06,960 Speaker 2: just, it blows my mind whenever I use it. But 240 00:11:07,080 --> 00:11:08,880 Speaker 2: talk to us a little bit more, from the 241 00:11:08,920 --> 00:11:13,040 Speaker 2: perspective of various software vendors, and I'm sure there's a 242 00:11:13,160 --> 00:11:15,520 Speaker 2: range, about what they're selling and how much is it 243 00:11:15,600 --> 00:11:18,200 Speaker 2: code versus how much is it other stuff, and which 244 00:11:18,200 --> 00:11:22,400 Speaker 2: ones are more exposed to the pure, like, code generation ability. 245 00:11:22,520 --> 00:11:24,600 Speaker 6: Yeah, it's a great question, and you're one hundred percent right. 246 00:11:24,640 --> 00:11:27,160 Speaker 6: It's producing working code, and frankly it has been for 247 00:11:27,280 --> 00:11:29,560 Speaker 6: the last year or so.
I built my first Lovable 248 00:11:29,600 --> 00:11:32,800 Speaker 6: app that was working in production about a year ago, 249 00:11:33,120 --> 00:11:35,040 Speaker 6: and it's even intensified in the last three months. 250 00:11:35,160 --> 00:11:35,319 Speaker 1: Right. 251 00:11:35,800 --> 00:11:38,960 Speaker 6: I think when people buy software, there's a set of 252 00:11:38,960 --> 00:11:40,839 Speaker 6: things that they're buying. One thing that I think is 253 00:11:40,840 --> 00:11:43,920 Speaker 6: important for everyone to understand is that open source software has 254 00:11:43,960 --> 00:11:45,920 Speaker 6: been a thing, and there have been free, open source 255 00:11:46,000 --> 00:11:49,040 Speaker 6: versions of almost any software you could buy for all 256 00:11:49,080 --> 00:11:52,480 Speaker 6: of recorded history. There's actually some companies that are public 257 00:11:52,600 --> 00:11:55,480 Speaker 6: that built their businesses packaging that open source software and 258 00:11:55,520 --> 00:11:58,680 Speaker 6: adding a few custom features and then support on top 259 00:11:58,720 --> 00:12:01,760 Speaker 6: of it. Because if a company is reliant on an open 260 00:12:01,800 --> 00:12:04,959 Speaker 6: source database, or a company like Elastic with its Elasticsearch 261 00:12:05,000 --> 00:12:08,079 Speaker 6: product, which is an infrastructure tool, and it breaks, 262 00:12:08,400 --> 00:12:11,000 Speaker 6: they need someone to call, both for CYA reasons and 263 00:12:11,120 --> 00:12:14,040 Speaker 6: because it can be very complex and technical and they 264 00:12:14,040 --> 00:12:17,600 Speaker 6: need to quickly understand it. And so that has been 265 00:12:18,040 --> 00:12:20,520 Speaker 6: a big part of the story historically, is that need 266 00:12:20,559 --> 00:12:23,319 Speaker 6: to, you know, have support.
Another thing that you sell 267 00:12:23,360 --> 00:12:25,440 Speaker 6: when you sell as a software vendor is what I 268 00:12:25,520 --> 00:12:29,360 Speaker 6: call herd familiarity, which means everyone on Earth knows how 269 00:12:29,400 --> 00:12:31,800 Speaker 6: to use your software, which just simplifies the training and 270 00:12:31,920 --> 00:12:34,000 Speaker 6: onboarding workflow. I'll give a few examples, because I'm sure 271 00:12:34,000 --> 00:12:35,400 Speaker 6: it's a new term for listeners, since I 272 00:12:35,320 --> 00:12:35,800 Speaker 5: made it up. 273 00:12:36,040 --> 00:12:40,280 Speaker 6: You know, Zoom is a great business. Microsoft has been 274 00:12:40,280 --> 00:12:43,560 Speaker 6: giving away a free version of the product forever in Teams. 275 00:12:43,800 --> 00:12:47,560 Speaker 6: Why do people use Zoom? Because in certain industries almost 276 00:12:47,559 --> 00:12:50,240 Speaker 6: everyone knows how to use Zoom. They have their Zoom setup, 277 00:12:50,320 --> 00:12:52,679 Speaker 6: they have their virtual background chosen. They're not going to 278 00:12:52,720 --> 00:12:54,839 Speaker 6: fumble around for the first minute or two on the call, 279 00:12:55,120 --> 00:12:57,680 Speaker 6: and that's well worth the twenty dollars a month to 280 00:12:57,720 --> 00:13:00,240 Speaker 6: have a Zoom plan. But that applies to lots of 281 00:13:00,280 --> 00:13:03,640 Speaker 6: other areas as well. So think about Microsoft Excel, for example. 282 00:13:04,440 --> 00:13:06,240 Speaker 6: You might be able to use Google Sheets to do 283 00:13:06,320 --> 00:13:08,719 Speaker 6: the same thing, but do you really want to retrain every 284 00:13:08,760 --> 00:13:11,520 Speaker 6: person who comes in on the Google Sheets shortcuts versus 285 00:13:11,640 --> 00:13:14,560 Speaker 6: Excel shortcuts? It's not a good use of time, especially 286 00:13:14,559 --> 00:13:17,000 Speaker 6: when the software is already so cheap.
And so that's 287 00:13:17,040 --> 00:13:19,560 Speaker 6: another plank in what people are buying when they buy 288 00:13:19,600 --> 00:13:22,160 Speaker 6: software: the standardization and the knowledge that they'll be 289 00:13:22,200 --> 00:13:24,920 Speaker 6: able to hire employees who have that. And then there's 290 00:13:24,960 --> 00:13:27,800 Speaker 6: things like brand, again, the kind of ecosystem that comes 291 00:13:27,800 --> 00:13:29,840 Speaker 6: around it, and so it really is more than just 292 00:13:29,960 --> 00:13:30,720 Speaker 6: the raw code. 293 00:13:30,880 --> 00:13:33,440 Speaker 4: We've been joking about this, but the idea of software 294 00:13:33,520 --> 00:13:37,280 Speaker 4: companies' value lying in being a scapegoat essentially for when 295 00:13:37,320 --> 00:13:40,640 Speaker 4: things go wrong is kind of funny and dystopian, I 296 00:13:40,679 --> 00:13:41,680 Speaker 4: think, in many ways. 297 00:13:42,000 --> 00:13:45,360 Speaker 6: Yeah, I mean, I think, you know, it's a real fear, 298 00:13:45,600 --> 00:13:47,679 Speaker 6: right? And the way I think about it is, there 299 00:13:47,679 --> 00:13:51,320 Speaker 6: are two arguments against software right now. One is the 300 00:13:51,320 --> 00:13:53,160 Speaker 6: world is going to stay the same, but software is just 301 00:13:53,160 --> 00:13:54,960 Speaker 6: going to get a lot cheaper over time now that 302 00:13:54,960 --> 00:13:56,880 Speaker 6: it's cheaper to build. And I think there's no one 303 00:13:56,920 --> 00:13:59,000 Speaker 6: who would argue that it's not gotten dramatically cheaper to 304 00:13:59,000 --> 00:14:01,320 Speaker 6: build, for reasons that we laid out in our deck 305 00:14:01,360 --> 00:14:03,160 Speaker 6: and we can talk through it more. We don't buy 306 00:14:03,200 --> 00:14:06,160 Speaker 6: that argument. I don't buy that argument.
But the second 307 00:14:06,280 --> 00:14:08,400 Speaker 6: is the world's about to get really weird and the 308 00:14:08,440 --> 00:14:10,840 Speaker 6: way that knowledge work happens is going to change. And 309 00:14:10,920 --> 00:14:13,320 Speaker 6: if we think out three, four, or five years, who 310 00:14:13,320 --> 00:14:15,559 Speaker 6: knows if there will even be customer support reps or 311 00:14:15,600 --> 00:14:19,280 Speaker 6: sales reps or software engineers. And I think that's what's 312 00:14:19,360 --> 00:14:21,440 Speaker 6: causing the kind of hit to the share prices lately, 313 00:14:21,520 --> 00:14:23,360 Speaker 6: is this terminal value concern. 314 00:14:23,760 --> 00:14:24,920 Speaker 3: Yeah, it was interesting. 315 00:14:25,000 --> 00:14:28,040 Speaker 2: So one of the companies that's been associated with the, uh, 316 00:14:28,120 --> 00:14:29,600 Speaker 2: what did you say, SaaS-tastrophe? 317 00:14:29,840 --> 00:14:31,400 Speaker 3: One of those companies that's been 318 00:14:31,280 --> 00:14:34,680 Speaker 2: caught up with this: Blue Owl, the private investing firm, 319 00:14:34,720 --> 00:14:37,400 Speaker 2: private credit. I read through their conference call, and their 320 00:14:37,480 --> 00:14:40,160 Speaker 2: CEO was like, not only do we not see red lights, 321 00:14:40,320 --> 00:14:42,480 Speaker 2: not only do we not even see yellow lights, we 322 00:14:42,520 --> 00:14:44,560 Speaker 2: actually see a lot of green lights, which I think 323 00:14:44,640 --> 00:14:46,080 Speaker 2: is really interesting, 324 00:14:45,920 --> 00:14:48,160 Speaker 3: because it can fit with this idea of.
325 00:14:48,560 --> 00:14:51,000 Speaker 2: This year could be fine, next year could be fine, 326 00:14:51,080 --> 00:14:52,960 Speaker 2: the year after that could be fine, and then the year 327 00:14:52,960 --> 00:14:55,200 Speaker 2: after that could be zero. Or at least that's the 328 00:14:55,240 --> 00:14:57,480 Speaker 2: anxiety, that there's this terminal value concern. 329 00:14:57,800 --> 00:14:58,720 Speaker 4: That's like a cliff risk. 330 00:14:58,760 --> 00:15:01,480 Speaker 5: Yeah, there's this cliff, yeah. I think it's really helpful. 331 00:15:01,680 --> 00:15:03,560 Speaker 6: You know, this is our second iteration of the deck, 332 00:15:03,600 --> 00:15:05,560 Speaker 6: and so we kind of force ourselves to recenter on 333 00:15:05,720 --> 00:15:09,760 Speaker 6: what actually happened since the last deck, right? And there's 334 00:15:09,760 --> 00:15:12,120 Speaker 6: a very clear pattern in software and what happened over 335 00:15:12,160 --> 00:15:15,640 Speaker 6: the last five years, which is the pandemic. People freaked 336 00:15:15,640 --> 00:15:17,480 Speaker 6: out at the beginning, but it was rapidly clear that 337 00:15:17,480 --> 00:15:20,080 Speaker 6: it was an accelerant for SaaS as everyone tried to 338 00:15:20,120 --> 00:15:23,040 Speaker 6: digitize their companies, and so you had a spike in 339 00:15:23,080 --> 00:15:25,680 Speaker 6: the growth rate and net retention of the businesses. It 340 00:15:25,720 --> 00:15:28,520 Speaker 6: peaked at just over forty percent in twenty twenty one 341 00:15:28,560 --> 00:15:31,920 Speaker 6: for the median software company. That's really nice annualized growth. 342 00:15:32,600 --> 00:15:35,040 Speaker 6: And then there was a hangover and that slowed down, 343 00:15:35,280 --> 00:15:37,880 Speaker 6: and we wrote eighteen months ago that that reflected the 344 00:15:37,880 --> 00:15:41,360 Speaker 6: sector sort of maturing.
The adoption had just slowed down 345 00:15:41,400 --> 00:15:43,680 Speaker 6: because most folks had adopted the software that they needed 346 00:15:43,880 --> 00:15:46,280 Speaker 6: under the pressure of the pandemic. And so for the 347 00:15:46,360 --> 00:15:49,200 Speaker 6: last few years after that, we saw this degradation in 348 00:15:49,240 --> 00:15:52,120 Speaker 6: growth rates across the sector. By the beginning of last year, 349 00:15:52,400 --> 00:15:55,040 Speaker 6: the median company was growing eighteen percent instead of forty percent, 350 00:15:55,040 --> 00:15:58,160 Speaker 6: so you saw a pretty significant drawdown. What's fascinating 351 00:15:58,200 --> 00:16:00,760 Speaker 6: is that if you look at the actual financial performance 352 00:16:00,800 --> 00:16:03,000 Speaker 6: of the companies in the last year, it's been pretty good. 353 00:16:03,440 --> 00:16:05,720 Speaker 6: That growth rate has held. It was eighteen percent again 354 00:16:05,800 --> 00:16:08,720 Speaker 6: in Q three. Net retention has also been consistent at 355 00:16:08,720 --> 00:16:11,200 Speaker 6: about one hundred and ten percent. That's revenue from existing 356 00:16:11,400 --> 00:16:14,400 Speaker 6: customers over the same revenue from those customers in the prior year. 357 00:16:14,640 --> 00:16:16,880 Speaker 6: So there's not a churn issue developing or a lack 358 00:16:16,920 --> 00:16:19,120 Speaker 6: of expansion within the customer base, and a lot of 359 00:16:19,160 --> 00:16:22,480 Speaker 6: the companies are actually accelerating growth or guiding to accelerating growth. 360 00:16:22,520 --> 00:16:24,200 Speaker 6: We have a chart showing the number of those companies 361 00:16:24,240 --> 00:16:27,120 Speaker 6: has increased in each of the last three successive quarters. 362 00:16:27,400 --> 00:16:30,120 Speaker 6: And so there's a lot going on right now with 363 00:16:30,160 --> 00:16:32,800 Speaker 6: the terminal value.
But it's very hard to argue that 364 00:16:32,840 --> 00:16:35,040 Speaker 6: this is something that's happening today and showing up in 365 00:16:35,080 --> 00:16:37,720 Speaker 6: the numbers. The thing is investors are sharp, right, and 366 00:16:37,720 --> 00:16:39,160 Speaker 6: they're constantly looking at the bookings. 367 00:16:39,320 --> 00:16:39,520 Speaker 5: Yeah. 368 00:16:39,520 --> 00:16:41,280 Speaker 6: I mean, look at Chegg, right, which went down very 369 00:16:41,320 --> 00:16:44,280 Speaker 6: quickly in the aftermath of ChatGPT coming out, and 370 00:16:44,320 --> 00:16:45,480 Speaker 6: that was completely correct. 371 00:16:45,840 --> 00:16:46,040 Speaker 5: Right. 372 00:16:46,120 --> 00:16:48,400 Speaker 6: Investors were ahead of that, and of course for the 373 00:16:48,400 --> 00:16:51,240 Speaker 6: first few quarters the management team of Chegg, you know, 374 00:16:51,560 --> 00:16:53,760 Speaker 6: had their heads in the sand, but then it became 375 00:16:53,840 --> 00:16:55,760 Speaker 6: clear that it really was existential to their business. 376 00:16:55,840 --> 00:16:57,040 Speaker 4: That's a fun chart. 377 00:16:58,160 --> 00:17:01,440 Speaker 2: I thought I was looking at a typo because I saw, wow, 378 00:17:01,840 --> 00:17:04,480 Speaker 2: that was a near one hundred dollars stock in February 379 00:17:04,520 --> 00:17:05,280 Speaker 2: twenty twenty one. 380 00:17:05,600 --> 00:17:07,159 Speaker 3: It is now a sixty-one-cent 381 00:17:07,000 --> 00:17:09,119 Speaker 6: stock. And you have to give the markets credit. Like, 382 00:17:09,160 --> 00:17:11,240 Speaker 6: the second ChatGPT came out, people were like, 383 00:17:11,320 --> 00:17:13,480 Speaker 6: this company's in big trouble. They didn't wait for it 384 00:17:13,520 --> 00:17:16,199 Speaker 6: to hit the financial results, and so there is signal 385 00:17:16,240 --> 00:17:17,000 Speaker 6: in what people think.
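The net retention figure cited above is a simple ratio; a minimal sketch in Python, using illustrative numbers rather than figures from the episode:

```python
def net_revenue_retention(current_revenue: float, prior_revenue: float) -> float:
    """Revenue from existing customers today, divided by revenue
    from those same customers a year earlier."""
    return current_revenue / prior_revenue

# A cohort that paid $1.0M last year and pays $1.1M now sits at 110%,
# the steady median figure mentioned in the conversation.
nrr = net_revenue_retention(1_100_000, 1_000_000)
print(f"{nrr:.0%}")  # 110%
```

Anything above 100 percent means existing customers are expanding rather than churning, which is why a flat 110 percent argues against a churn problem showing up in the numbers yet.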
386 00:17:32,800 --> 00:17:35,400 Speaker 4: I have a bunch more questions, but just briefly, where 387 00:17:35,440 --> 00:17:37,879 Speaker 4: does data actually fit into all of this? Because the 388 00:17:37,960 --> 00:17:40,840 Speaker 4: other thing we hear about AI is maybe the models 389 00:17:40,880 --> 00:17:43,719 Speaker 4: don't matter that much, but it's the actual data that 390 00:17:43,800 --> 00:17:47,800 Speaker 4: you have access to. And I imagine the customers themselves 391 00:17:47,920 --> 00:17:50,960 Speaker 4: of SaaS companies, they have their own data. Do the 392 00:17:51,000 --> 00:17:53,000 Speaker 4: SaaS companies have their own data as well? Can they 393 00:17:53,000 --> 00:17:53,600 Speaker 4: build off of that? 394 00:17:53,840 --> 00:17:55,399 Speaker 6: Yeah, it's a great question. And we're here at one 395 00:17:55,440 --> 00:17:58,840 Speaker 6: of the world's biggest data companies, so very apt. Full 396 00:17:58,880 --> 00:18:03,560 Speaker 6: disclosure. Data is definitely something that gets more valuable in 397 00:18:03,640 --> 00:18:07,040 Speaker 6: this world. If you think about a stylized AI model, 398 00:18:07,400 --> 00:18:10,040 Speaker 6: it could have PhD level intelligence in a domain. But 399 00:18:10,119 --> 00:18:12,440 Speaker 6: if you hired a PhD in your company and sat 400 00:18:12,480 --> 00:18:15,360 Speaker 6: her down on her first day, she wouldn't be very useful, right? 401 00:18:15,440 --> 00:18:18,160 Speaker 6: She would have to understand how the organization functions, where 402 00:18:18,280 --> 00:18:20,760 Speaker 6: things live. Do I trust this chart or that chart? 403 00:18:20,760 --> 00:18:22,760 Speaker 6: I need access to the Google Drive, I need access 404 00:18:22,800 --> 00:18:24,639 Speaker 6: to Slack, I need to spend some time reading up. 405 00:18:24,720 --> 00:18:26,879 Speaker 6: And so we call this kind of thing context. 406 00:18:27,000 --> 00:18:27,200 Speaker 5: Right.
407 00:18:27,240 --> 00:18:29,879 Speaker 6: It's all the extra information that an AI needs to 408 00:18:30,040 --> 00:18:33,680 Speaker 6: get something done, no matter how intelligent it is. And 409 00:18:33,800 --> 00:18:35,520 Speaker 6: we wrote about this in the chart in the deck. 410 00:18:35,560 --> 00:18:38,959 Speaker 6: But there's a real question of who becomes that system 411 00:18:39,000 --> 00:18:41,520 Speaker 6: of context. And you're right, a lot of the software 412 00:18:41,560 --> 00:18:44,320 Speaker 6: companies do sit on a pool of very important data. 413 00:18:44,440 --> 00:18:46,320 Speaker 6: Let's talk about Salesforce, for example. 414 00:18:46,400 --> 00:18:48,400 Speaker 5: Right. CRM is where 415 00:18:48,240 --> 00:18:51,480 Speaker 6: you track the records of every customer. You have every 416 00:18:51,640 --> 00:18:55,160 Speaker 6: prospect in your pipeline, all of your historic interactions with them, 417 00:18:55,320 --> 00:18:58,080 Speaker 6: notes from sales reps on what's going on, the status 418 00:18:58,080 --> 00:19:00,919 Speaker 6: of their account, their customer support requests. It's an incredibly 419 00:19:01,000 --> 00:19:04,920 Speaker 6: complex piece of software for a large enterprise, and obviously 420 00:19:04,960 --> 00:19:07,879 Speaker 6: if you are an AI agent working within a company, 421 00:19:07,880 --> 00:19:09,760 Speaker 6: you would need access to that in order to get 422 00:19:09,760 --> 00:19:14,120 Speaker 6: almost anything done, right? But you need more than what is there. 423 00:19:14,600 --> 00:19:17,040 Speaker 6: You don't know what happened at the sales dinner last 424 00:19:17,119 --> 00:19:19,560 Speaker 6: night unless the rep took really detailed notes. And I 425 00:19:19,600 --> 00:19:21,760 Speaker 6: can tell you one common learning in software is they 426 00:19:21,800 --> 00:19:23,360 Speaker 6: do not take very detailed notes.
427 00:19:23,320 --> 00:19:25,680 Speaker 5: Especially after a sales party, right? Yeah, exactly. 428 00:19:25,760 --> 00:19:28,320 Speaker 6: People assume that software management teams know exactly what's 429 00:19:28,359 --> 00:19:30,880 Speaker 6: going on, but they're looking through really messy Salesforce data 430 00:19:31,040 --> 00:19:32,000 Speaker 6: and doing their very best. 431 00:19:32,359 --> 00:19:35,240 Speaker 4: Now I'm imagining a sales agent being like, the Cabernet 432 00:19:35,480 --> 00:19:38,399 Speaker 4: was exquisite at last night's party, just putting in all 433 00:19:38,400 --> 00:19:41,199 Speaker 4: these irrelevant, like, diary entries, exactly. 434 00:19:41,280 --> 00:19:44,360 Speaker 6: But a lot of that context does live in human brains. 435 00:19:44,440 --> 00:19:46,680 Speaker 6: You know, a sales rep meets a person at dinner, 436 00:19:46,720 --> 00:19:48,880 Speaker 6: gets to know their kids, figures out what sports team 437 00:19:48,880 --> 00:19:51,320 Speaker 6: they root for, and they're not automatically pumping all that 438 00:19:51,359 --> 00:19:54,560 Speaker 6: into the CRM. And so there's this race to collect 439 00:19:54,720 --> 00:19:57,240 Speaker 6: the information that an AI agent would need in order 440 00:19:57,240 --> 00:20:00,919 Speaker 6: to actually take proactive action, and the software companies have 441 00:20:01,000 --> 00:20:03,320 Speaker 6: a position there. But there's also this set of AI 442 00:20:03,400 --> 00:20:07,360 Speaker 6: native startups that are coming in building actual agents that 443 00:20:07,359 --> 00:20:09,480 Speaker 6: are doing their own work to collect that context. 444 00:20:09,480 --> 00:20:10,200 Speaker 5: And that's one of the.
445 00:20:10,200 --> 00:20:13,040 Speaker 6: Battles that we saw, you know, kind of highlighted in 446 00:20:13,080 --> 00:20:15,600 Speaker 6: our deck is whoever wins that has a chance to 447 00:20:15,640 --> 00:20:17,400 Speaker 6: be a really valuable company. 448 00:20:17,720 --> 00:20:19,240 Speaker 2: You know what I think about, and I think you 449 00:20:19,359 --> 00:20:21,080 Speaker 2: talk about this in your deck. But when I think 450 00:20:21,119 --> 00:20:25,000 Speaker 2: about software, I sort of have, like, a spectrum, 451 00:20:25,240 --> 00:20:28,600 Speaker 2: you know. I think about Salesforce dot com, which is 452 00:20:28,640 --> 00:20:31,560 Speaker 2: a platform, and there's third party developers that build on 453 00:20:31,600 --> 00:20:34,240 Speaker 2: top of Salesforce and they sort of offer anything and everything. 454 00:20:34,520 --> 00:20:37,399 Speaker 2: And then I think about something niche, like, this is 455 00:20:37,440 --> 00:20:41,679 Speaker 2: the company that makes point-of-sale software for dentists' offices, 456 00:20:42,000 --> 00:20:45,439 Speaker 2: and they went around giving them free payment terminals, 457 00:20:45,520 --> 00:20:48,520 Speaker 2: and they joined Y Combinator, and you know, they signed 458 00:20:48,560 --> 00:20:50,560 Speaker 2: up ten thousand dentist offices, and then those 459 00:20:50,680 --> 00:20:54,199 Speaker 2: offices pay them ten dollars a month forever for access 460 00:20:54,240 --> 00:20:56,280 Speaker 2: to that. You know what, I'm just making it up, 461 00:20:56,280 --> 00:20:58,600 Speaker 2: but things like that. Is there a side of the 462 00:20:58,640 --> 00:21:02,920 Speaker 2: spectrum that's more at risk here? Is that spectrum a legitimate 463 00:21:02,920 --> 00:21:06,320 Speaker 2: way to think about the industry, or are there threats 464 00:21:06,359 --> 00:21:07,960 Speaker 2: on sort of wherever you look?
465 00:21:08,480 --> 00:21:10,600 Speaker 6: Yeah, it's a great question. I mean, certainly in the 466 00:21:10,680 --> 00:21:13,680 Speaker 6: "world gets really weird" scenario, it's not clear there's anywhere 467 00:21:13,680 --> 00:21:16,800 Speaker 6: immune from threats, but it's important to think through what 468 00:21:16,840 --> 00:21:19,399 Speaker 6: it looks like. I think what's most at threat is 469 00:21:19,760 --> 00:21:25,480 Speaker 6: companies that serve enterprises with very customized software already, or 470 00:21:25,520 --> 00:21:28,520 Speaker 6: software that takes a very heavy implementation. And the reason 471 00:21:28,640 --> 00:21:32,000 Speaker 6: is, if anyone's going to take advantage of this wave 472 00:21:32,000 --> 00:21:35,600 Speaker 6: of technology to really, you know, advance and replace a 473 00:21:35,640 --> 00:21:38,400 Speaker 6: core system of software, it's going to be the enterprises 474 00:21:38,440 --> 00:21:41,600 Speaker 6: that have the resources and the customization needs. If you 475 00:21:41,600 --> 00:21:45,199 Speaker 6: think about SMBs, you know, my dad runs our family's 476 00:21:45,240 --> 00:21:47,199 Speaker 6: grocery store, it's been in the family for one hundred years, and 477 00:21:47,240 --> 00:21:49,359 Speaker 6: he just changed his point of sale for the first 478 00:21:49,359 --> 00:21:51,879 Speaker 6: time in a few decades, and it was a really 479 00:21:51,960 --> 00:21:52,960 Speaker 6: messy process. 480 00:21:53,080 --> 00:21:55,880 Speaker 5: It took a long time. Your dad? Come on, Odd Lots. Yeah, sure, 481 00:21:56,160 --> 00:21:57,679 Speaker 5: we love, we love to 482 00:21:58,480 --> 00:22:01,800 Speaker 6: hear all about independent grocery. But you know he's certainly 483 00:22:01,840 --> 00:22:04,320 Speaker 6: not going to sit down and vibe code himself a 484 00:22:04,359 --> 00:22:06,359 Speaker 6: point of sale system and put the store on it.
485 00:22:06,440 --> 00:22:08,359 Speaker 6: I can guarantee you that, nor will any dentist. 486 00:22:08,520 --> 00:22:08,680 Speaker 5: Right. 487 00:22:09,160 --> 00:22:12,760 Speaker 6: There's a chance that someone comes along with a cheaper version, 488 00:22:13,000 --> 00:22:16,520 Speaker 6: but you know, I think that's not something he's going 489 00:22:16,560 --> 00:22:18,440 Speaker 6: to switch to anytime soon. He's going to go through 490 00:22:18,440 --> 00:22:21,359 Speaker 6: that pain for another few decades to come, right? And 491 00:22:21,440 --> 00:22:24,800 Speaker 6: so it really is, you know, kind of company by company. 492 00:22:24,840 --> 00:22:27,560 Speaker 6: Like, I'm doing this exercise right now on X where 493 00:22:27,600 --> 00:22:29,480 Speaker 6: every day I look at a different software company and 494 00:22:29,560 --> 00:22:32,199 Speaker 6: just think hard about what will AI look like for 495 00:22:32,240 --> 00:22:34,720 Speaker 6: this company. And it's really interesting when you press. I'll 496 00:22:34,720 --> 00:22:37,680 Speaker 6: give an example, like DocuSign, which I think to most 497 00:22:37,720 --> 00:22:42,080 Speaker 6: investors would seem like an incredibly simple, easy piece of software. 498 00:22:42,160 --> 00:22:45,360 Speaker 6: Right, it's e-signature software. We've all experienced it. DocuSign has 499 00:22:45,400 --> 00:22:49,760 Speaker 6: more employees today than OpenAI and Anthropic combined, god, 500 00:22:50,080 --> 00:22:53,800 Speaker 6: which is a crazy stat and probably reflects that labor 501 00:22:53,880 --> 00:22:57,200 Speaker 6: is inefficiently allocated across the market. But when you actually 502 00:22:57,240 --> 00:23:00,200 Speaker 6: double click into what DocuSign does, there are ways in 503 00:23:00,240 --> 00:23:04,520 Speaker 6: which it's very complicated.
Right? Understanding the signature regulations in 504 00:23:04,560 --> 00:23:06,720 Speaker 6: every country around the world, what does it take for a 505 00:23:06,760 --> 00:23:09,399 Speaker 6: signature to be legally valid? Most of its signatures are 506 00:23:09,440 --> 00:23:11,840 Speaker 6: done via an API, so folks are integrating it into 507 00:23:11,880 --> 00:23:15,280 Speaker 6: their own applications. And there's a benefit to using DocuSign, 508 00:23:15,320 --> 00:23:17,400 Speaker 6: which is the brand. People have been giving away 509 00:23:17,400 --> 00:23:20,159 Speaker 6: free e-signature software for a very long time. But 510 00:23:20,200 --> 00:23:22,520 Speaker 6: if you're a company of a certain esteem, you want 511 00:23:22,560 --> 00:23:25,240 Speaker 6: to make sure your customers trust what they're signing, and 512 00:23:25,240 --> 00:23:27,360 Speaker 6: if they're getting a contract from you, you'd much rather 513 00:23:27,400 --> 00:23:31,000 Speaker 6: it say DocuSign than XYZ Sign that someone vibe coded, right? 514 00:23:31,320 --> 00:23:34,040 Speaker 6: And so I think it's really important to look at it 515 00:23:34,080 --> 00:23:38,000 Speaker 6: company by company. It's definitely a stock picker's market, where 516 00:23:38,160 --> 00:23:41,119 Speaker 6: there's some that are either relatively immune or have a 517 00:23:41,200 --> 00:23:42,960 Speaker 6: chance to benefit, and there's others that could be in 518 00:23:42,960 --> 00:23:43,480 Speaker 6: real trouble. 519 00:23:44,160 --> 00:23:47,159 Speaker 4: So is the argument, the bull case for software, or 520 00:23:47,200 --> 00:23:50,679 Speaker 4: at least the non-sudden-death case for software.
This 521 00:23:50,800 --> 00:23:53,560 Speaker 4: idea that, like, okay, if you have a software company 522 00:23:53,600 --> 00:23:57,000 Speaker 4: that's producing, I don't know, like DocuSign, you're able to 523 00:23:57,240 --> 00:24:00,159 Speaker 4: sign documents digitally and track them and share them and 524 00:24:00,200 --> 00:24:03,840 Speaker 4: all of that. You can build more quickly and more 525 00:24:03,920 --> 00:24:09,159 Speaker 4: efficiently off of that base model and provide new versions, 526 00:24:09,320 --> 00:24:14,680 Speaker 4: new customizations for customers. So I could do DocuSign for dentists, 527 00:24:14,760 --> 00:24:16,600 Speaker 4: just to stick with that example. I don't know what 528 00:24:16,640 --> 00:24:19,880 Speaker 4: specific needs dentists would have. I don't know, maybe marking 529 00:24:20,000 --> 00:24:22,520 Speaker 4: up, like, teeth or something. Yeah, and then I can 530 00:24:22,600 --> 00:24:26,840 Speaker 4: do, like, DocuSign for doctors and DocuSign for sales agents 531 00:24:26,920 --> 00:24:28,040 Speaker 4: or whatever and just keep going. 532 00:24:28,359 --> 00:24:28,560 Speaker 5: Yeah. 533 00:24:28,600 --> 00:24:30,399 Speaker 6: I think that's right. I actually kind of think of 534 00:24:30,400 --> 00:24:33,600 Speaker 6: it as there's three cases. There's the software gets wiped 535 00:24:33,640 --> 00:24:37,520 Speaker 6: out case, there's the not much happens to software case, 536 00:24:37,680 --> 00:24:39,920 Speaker 6: and then there's the bull case where the software companies 537 00:24:39,960 --> 00:24:42,040 Speaker 6: capture a lot of value. I think it's a little 538 00:24:42,080 --> 00:24:45,040 Speaker 6: different than them adding a lot of features and functionality. Frankly, 539 00:24:45,040 --> 00:24:47,399 Speaker 6: I think a lot of software products today are pretty mature.
540 00:24:47,840 --> 00:24:50,439 Speaker 6: They've had a thousand engineers working on them for ten years, 541 00:24:50,560 --> 00:24:52,800 Speaker 6: and they've built not all, but most of the things 542 00:24:52,840 --> 00:24:56,000 Speaker 6: that you'd want to build with today's technology. But with agents, 543 00:24:56,119 --> 00:24:59,160 Speaker 6: there's ways to automate a big chunk of the work. 544 00:24:59,320 --> 00:25:02,480 Speaker 6: So a software company that's done this very well is Intercom. 545 00:25:02,720 --> 00:25:05,960 Speaker 6: Intercom sells customer support software. It's those little widgets on 546 00:25:06,000 --> 00:25:07,800 Speaker 6: the bottom right hand corner of websites. They were the 547 00:25:07,840 --> 00:25:10,280 Speaker 6: creators of that. They had a nice business, but then 548 00:25:10,320 --> 00:25:13,400 Speaker 6: they got very aggressive about building out an AI product 549 00:25:13,440 --> 00:25:16,560 Speaker 6: called Fin, which answers customer support queries on its own. 550 00:25:17,080 --> 00:25:19,480 Speaker 6: And I think they've mentioned that it's almost one 551 00:25:19,560 --> 00:25:21,800 Speaker 6: hundred million of ARR now on a base that was, 552 00:25:22,000 --> 00:25:24,640 Speaker 6: you know, like three hundred million of ARR or something 553 00:25:24,680 --> 00:25:27,840 Speaker 6: like that. And so they've really reaccelerated their business by 554 00:25:27,880 --> 00:25:30,719 Speaker 6: building an AI native tool that actually solves the work, 555 00:25:31,040 --> 00:25:33,960 Speaker 6: not just a tool that kind of exists 556 00:25:33,960 --> 00:25:37,439 Speaker 6: as a tool for humans to use. And so yeah, I 557 00:25:37,440 --> 00:25:39,800 Speaker 6: think that's like the mega bull case, right.
I think about 558 00:25:39,800 --> 00:25:43,080 Speaker 6: it like almost a transition from brick-and-mortar retail 559 00:25:43,119 --> 00:25:45,600 Speaker 6: to e-commerce, where you have a brand new way 560 00:25:45,800 --> 00:25:48,200 Speaker 6: of doing business and you have a bunch of legacy 561 00:25:48,240 --> 00:25:52,000 Speaker 6: companies, and some of them will probably just exist as 562 00:25:52,040 --> 00:25:56,560 Speaker 6: they always have. Others can benefit from the change and 563 00:25:56,600 --> 00:25:58,840 Speaker 6: add new business lines. You look at Walmart's share price, 564 00:25:58,920 --> 00:26:01,639 Speaker 6: it's done amazingly well at incorporating e-commerce into its business. 565 00:26:02,080 --> 00:26:03,560 Speaker 6: And then there's going to be some that are like 566 00:26:03,640 --> 00:26:04,840 Speaker 6: Sears and go away. 567 00:26:05,160 --> 00:26:07,800 Speaker 4: That's funny. Sears always reminds me of my dad. He loves 568 00:26:07,800 --> 00:26:10,359 Speaker 4: Sears because, he always said, the parking lot was empty 569 00:26:10,760 --> 00:26:12,879 Speaker 4: when he went to the shopping mall, so he always 570 00:26:12,920 --> 00:26:16,320 Speaker 4: went through Sears anyway. So I understand, like, the cost 571 00:26:16,520 --> 00:26:19,080 Speaker 4: argument. It brings down the cost of code, maybe you 572 00:26:19,160 --> 00:26:22,680 Speaker 4: have fewer employees or whatever. But where does growth actually 573 00:26:22,720 --> 00:26:25,640 Speaker 4: come from in that world? How are you expanding your 574 00:26:25,680 --> 00:26:26,440 Speaker 4: customer base? 575 00:26:26,800 --> 00:26:29,520 Speaker 6: Yeah, you're really going to them and saying, we are 576 00:26:29,760 --> 00:26:34,520 Speaker 6: replacing human labor. And there's a different pricing paradigm. 577 00:26:34,600 --> 00:26:34,800 Speaker 7: Now.
578 00:26:34,920 --> 00:26:37,160 Speaker 6: You used to think of us as something you paid, 579 00:26:37,440 --> 00:26:40,080 Speaker 6: you know, twenty, thirty, forty, fifty dollars per seat, per 580 00:26:40,080 --> 00:26:43,280 Speaker 6: month, as a tool for your employees, almost as 581 00:26:43,280 --> 00:26:45,680 Speaker 6: if, you know, your employees are artisans and they're getting 582 00:26:45,680 --> 00:26:48,439 Speaker 6: a toolkit to work with. And now we're just selling 583 00:26:48,440 --> 00:26:51,960 Speaker 6: you an employee, or the results of an employee. So, 584 00:26:52,520 --> 00:26:55,520 Speaker 6: you know, we will sell you customer support tickets getting 585 00:26:55,520 --> 00:26:59,280 Speaker 6: closed out for fifty cents or a dollar per ticket, 586 00:26:59,400 --> 00:27:01,360 Speaker 6: and you can do the math of what it would cost 587 00:27:01,400 --> 00:27:02,840 Speaker 6: you for the human to do that, or what it 588 00:27:02,880 --> 00:27:05,359 Speaker 6: would cost you for AI to do that, and we'll 589 00:27:05,400 --> 00:27:08,120 Speaker 6: be cheaper. But we're also dramatically increasing what you pay 590 00:27:08,200 --> 00:27:11,320 Speaker 6: us, because, you know, we're cutting into a completely different stream. 591 00:27:11,359 --> 00:27:13,240 Speaker 6: And so that's what I think it looks like. We 592 00:27:13,240 --> 00:27:16,119 Speaker 6: see a lot of exciting examples in the startup space 593 00:27:16,280 --> 00:27:19,120 Speaker 6: of companies that are getting much much higher pricing. 594 00:27:19,680 --> 00:27:23,240 Speaker 2: This is a totally new pricing model for software. 595 00:27:23,240 --> 00:27:26,480 Speaker 2: We just recorded another episode and the guest teased that, 596 00:27:26,560 --> 00:27:29,440 Speaker 2: but talk to us about, like, results-based pricing, talk 597 00:27:29,480 --> 00:27:29,760 Speaker 2: to us. 598 00:27:29,960 --> 00:27:32,240 Speaker 6: Yeah, it's results-based pricing.
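The per-ticket comparison the guest describes is simple arithmetic; a sketch, where the AI price comes from the fifty-cents-to-a-dollar range quoted here, and the human cost per ticket and ticket volume are purely hypothetical assumptions:

```python
# Outcome-based pricing vs. human labor for closing support tickets.
# The human cost and ticket volume are assumed numbers for illustration.
human_cost_per_ticket = 5.00   # assumption: fully loaded human cost per ticket
ai_price_per_ticket = 1.00     # top of the quoted $0.50-$1.00 range

tickets_per_year = 100_000     # assumption: annual ticket volume
human_total = human_cost_per_ticket * tickets_per_year
ai_total = ai_price_per_ticket * tickets_per_year

# The buyer "does the math": same tickets closed, very different bill.
print(f"human: ${human_total:,.0f}  ai: ${ai_total:,.0f}  saved: ${human_total - ai_total:,.0f}")
```

Note that the AI vendor's revenue per customer here is far higher than a seat license would have been, even while the customer's total cost goes down; that is the "cutting into a different stream" point.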
There's a lot of questions 599 00:27:32,280 --> 00:27:35,879 Speaker 6: on how it'll ultimately shake out. Fundamentally, what these companies 600 00:27:35,880 --> 00:27:38,760 Speaker 6: are doing is they are reselling intelligence. 601 00:27:39,000 --> 00:27:40,080 Speaker 5: Right, the core model. 602 00:27:40,160 --> 00:27:43,720 Speaker 6: Vendors, OpenAI, Anthropic, Google, have created a way 603 00:27:43,760 --> 00:27:47,280 Speaker 6: to get elastic intelligence. And if you have the right 604 00:27:47,359 --> 00:27:49,520 Speaker 6: data and you can put the right harness around it, 605 00:27:50,160 --> 00:27:52,479 Speaker 6: you can now sell that to your customers. What's an 606 00:27:52,480 --> 00:27:55,879 Speaker 6: open question is how do you price that relative to 607 00:27:55,920 --> 00:27:58,400 Speaker 6: the intelligence. So I was talking to someone this morning 608 00:27:58,400 --> 00:28:01,159 Speaker 6: who said, I think fifty percent gross margins on intelligence 609 00:28:01,200 --> 00:28:03,479 Speaker 6: are about right. But we see a lot of variance 610 00:28:03,480 --> 00:28:05,240 Speaker 6: in how startups are doing it today. Some are getting 611 00:28:05,280 --> 00:28:07,600 Speaker 6: eighty percent gross margins on top of the model vendors. 612 00:28:07,880 --> 00:28:11,399 Speaker 6: Others are getting twenty percent. But what's absolutely true in 613 00:28:11,440 --> 00:28:14,320 Speaker 6: any case is if you're able to do that, you 614 00:28:14,359 --> 00:28:17,480 Speaker 6: get much much higher pricing in total dollars than you 615 00:28:17,520 --> 00:28:19,880 Speaker 6: did before, orders of magnitude in some cases.
616 00:28:19,960 --> 00:28:22,919 Speaker 4: But just to be clear, like, the cost savings, it 617 00:28:22,960 --> 00:28:25,880 Speaker 4: can't be priced so high that the company that's using 618 00:28:25,960 --> 00:28:30,520 Speaker 4: the software to produce these outcomes, like, isn't saving money, right? 619 00:28:30,560 --> 00:28:32,120 Speaker 4: That's the balancing act. 620 00:28:32,000 --> 00:28:33,800 Speaker 6: One hundred percent. But we talked about 621 00:28:33,840 --> 00:28:35,160 Speaker 6: this a little bit earlier. Think 622 00:28:35,160 --> 00:28:39,560 Speaker 6: about where software pricing was already, right? You know, think 623 00:28:39,560 --> 00:28:42,520 Speaker 6: about Salesforce. You know, at the elite tier, you know, 624 00:28:42,840 --> 00:28:45,320 Speaker 6: eighty, ninety, one hundred dollars per user per month. So 625 00:28:45,440 --> 00:28:48,240 Speaker 6: for round numbers, say one thousand dollars per user per year, 626 00:28:48,760 --> 00:28:50,959 Speaker 6: for sales reps who could be making on average two 627 00:28:51,000 --> 00:28:53,720 Speaker 6: hundred and fifty to three hundred thousand dollars per year. If 628 00:28:53,760 --> 00:28:55,760 Speaker 6: you have a technology that can come in and replace 629 00:28:55,840 --> 00:28:59,280 Speaker 6: a sales rep, you can charge fifty thousand dollars, still 630 00:28:59,280 --> 00:29:02,040 Speaker 6: give the customer a five x ROI, and then 631 00:29:02,200 --> 00:29:05,400 Speaker 6: you have effectively fifty x'd your take rate on that revenue. 632 00:29:05,400 --> 00:29:08,040 Speaker 6: And so that's the exciting opportunity. That's what has 633 00:29:08,080 --> 00:29:10,360 Speaker 6: people excited in startup land for sure.
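The seat-versus-outcome arithmetic in that example can be worked through explicitly, using the numbers as given in the conversation and the low end of the quoted salary range:

```python
# Seat pricing vs. outcome pricing for a sales rep's work,
# with the round numbers from the conversation.
seat_price_per_year = 1_000   # roughly $80-100 per user per month, rounded
outcome_price = 50_000        # charge for replacing the rep's output
rep_salary = 250_000          # low end of the quoted $250k-$300k range

customer_roi = rep_salary / outcome_price               # savings per dollar spent
revenue_multiple = outcome_price / seat_price_per_year  # vs. old seat revenue

print(customer_roi, revenue_multiple)  # 5.0 50.0, the "5x ROI, 50x take rate"
```

Both sides of the trade work out at once: the buyer keeps a five-to-one return, while the vendor's revenue per rep is fifty times what the seat license was, which is the whole appeal of outcome pricing.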
If you talk 634 00:29:10,400 --> 00:29:13,120 Speaker 6: to folks from Silicon Valley, they are foaming at the 635 00:29:13,160 --> 00:29:16,480 Speaker 6: mouth about the opportunity to really expand tech spend in 636 00:29:16,520 --> 00:29:19,080 Speaker 6: this way. And that's also the opportunity for the software 637 00:29:19,080 --> 00:29:19,920 Speaker 6: companies that get it right. 638 00:29:20,120 --> 00:29:22,560 Speaker 2: There must be another risk too, which is that if 639 00:29:22,560 --> 00:29:25,520 Speaker 2: you can sort of resell intelligence at, say, an eighty 640 00:29:25,560 --> 00:29:30,239 Speaker 2: percent gross margin, then for the model makers themselves, they're like, well, 641 00:29:30,240 --> 00:29:32,000 Speaker 2: why do we, and this is gonna 642 00:29:32,000 --> 00:29:34,320 Speaker 2: sound weird, why do we want to be the dumb intelligence? 643 00:29:34,440 --> 00:29:34,640 Speaker 5: Right? 644 00:29:34,680 --> 00:29:36,240 Speaker 2: And that's sort of like, 645 00:29:36,280 --> 00:29:37,600 Speaker 2: we don't want to be the dumb pipe. We 646 00:29:37,640 --> 00:29:41,360 Speaker 2: saw that, like, in the cloud era, right? The Azures 647 00:29:41,400 --> 00:29:44,480 Speaker 2: and the Google Clouds and the Amazons, they didn't 648 00:29:44,480 --> 00:29:47,400 Speaker 2: want to just be commodity cloud and they started building, 649 00:29:47,480 --> 00:29:51,080 Speaker 2: like, medical features. They wanted to differentiate themselves. So it 650 00:29:51,160 --> 00:29:54,080 Speaker 2: must be a risk for the companies reselling intelligence that 651 00:29:54,160 --> 00:29:56,840 Speaker 2: it's so lucrative. And then, like, how are you thinking 652 00:29:56,880 --> 00:29:59,920 Speaker 2: about the core model makers themselves?
Yeah, and how they're 653 00:30:00,080 --> 00:30:03,800 Speaker 2: thinking about expanding into some of these fields rather than 654 00:30:03,920 --> 00:30:05,600 Speaker 2: just piping in intelligence for them. 655 00:30:05,880 --> 00:30:07,960 Speaker 6: Well, look, in any situation, they're gonna have to 656 00:30:08,000 --> 00:30:11,080 Speaker 6: make decisions, right? So when Amazon, you know, built AWS, 657 00:30:11,480 --> 00:30:13,680 Speaker 6: they had to decide, where are we going to press 658 00:30:13,880 --> 00:30:15,640 Speaker 6: and where are we not? Are we going to sell 659 00:30:15,720 --> 00:30:17,760 Speaker 6: database software or are we going to let other vendors 660 00:30:17,840 --> 00:30:19,360 Speaker 6: do that on top of us? And they kind of 661 00:30:19,400 --> 00:30:22,080 Speaker 6: made those decisions as they went. What's really interesting 662 00:30:22,120 --> 00:30:24,560 Speaker 6: is if you look at the foundation model vendors, they 663 00:30:24,560 --> 00:30:27,920 Speaker 6: have been racing towards the application layer. Both Claude Code 664 00:30:27,960 --> 00:30:31,600 Speaker 6: and Cowork and OpenAI Codex are applications that people 665 00:30:31,760 --> 00:30:35,640 Speaker 6: download and use, right? And I think that reflects this 666 00:30:35,800 --> 00:30:39,480 Speaker 6: understanding that there is value in getting the users used 667 00:30:39,520 --> 00:30:43,479 Speaker 6: to using your application. Otherwise, you know, you 668 00:30:43,600 --> 00:30:46,160 Speaker 6: risk being an API that's commoditized. People switch back 669 00:30:46,160 --> 00:30:48,560 Speaker 6: and forth between you, and that kind of application vendor 670 00:30:48,840 --> 00:30:49,560 Speaker 6: has that control. 671 00:31:05,640 --> 00:31:08,800 Speaker 4: So one of the advantages that software has is, like, 672 00:31:09,360 --> 00:31:14,880 Speaker 4: this network effect, comfort, software as a security blanket for management.
Right, 673 00:31:15,240 --> 00:31:18,480 Speaker 4: but at the same time, people are getting really comfortable 674 00:31:18,760 --> 00:31:22,600 Speaker 4: with AI, like, telling them everything. And I keep thinking, 675 00:31:22,680 --> 00:31:26,200 Speaker 4: like, if part of the sales pitch for software is 676 00:31:26,280 --> 00:31:30,640 Speaker 4: like this sense of comfort, but then AI is rapidly 677 00:31:30,720 --> 00:31:35,240 Speaker 4: becoming the thing that you talk to for everything, does 678 00:31:35,240 --> 00:31:38,000 Speaker 4: it eventually just become a portal for doing all these 679 00:31:38,360 --> 00:31:39,120 Speaker 4: different things? 680 00:31:39,880 --> 00:31:42,480 Speaker 6: It's a really interesting question, and this is where there's 681 00:31:42,520 --> 00:31:46,840 Speaker 6: probably the biggest disparity between how enterprise buyers think and 682 00:31:46,880 --> 00:31:49,440 Speaker 6: how humans think. Right? I'm sure you guys have seen 683 00:31:49,600 --> 00:31:52,600 Speaker 6: Clawdbot and the kind of rise of these, you know, 684 00:31:52,720 --> 00:31:55,720 Speaker 6: open source agents that people are deploying for themselves, giving 685 00:31:55,720 --> 00:31:57,920 Speaker 6: it access to everything, their whole computer, et cetera. 686 00:31:58,080 --> 00:31:58,560 Speaker 1: That's Joe. 687 00:31:58,840 --> 00:32:01,760 Speaker 3: Yeah, no, I didn't install, I didn't install Clawdbot. 688 00:32:01,880 --> 00:32:02,640 Speaker 4: Oh, you didn't? No? 689 00:32:02,680 --> 00:32:04,640 Speaker 5: I'm getting mine set up. I keep mine with a hammer 690 00:32:04,680 --> 00:32:05,040 Speaker 5: next to it. 691 00:32:05,080 --> 00:32:06,240 Speaker 4: Well, I'm really curious why not? 692 00:32:06,800 --> 00:32:07,760 Speaker 6: Because of this issue? 693 00:32:08,040 --> 00:32:11,040 Speaker 2: Yeah, because of that, and it just seemed like a 694 00:32:11,080 --> 00:32:12,640 Speaker 2: potential waste of tokens and stuff.
695 00:32:12,680 --> 00:32:14,560 Speaker 6: And yeah, and then it turned out that for 696 00:32:14,600 --> 00:32:17,200 Speaker 6: a while on Moltbook, which was the social media for 697 00:32:17,440 --> 00:32:20,040 Speaker 6: all of the AI agents, all the APIs were available in a 698 00:32:20,080 --> 00:32:22,240 Speaker 6: public facing database that one could go read, and so 699 00:32:22,280 --> 00:32:24,240 Speaker 6: it was like a completely open system. 700 00:32:23,960 --> 00:32:24,760 Speaker 5: That had to get fixed. 701 00:32:24,760 --> 00:32:29,000 Speaker 6: And so, you know, enterprises really do worry about this stuff, 702 00:32:29,000 --> 00:32:30,920 Speaker 6: and they worry about it for a good reason. There's 703 00:32:31,000 --> 00:32:33,000 Speaker 6: another really interesting example. So there's a bunch of 704 00:32:33,040 --> 00:32:36,400 Speaker 6: startups that help you record Zoom calls and transcribe them. 705 00:32:36,600 --> 00:32:40,040 Speaker 6: All of those Zoom calls then become legally discoverable because 706 00:32:40,080 --> 00:32:42,720 Speaker 6: they're transcribed somewhere, and so you have VCs in Silicon 707 00:32:42,800 --> 00:32:44,560 Speaker 6: Valley who will refuse to use them, and you have 708 00:32:44,600 --> 00:32:46,680 Speaker 6: other firms that are all in and recording everything that 709 00:32:46,720 --> 00:32:49,400 Speaker 6: happens across the board so that they can upload that 710 00:32:49,440 --> 00:32:52,360 Speaker 6: into AI as context. I think it's 711 00:32:52,360 --> 00:32:54,320 Speaker 6: a really great point, you know. And one of the 712 00:32:54,320 --> 00:32:57,160 Speaker 6: things that makes me wonder is whether companies that are willing 713 00:32:57,200 --> 00:33:00,320 Speaker 6: to skirt the rules or, you know, play fast and loose, 714 00:33:00,400 --> 00:33:02,640 Speaker 6: will be moving much faster over the next two or 715 00:33:02,680 --> 00:33:05,200 Speaker 6: three years.
And one of the reasons big incumbents struggle 716 00:33:05,560 --> 00:33:08,000 Speaker 6: is because they actually do have to care about this stuff. 717 00:33:08,000 --> 00:33:09,320 Speaker 5: They have stuff to protect, they 718 00:33:09,280 --> 00:33:12,640 Speaker 6: don't want to be sued. They can't handle a major breach, 719 00:33:12,920 --> 00:33:15,320 Speaker 6: and startups are able to just move faster given that. 720 00:33:16,280 --> 00:33:20,000 Speaker 2: So every time software stocks sell off like this, 721 00:33:20,040 --> 00:33:22,400 Speaker 2: people go bargain hunting, and they ask 722 00:33:22,440 --> 00:33:25,240 Speaker 2: what's cheap and what baby is being thrown out with 723 00:33:25,280 --> 00:33:25,880 Speaker 2: the bath water? 724 00:33:26,000 --> 00:33:27,320 Speaker 3: Someone, always a bunch 725 00:33:27,160 --> 00:33:29,400 Speaker 2: of people, is like, yes, they look cheap, but have 726 00:33:29,520 --> 00:33:32,960 Speaker 2: you considered stock-based compensation? And it turns out 727 00:33:33,040 --> 00:33:35,400 Speaker 2: that these companies are not nearly as profitable once you 728 00:33:35,480 --> 00:33:39,160 Speaker 2: factor this in. There was a very interesting note from Barclays, I 729 00:33:39,160 --> 00:33:41,720 Speaker 2: think it was, I think it was Barclays. This is 730 00:33:41,800 --> 00:33:44,800 Speaker 2: very interesting, and it said our European investors are always 731 00:33:44,840 --> 00:33:48,120 Speaker 2: asking about SBC. Our American investors only ask when there's 732 00:33:48,160 --> 00:33:48,680 Speaker 2: a crisis. 733 00:33:49,000 --> 00:33:50,320 Speaker 3: I think that tells you something about 734 00:33:50,120 --> 00:33:52,200 Speaker 2: the difference between Europeans and Americans. I thought that was 735 00:33:52,240 --> 00:33:56,280 Speaker 2: a fascinating sociological observation.
Tell us, like, how should we 736 00:33:56,360 --> 00:33:59,880 Speaker 2: think about the costs, because again, if code generation is 737 00:33:59,880 --> 00:34:03,400 Speaker 2: part of the cost base, presumably these software companies don't need as 738 00:34:03,440 --> 00:34:05,520 Speaker 2: many employees either, and they could pare back on this. 739 00:34:05,600 --> 00:34:07,320 Speaker 2: So talk to us about how we should think about the 740 00:34:07,360 --> 00:34:09,480 Speaker 2: costs inside the software company. 741 00:34:09,520 --> 00:34:12,719 Speaker 6: Great question, and yeah, I mean it's certainly theoretically true, right, 742 00:34:12,760 --> 00:34:17,239 Speaker 6: but aside from Elon cutting eighty percent of Twitter/X's headcount, 743 00:34:17,520 --> 00:34:20,440 Speaker 6: we really haven't seen any companies take the pill and 744 00:34:20,520 --> 00:34:23,560 Speaker 6: kind of realize the benefits of that. The SBC debate 745 00:34:23,800 --> 00:34:26,719 Speaker 6: has been going on for a long time. I've had 746 00:34:26,719 --> 00:34:29,080 Speaker 6: it ad nauseam over the course of my career. It's 747 00:34:29,080 --> 00:34:32,320 Speaker 6: a real expense. You're issuing your employees stock. They value 748 00:34:32,320 --> 00:34:34,640 Speaker 6: it like cash. Many of them auto-sell it the 749 00:34:34,719 --> 00:34:37,960 Speaker 6: day it vests for them. And I think the 750 00:34:38,080 --> 00:34:41,480 Speaker 6: problem that it creates for software companies is the 751 00:34:41,520 --> 00:34:44,560 Speaker 6: management teams are addicted to reporting non-GAAP, which excludes 752 00:34:44,600 --> 00:34:48,320 Speaker 6: the impact of SBC.
And so if you are an 753 00:34:48,480 --> 00:34:51,959 Speaker 6: entrepreneur who founded a software business, who's technical, hasn't really 754 00:34:51,960 --> 00:34:54,200 Speaker 6: ever cared that much about the financial side, you're a 755 00:34:54,280 --> 00:34:56,799 Speaker 6: product person, you may think that you've been doing a 756 00:34:56,840 --> 00:34:59,920 Speaker 6: good job of being a profitable company because your CFO is 757 00:35:00,120 --> 00:35:02,000 Speaker 6: telling you, well, we're at a twenty five percent non 758 00:35:02,000 --> 00:35:05,279 Speaker 6: GAAP operating margin. That's pretty good, when the reality is 759 00:35:05,360 --> 00:35:07,879 Speaker 6: you're running break even, which is a very common state 760 00:35:07,920 --> 00:35:10,120 Speaker 6: of affairs. You know, we looked at the whole universe, 761 00:35:10,160 --> 00:35:12,680 Speaker 6: and the median public software company has a five 762 00:35:12,719 --> 00:35:15,320 Speaker 6: percent GAAP net income margin, 763 00:35:15,680 --> 00:35:18,360 Speaker 6: which is not enough to value the companies on. And 764 00:35:18,400 --> 00:35:21,839 Speaker 6: so it creates this dynamic where, you know, yes, there's 765 00:35:21,840 --> 00:35:24,160 Speaker 6: a terminal value concern, which is by far the most important thing, 766 00:35:24,160 --> 00:35:25,560 Speaker 6: but there's also no floor. 767 00:35:25,800 --> 00:35:25,960 Speaker 5: Right. 768 00:35:26,000 --> 00:35:28,480 Speaker 6: I was looking at the earnings report from Freshworks, 769 00:35:28,480 --> 00:35:31,560 Speaker 6: which is a mid-market seller of customer support and 770 00:35:31,640 --> 00:35:34,479 Speaker 6: IT management software. It trades at one and a half 771 00:35:34,520 --> 00:35:37,759 Speaker 6: times EV to sales. If it ran at even a 772 00:35:37,800 --> 00:35:41,960 Speaker 6: ten percent GAAP margin, it'd be trading at fifteen times earnings.
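The margin and multiple arithmetic in this stretch is simple enough to sketch in code. This is a back-of-the-envelope illustration of the reasoning only, not figures from the episode beyond the ones quoted; the SBC share of revenue in particular is a hypothetical, not a reported Freshworks number:

```python
# Two pieces of arithmetic from the discussion above, sketched out.
# Inputs: 25% non-GAAP margin, 1.5x EV/sales, 10% GAAP margin (quoted);
# the SBC-as-25%-of-revenue figure is purely illustrative.

def gaap_operating_margin(non_gaap_margin: float, sbc_pct_of_revenue: float) -> float:
    """GAAP margin once stock-based comp is expensed against revenue."""
    return non_gaap_margin - sbc_pct_of_revenue

def implied_ev_to_earnings(ev_to_sales: float, net_margin: float) -> float:
    """Convert an EV/sales multiple into EV/earnings at an assumed net margin."""
    return ev_to_sales / net_margin

# A 25% non-GAAP margin with SBC running at a hypothetical 25% of revenue
# is break even on a GAAP basis:
print(gaap_operating_margin(0.25, 0.25))   # → 0.0

# Freshworks at 1.5x EV/sales, at a hypothetical 10% GAAP margin:
print(implied_ev_to_earnings(1.5, 0.10))   # ≈ 15
```

The same division explains why a five percent median GAAP margin is "not enough to value the companies on": at 1.5x sales, it implies roughly 30x earnings rather than 15x.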
773 00:35:42,000 --> 00:35:44,600 Speaker 6: You know, that's a pretty attractive place to be. 774 00:35:44,680 --> 00:35:47,480 Speaker 6: You could get some value investors, maybe some European investors, 775 00:35:47,880 --> 00:35:51,320 Speaker 6: interested in buying it there, but it doesn't have material 776 00:35:51,360 --> 00:35:54,440 Speaker 6: GAAP earnings. And on their earnings call, there was no 777 00:35:54,640 --> 00:35:58,200 Speaker 6: real, you know, sense of trajectory towards that, and you 778 00:35:58,239 --> 00:36:01,040 Speaker 6: see the share price down sixteen percent, exactly, and like, 779 00:36:01,280 --> 00:36:03,360 Speaker 6: the top line results were actually pretty good, and so 780 00:36:04,040 --> 00:36:06,800 Speaker 6: there's a real issue here on the financial side as well. 781 00:36:07,680 --> 00:36:11,520 Speaker 6: It's incredibly disappointing to me that management teams haven't embraced 782 00:36:11,520 --> 00:36:14,920 Speaker 6: this as a way to cut costs themselves, and I 783 00:36:15,000 --> 00:36:15,840 Speaker 6: expect they will. 784 00:36:16,000 --> 00:36:19,640 Speaker 2: Yeah, talk to us about this specifically. Are we going 785 00:36:19,640 --> 00:36:23,000 Speaker 2: to see big layoffs across the SaaS space in the 786 00:36:23,040 --> 00:36:24,960 Speaker 2: near term, and what do you think is the timeframe 787 00:36:24,960 --> 00:36:26,000 Speaker 2: for that? 788 00:36:26,160 --> 00:36:29,160 Speaker 6: Great question. I think we will. I think we've seen that management 789 00:36:29,160 --> 00:36:32,840 Speaker 6: teams do respond to price signals.
If you look at 790 00:36:33,080 --> 00:36:35,480 Speaker 6: the history of the sector, it was in twenty twenty 791 00:36:35,520 --> 00:36:37,960 Speaker 6: three when there was a round of layoffs and companies 792 00:36:38,000 --> 00:36:40,360 Speaker 6: showed margin, and then they've kind of resisted it for 793 00:36:40,400 --> 00:36:43,640 Speaker 6: the last two years. The thing about it is layoffs 794 00:36:43,640 --> 00:36:46,319 Speaker 6: can help you move faster, right. I think if you 795 00:36:46,400 --> 00:36:50,160 Speaker 6: look within any company today, unfortunately, there is this spectrum 796 00:36:50,200 --> 00:36:53,480 Speaker 6: of employees in how fast they've adopted AI, whether they're 797 00:36:53,480 --> 00:36:56,000 Speaker 6: still doing things the old way or they're on Claude Code, 798 00:36:56,200 --> 00:36:58,759 Speaker 6: Claude Cowork, kind of changing the way that they work. 799 00:36:59,160 --> 00:37:02,320 Speaker 6: And employees who are on the lower end are actually 800 00:37:02,360 --> 00:37:05,040 Speaker 6: slowing you down as a company. They're not even zero 801 00:37:05,120 --> 00:37:08,319 Speaker 6: marginal product, they're negative marginal product. There's just been such 802 00:37:08,320 --> 00:37:10,800 Speaker 6: a change in how you work, especially in software development, 803 00:37:11,200 --> 00:37:14,120 Speaker 6: and so I think management teams are going to realize 804 00:37:14,120 --> 00:37:18,440 Speaker 6: that there's two benefits to actually doing layoffs, in addition 805 00:37:18,440 --> 00:37:20,120 Speaker 6: to the obvious pain of it and the kind of 806 00:37:20,200 --> 00:37:24,000 Speaker 6: human costs, which I never want to forget to discuss.
But one 807 00:37:24,040 --> 00:37:27,480 Speaker 6: is saving money and showing your shareholders you're financially disciplined 808 00:37:27,520 --> 00:37:30,279 Speaker 6: and probably seeing your share price stabilize, especially if you're 809 00:37:30,280 --> 00:37:32,920 Speaker 6: trading at some very low multiple. And the second is 810 00:37:33,560 --> 00:37:36,520 Speaker 6: moving faster and also, almost as importantly, being able to 811 00:37:36,560 --> 00:37:39,440 Speaker 6: pay your top performing employees. The war for talent in 812 00:37:39,480 --> 00:37:42,360 Speaker 6: Silicon Valley has never been more intense than right now. I was 813 00:37:42,400 --> 00:37:45,359 Speaker 6: talking to a private company we invested in, and they're 814 00:37:45,400 --> 00:37:48,000 Speaker 6: losing employees left and right to these high growth AI 815 00:37:48,080 --> 00:37:51,680 Speaker 6: companies who can afford to pay huge comp packages in both 816 00:37:51,680 --> 00:37:54,720 Speaker 6: equity and cash, and you want to keep your good people. 817 00:37:54,960 --> 00:37:55,440 Speaker 5: You don't want it. 818 00:37:55,520 --> 00:37:57,839 Speaker 6: You don't want these AI companies to pluck away all 819 00:37:57,920 --> 00:37:59,960 Speaker 6: the best people and leave you with the folks who are 820 00:38:00,080 --> 00:38:02,160 Speaker 6: relative Luddites. And so I do think we'll see this. 821 00:38:02,360 --> 00:38:05,160 Speaker 6: It's very sad that that will have to happen, but 822 00:38:06,040 --> 00:38:08,319 Speaker 6: it's the obvious path forward for the sector, and I 823 00:38:08,360 --> 00:38:10,360 Speaker 6: think if done right, it accelerates innovation.
824 00:38:10,640 --> 00:38:13,680 Speaker 4: I have a tangential question on that note, which is 825 00:38:13,960 --> 00:38:17,560 Speaker 4: whenever we talk about technological disruption, you know, people bring 826 00:38:17,560 --> 00:38:22,040 Speaker 4: out examples of, like, remember before Excel, when basically actual 827 00:38:22,120 --> 00:38:25,520 Speaker 4: people sat down with, like, papers in front of them 828 00:38:25,880 --> 00:38:29,720 Speaker 4: doing the math? And those people didn't disappear when Excel 829 00:38:29,800 --> 00:38:34,280 Speaker 4: got created, but they started doing new things. I imagine 830 00:38:34,320 --> 00:38:36,319 Speaker 4: a lot of people are very interested right now in 831 00:38:36,600 --> 00:38:42,360 Speaker 4: alternative careers for basic commoditized coders. What do those actually 832 00:38:42,400 --> 00:38:45,080 Speaker 4: look like? Yeah, I feel like you might have some 833 00:38:45,160 --> 00:38:46,800 Speaker 4: insight here on the alternatives. 834 00:38:46,800 --> 00:38:48,880 Speaker 6: Well, so I think there's two ways to answer the question, right? 835 00:38:48,880 --> 00:38:50,120 Speaker 6: There's, like, what do you do if you want to 836 00:38:50,160 --> 00:38:52,560 Speaker 6: stay a coder? And then there's what are the careers 837 00:38:52,600 --> 00:38:54,239 Speaker 6: that are going to still exist over time? 838 00:38:54,320 --> 00:38:54,520 Speaker 5: Right? 839 00:38:55,040 --> 00:38:57,120 Speaker 6: I think if you think about what's happening to coding, 840 00:38:57,880 --> 00:39:00,520 Speaker 6: it reminds me a lot of civil engineering, and so 841 00:39:00,840 --> 00:39:02,760 Speaker 6: it's kind of a funky example, but you know, civil 842 00:39:02,760 --> 00:39:05,319 Speaker 6: engineers used to work pen and paper doing calculus, will 843 00:39:05,360 --> 00:39:08,799 Speaker 6: this bridge hold up or not? That's been obsolete for 844 00:39:08,840 --> 00:39:11,080 Speaker 6: a very long time.
All those calculations are done by 845 00:39:11,080 --> 00:39:13,480 Speaker 6: a computer. They're kind of clicking and moving, and they 846 00:39:13,520 --> 00:39:16,359 Speaker 6: go on site, they collect some data, they talk to stakeholders, 847 00:39:16,600 --> 00:39:19,480 Speaker 6: and they're effectively project managing this computer that can do 848 00:39:19,520 --> 00:39:22,080 Speaker 6: the physics part of their job for them. It's important 849 00:39:22,080 --> 00:39:24,840 Speaker 6: that they understand the physics in case something looks strange, 850 00:39:25,080 --> 00:39:28,000 Speaker 6: but they're not doing much physics, right? That's clearly where 851 00:39:28,040 --> 00:39:30,360 Speaker 6: software engineering is heading in the near term, and at a 852 00:39:30,400 --> 00:39:32,719 Speaker 6: lot of companies it's already there. And these companies are 853 00:39:32,760 --> 00:39:35,759 Speaker 6: still hiring software engineers because that job is valuable, and 854 00:39:35,800 --> 00:39:38,640 Speaker 6: in fact, each individual software engineer is way more productive 855 00:39:38,920 --> 00:39:42,440 Speaker 6: than they were before. And there's happily elastic demand for software, 856 00:39:42,480 --> 00:39:44,879 Speaker 6: like we still are undersupplied with software in the world, 857 00:39:45,160 --> 00:39:46,560 Speaker 6: and so there's quite a bit of room to go 858 00:39:46,600 --> 00:39:49,560 Speaker 6: to add those, and so I'm not necessarily bearish on 859 00:39:50,440 --> 00:39:53,360 Speaker 6: the demand for software engineers, at least for the next, 860 00:39:53,520 --> 00:39:55,680 Speaker 6: you know, three to five years. Beyond that, if things 861 00:39:55,680 --> 00:39:58,520 Speaker 6: get weird, hard to tell.
But then for people more broadly, 862 00:39:58,680 --> 00:40:02,759 Speaker 6: I think the best advice is just adaptability, you know, 863 00:40:03,000 --> 00:40:06,319 Speaker 6: constantly trying and testing these tools, making sure you're staying 864 00:40:06,320 --> 00:40:08,720 Speaker 6: at the cutting edge of them, and then being aware 865 00:40:08,760 --> 00:40:11,600 Speaker 6: of what's human, right? I think in, like, in my 866 00:40:11,719 --> 00:40:14,200 Speaker 6: work in venture investing, you know, there's a lot of 867 00:40:14,239 --> 00:40:16,600 Speaker 6: data that comes out of human relationships that an AI 868 00:40:16,640 --> 00:40:18,799 Speaker 6: wouldn't have access to. You know, an AI can't call 869 00:40:18,880 --> 00:40:21,439 Speaker 6: its friend at another fund and ask how a company is doing. 870 00:40:21,760 --> 00:40:24,160 Speaker 6: Not yet, at least. They have to make some friends first. 871 00:40:24,239 --> 00:40:26,480 Speaker 3: Right, they're talking, they are talking to each other 872 00:40:26,520 --> 00:40:27,000 Speaker 3: on Moltbook. 873 00:40:27,080 --> 00:40:28,759 Speaker 6: Right, they're talking on their Moltbook. Yeah, so maybe 874 00:40:28,760 --> 00:40:30,120 Speaker 6: if there's an AI agent from Sequoia. 875 00:40:31,480 --> 00:40:33,080 Speaker 3: I was intrigued by that. 876 00:40:33,280 --> 00:40:35,759 Speaker 5: Yeah, it was very evocative, but pretty fake. 877 00:40:35,840 --> 00:40:38,040 Speaker 4: Well, also there was that Wired article of the guy 878 00:40:38,080 --> 00:40:41,480 Speaker 4: who like infiltrated as a bot and pretended to 879 00:40:41,440 --> 00:40:44,880 Speaker 2: be. They're like, oh, why are we, let's create a 880 00:40:44,920 --> 00:40:46,759 Speaker 2: new language just for us. Like, they're not making new
881 00:40:46,719 --> 00:40:49,120 Speaker 6: languages, right. But I think, I think the rough mental 882 00:40:49,120 --> 00:40:51,600 Speaker 6: model is if there was any effort to outsource your 883 00:40:51,680 --> 00:40:54,759 Speaker 6: job to India, that's at risk, because that tells you that 884 00:40:54,840 --> 00:40:57,320 Speaker 6: job can be done by a person, someone who's not physically 885 00:40:57,360 --> 00:41:00,960 Speaker 6: present in a space. And you know, if you like 886 00:41:01,040 --> 00:41:04,920 Speaker 6: working on problems in isolation, not socially with other people, 887 00:41:05,239 --> 00:41:08,120 Speaker 6: you know, grinding out math problems or little coding assignments, 888 00:41:08,520 --> 00:41:11,160 Speaker 6: that's probably also a pretty tough place to be. It's 889 00:41:11,200 --> 00:41:12,239 Speaker 6: going to be a more social world. 890 00:41:12,440 --> 00:41:14,960 Speaker 4: This is something we've touched on before, which makes me 891 00:41:15,080 --> 00:41:18,440 Speaker 4: kind of sad, which is the edge in the AI 892 00:41:18,560 --> 00:41:23,080 Speaker 4: world becomes, like, sociability, right? And to some extent we 893 00:41:23,160 --> 00:41:25,440 Speaker 4: talked about this in the context of looksmaxxing, I 894 00:41:25,520 --> 00:41:27,200 Speaker 4: know you love it, Joe, I do not. 895 00:41:27,920 --> 00:41:32,160 Speaker 2: Can I say two little observations from my time vibe 896 00:41:32,160 --> 00:41:35,040 Speaker 2: coding in twenty twenty six that are interesting? One is, 897 00:41:35,480 --> 00:41:39,080 Speaker 2: I have zero technical background. I've been surprised by the 898 00:41:39,200 --> 00:41:42,799 Speaker 2: speed with which I can build intuitions about when it's 899 00:41:42,840 --> 00:41:45,960 Speaker 2: going off the rails, like when it's doing something that 900 00:41:46,000 --> 00:41:48,759 Speaker 2: doesn't seem right.
Like, I joke that vibe coding is 901 00:41:48,840 --> 00:41:51,560 Speaker 2: just typing make it better into the prompt over and over again 902 00:41:51,800 --> 00:41:54,760 Speaker 2: and then hitting yes when it offers to do something. 903 00:41:55,680 --> 00:41:58,960 Speaker 2: You actually can start to build an intuition fairly quickly, 904 00:41:59,040 --> 00:42:01,160 Speaker 2: like when this doesn't make sense. And then the other thing, 905 00:42:01,160 --> 00:42:04,320 Speaker 2: and this relates to your question of, like, trusting the AI. 906 00:42:04,760 --> 00:42:06,359 Speaker 2: So one of the things I'm doing is I'm having 907 00:42:06,360 --> 00:42:08,960 Speaker 2: a lot of documents get annotated, and I do 908 00:42:09,000 --> 00:42:11,560 Speaker 2: that through the Claude API, which actually runs up the 909 00:42:11,600 --> 00:42:13,799 Speaker 2: bills a little bit. And one time this API run 910 00:42:13,920 --> 00:42:16,839 Speaker 2: was gonna cost like one hundred dollars, and I stupidly 911 00:42:17,080 --> 00:42:18,600 Speaker 2: asked Claude, like, is this a good thing? 912 00:42:18,600 --> 00:42:21,359 Speaker 2: It's like, well, when you're done with this API run, 913 00:42:21,520 --> 00:42:24,080 Speaker 2: you're gonna have this annotated asset that no one else 914 00:42:24,120 --> 00:42:25,480 Speaker 2: has done, and that'll be very. 915 00:42:25,640 --> 00:42:28,320 Speaker 3: It was sort of useless, what I did, so you shouldn't. 916 00:42:28,719 --> 00:42:29,320 Speaker 5: It's selling it. 917 00:42:29,880 --> 00:42:31,120 Speaker 3: Selling itself, it's like, oh. 918 00:42:31,040 --> 00:42:34,160 Speaker 2: Yeah, use the API, Joe, run this, like, annotate all 919 00:42:34,200 --> 00:42:36,680 Speaker 2: these documents. It wasn't actually like a good use of 920 00:42:36,719 --> 00:42:39,600 Speaker 2: my time.
So you can't really always trust it, they're just gonna 921 00:42:39,840 --> 00:42:42,960 Speaker 2: they're gonna just sell these things. So Tracy asked about data, 922 00:42:43,160 --> 00:42:45,960 Speaker 2: and there's one other sector that I'm interested in. 923 00:42:46,360 --> 00:42:47,719 Speaker 3: You see these companies like 924 00:42:47,760 --> 00:42:50,480 Speaker 7: Moody's or Fair Isaac 925 00:42:50,480 --> 00:42:53,399 Speaker 2: or S and P Global, they have an index, yeah, and they're 926 00:42:53,400 --> 00:42:55,799 Speaker 2: getting, they're selling off too. And this 927 00:42:55,840 --> 00:42:58,960 Speaker 2: is another area where, like, you know, people who were 928 00:42:59,320 --> 00:43:02,320 Speaker 2: fund managers for a long time, unless things get really weird, 929 00:43:02,920 --> 00:43:05,279 Speaker 2: they are gonna be, like, benchmarking themselves off 930 00:43:05,320 --> 00:43:06,799 Speaker 2: of, like, the S and P five hundred for a 931 00:43:06,840 --> 00:43:10,120 Speaker 2: long time, or lenders are gonna be using the FICO 932 00:43:10,440 --> 00:43:13,759 Speaker 2: for a very long time, et cetera. Intuitively, it would 933 00:43:13,800 --> 00:43:15,560 Speaker 2: strike me that this would be a very hard thing 934 00:43:15,640 --> 00:43:16,640 Speaker 2: for AI to replace.
935 00:43:16,920 --> 00:43:19,520 Speaker 6: I share your intuition, right, I can't say I fully 936 00:43:19,600 --> 00:43:23,000 Speaker 6: understand the selloff in these companies. I wonder if there's 937 00:43:23,040 --> 00:43:25,440 Speaker 6: not some parts of their businesses that are more services 938 00:43:25,520 --> 00:43:28,799 Speaker 6: or consulting heavy that people are worried about. There's often, like, combinations, 939 00:43:28,960 --> 00:43:31,440 Speaker 6: right, where, like, I don't think anyone is suspecting that, like, 940 00:43:31,520 --> 00:43:33,360 Speaker 6: you know, the S and P five hundred is going 941 00:43:33,440 --> 00:43:37,520 Speaker 6: to be displaced as an index outright. Yeah, maybe, but look, 942 00:43:37,600 --> 00:43:41,200 Speaker 6: we're in a world where folks are very happy to 943 00:43:41,239 --> 00:43:44,760 Speaker 6: shoot first and ask questions later once AI risk comes up. 944 00:43:45,000 --> 00:43:48,600 Speaker 2: Actually, going back to your hedge fund time, like how 945 00:43:48,680 --> 00:43:50,880 Speaker 2: much is it just the sort of the nature of 946 00:43:50,920 --> 00:43:55,200 Speaker 2: hedge fund traders right now, where there's very little stomach 947 00:43:55,239 --> 00:43:58,720 Speaker 2: to take any downside risk and appear to look stupid 948 00:43:58,760 --> 00:44:00,840 Speaker 2: for missing, you know, yeah, the bag here, and how 949 00:44:00,920 --> 00:44:02,480 Speaker 2: much do you think that's contributing to some of these 950 00:44:02,480 --> 00:44:02,959 Speaker 2: market moves? 951 00:44:03,000 --> 00:44:03,600 Speaker 5: It's a great question.
952 00:44:03,600 --> 00:44:05,000 Speaker 6: I won't speak to my fund, because I think Coatue 953 00:44:05,120 --> 00:44:07,480 Speaker 6: and Tiger cubs like it are a fairly small 954 00:44:07,560 --> 00:44:09,520 Speaker 6: share of the overall market in dollars, right. But if 955 00:44:09,560 --> 00:44:12,920 Speaker 6: you look at trading volumes, the pod shops, Citadel, Balyasny, 956 00:44:12,920 --> 00:44:15,040 Speaker 6: and Millennium are a very large share of the volumes, 957 00:44:15,040 --> 00:44:19,680 Speaker 6: and yeah, those people can't afford drawdowns, right. And the 958 00:44:19,719 --> 00:44:24,399 Speaker 6: scary thing about this for them is because it's not fundamental, 959 00:44:24,719 --> 00:44:28,320 Speaker 6: because the companies aren't struggling themselves, they have no idea 960 00:44:28,360 --> 00:44:31,520 Speaker 6: when it will stop, right. And so you're left predicting 961 00:44:31,520 --> 00:44:33,320 Speaker 6: this thing. And you're like, well, I bet my career 962 00:44:33,640 --> 00:44:36,440 Speaker 6: that people are going to feel better about software companies 963 00:44:36,480 --> 00:44:38,600 Speaker 6: in three to six months than they do right now. 964 00:44:38,880 --> 00:44:41,160 Speaker 6: And you know, you're one OpenAI model release or 965 00:44:41,200 --> 00:44:44,319 Speaker 6: Anthropic model release away from more fear. And so I 966 00:44:44,360 --> 00:44:48,000 Speaker 6: do think there's a lot of short-termism right now, 967 00:44:48,520 --> 00:44:51,000 Speaker 6: and, again, I think we come back 968 00:44:51,000 --> 00:44:53,480 Speaker 6: to the SBC point. But there's also no valuation support, 969 00:44:53,680 --> 00:44:56,839 Speaker 6: no real valuation support.
You know, in normal times, if 970 00:44:56,840 --> 00:44:58,560 Speaker 6: companies were like this, they'd be buying back a bunch 971 00:44:58,600 --> 00:45:01,799 Speaker 6: of their stock, taking their share count down, issuing dividends. You know, 972 00:45:01,840 --> 00:45:03,440 Speaker 6: I have a friend who works at a mutual fund 973 00:45:03,480 --> 00:45:05,600 Speaker 6: with a lot of dividend funds that would love 974 00:45:05,680 --> 00:45:08,719 Speaker 6: to buy dividend-paying companies growing ten to fifteen percent, like 975 00:45:08,760 --> 00:45:11,560 Speaker 6: a lot of software companies. But they're not, right, and 976 00:45:11,640 --> 00:45:14,480 Speaker 6: so you've kind of lost the ability to kind of 977 00:45:14,480 --> 00:45:16,800 Speaker 6: put an actual floor in underneath valuations. 978 00:45:16,800 --> 00:45:19,360 Speaker 2: As a result of that. Jared Sleeper, thank you so 979 00:45:19,440 --> 00:45:22,480 Speaker 2: much for coming on Odd Lots and explaining how software works. 980 00:45:22,520 --> 00:45:23,799 Speaker 5: My pleasure, it was super fun. 981 00:45:23,960 --> 00:45:37,240 Speaker 7: Thanks, Tracy. 982 00:45:37,280 --> 00:45:40,080 Speaker 2: I thought that was really interesting. I'm very fascinated with 983 00:45:40,160 --> 00:45:43,359 Speaker 2: this idea that in the short term most of these 984 00:45:43,400 --> 00:45:46,799 Speaker 2: businesses are doing fine. In the long term it might 985 00:45:46,840 --> 00:45:49,560 Speaker 2: go to zero. But also in the short term they're 986 00:45:49,560 --> 00:45:53,240 Speaker 2: not really doing fine, because actually, you know, on a GAAP basis, 987 00:45:53,239 --> 00:45:55,279 Speaker 2: they're not making much money. I guess that sort of 988 00:45:55,280 --> 00:45:57,239 Speaker 2: makes sense that they're all selling off right now.
989 00:45:57,360 --> 00:46:00,680 Speaker 4: Yeah, I keep thinking, this is probably a stretch, but 990 00:46:00,719 --> 00:46:04,440 Speaker 4: I keep thinking back to that book Bullshit Jobs. Yeah, remember, 991 00:46:04,719 --> 00:46:07,360 Speaker 4: and like the argument there is that a lot of jobs 992 00:46:07,400 --> 00:46:12,200 Speaker 4: exist not because like people are doing anything specific, but 993 00:46:12,280 --> 00:46:15,799 Speaker 4: because they're like providing some sort of social value in 994 00:46:15,840 --> 00:46:18,560 Speaker 4: a way. So, for instance, like you have a person 995 00:46:18,600 --> 00:46:22,920 Speaker 4: who is essentially the designated scapegoat for senior management. And 996 00:46:22,960 --> 00:46:25,640 Speaker 4: I keep thinking of, you know, business as basically an 997 00:46:25,640 --> 00:46:28,600 Speaker 4: ecosystem of different players. So it might be that in 998 00:46:28,640 --> 00:46:32,960 Speaker 4: the new AI world the role of software companies just 999 00:46:33,120 --> 00:46:37,400 Speaker 4: kind of changes, like their social role changes. And I 1000 00:46:37,440 --> 00:46:39,640 Speaker 4: don't know what the price or the valuation looks like 1001 00:46:39,719 --> 00:46:40,799 Speaker 4: on that. 1002 00:46:40,800 --> 00:46:45,000 Speaker 2: It still feels like, to me, I have no, I was 1003 00:46:45,040 --> 00:46:46,759 Speaker 2: going to say something about how businesses are going to, 1004 00:46:47,000 --> 00:46:49,160 Speaker 2: what do I know? I have no idea how businesses 1005 00:46:49,400 --> 00:46:51,440 Speaker 2: are going to buy software in the future. I did 1006 00:46:51,520 --> 00:46:53,719 Speaker 2: think that was really helpful. Like, I really don't know 1007 00:46:53,760 --> 00:46:56,640 Speaker 2: anything about how the software business works generally, so I 1008 00:46:56,719 --> 00:46:59,520 Speaker 2: found that very helpful.
One other interesting thing though, and 1009 00:46:59,560 --> 00:47:01,839 Speaker 2: it may sort of speak to how to think about this risk, 1010 00:47:02,280 --> 00:47:05,080 Speaker 2: is just the idea that, like, even high-end software 1011 00:47:05,120 --> 00:47:06,600 Speaker 2: is not that much money, right? So if you have 1012 00:47:06,640 --> 00:47:09,200 Speaker 2: a salesperson who's making two hundred and fifty thousand dollars, 1013 00:47:09,560 --> 00:47:12,799 Speaker 2: what is one thousand dollars a year for Salesforce 1014 00:47:12,280 --> 00:47:12,919 Speaker 7: to do their job? 1015 00:47:13,400 --> 00:47:16,279 Speaker 2: Particularly, and then also given the fact that, you know, 1016 00:47:16,440 --> 00:47:19,280 Speaker 2: free and open source software has existed for a long time, 1017 00:47:19,600 --> 00:47:21,960 Speaker 2: but still, you know, you want to pay an implementer, 1018 00:47:22,160 --> 00:47:25,680 Speaker 2: or a company that, like, manages it, et cetera. Getting from 1019 00:47:25,680 --> 00:47:29,759 Speaker 2: like here to there, where, okay, AI really changes the 1020 00:47:29,880 --> 00:47:34,120 Speaker 2: nature of software buying, does feel like you have to get 1021 00:47:33,920 --> 00:47:36,920 Speaker 3: into, this is going to be weird territory, but maybe 1022 00:47:36,680 --> 00:47:37,120 Speaker 7: things are good. 1023 00:47:37,160 --> 00:47:39,480 Speaker 3: I think things probably are going to get really weird. 1024 00:47:39,400 --> 00:47:41,840 Speaker 4: Yeah, I think that's, that's a pretty good bet. Right, like, 1025 00:47:41,960 --> 00:47:45,800 Speaker 4: if you bet on weirdness, if there is a weirdness index. 1026 00:47:45,680 --> 00:47:47,200 Speaker 3: Someone should build that weirdness index. 1027 00:47:47,320 --> 00:47:49,920 Speaker 4: That would be a pretty good investment. Yeah, all right, 1028 00:47:49,960 --> 00:47:51,719 Speaker 4: shall we leave it there? Let's leave it there.
This 1029 00:47:51,760 --> 00:47:54,319 Speaker 4: has been another episode of the Odd Lots podcast. I'm 1030 00:47:54,360 --> 00:47:57,400 Speaker 4: Tracy Alloway. You can follow me at Tracy Alloway. 1031 00:47:57,080 --> 00:47:58,960 Speaker 2: And I'm Joe Weisenthal. You can follow me 1032 00:47:59,080 --> 00:48:02,000 Speaker 2: at The Stalwart. Follow our guest Jared Sleeper, he's at 1033 00:48:02,120 --> 00:48:06,000 Speaker 2: Jared Sleeper. Follow our producers Carmen Rodriguez at Carmen Arman, 1034 00:48:06,040 --> 00:48:09,319 Speaker 2: Dashiell Bennett at Dashbot, and Kale Brooks at Kale Brooks. For 1035 00:48:09,400 --> 00:48:11,960 Speaker 2: more Odd Lots content, go to Bloomberg dot com slash 1036 00:48:11,960 --> 00:48:14,200 Speaker 2: odd lots, where we have a daily newsletter on all of our 1037 00:48:14,120 --> 00:48:16,000 Speaker 7: episodes, and you can chat about all 1038 00:48:15,880 --> 00:48:18,240 Speaker 2: of these topics twenty four seven in our Discord, 1039 00:48:18,600 --> 00:48:20,480 Speaker 2: Discord dot gg slash odd 1040 00:48:20,280 --> 00:48:22,880 Speaker 4: lots. And if you enjoy Odd Lots, if you like 1041 00:48:22,920 --> 00:48:27,840 Speaker 4: it when we talk about the SaaS-tastrophe, then please 1042 00:48:27,960 --> 00:48:31,840 Speaker 4: leave a positive review on your favorite podcast platform. And remember, 1043 00:48:31,920 --> 00:48:34,400 Speaker 4: if you are a Bloomberg subscriber, you can listen to 1044 00:48:34,560 --> 00:48:37,600 Speaker 4: all of our episodes absolutely ad free. All you need 1045 00:48:37,640 --> 00:48:40,360 Speaker 4: to do is find the Bloomberg channel on Apple Podcasts 1046 00:48:40,400 --> 00:48:43,040 Speaker 4: and follow the instructions there. Thanks for listening.