Speaker 1: Bloomberg Audio Studios, Podcasts, Radio, News.
Speaker 2: Hello and welcome to another episode of the Odd Lots podcast. I'm Joe Weisenthal.
Speaker 3: And I'm Tracy Alloway.
Speaker 2: So, Tracy, you're cool, like, if I, like, you know, just start doing this part time as I, like, build out my software business, right? Like, you're cool about that?
Speaker 3: I was gonna say, I've been thinking about AI and productivity, and so far your productivity has gone down, Joe, because instead of doing Odd Lots things, you're coding your own software.
Speaker 2: Except that I'm creating content for the Odd Lots newsletter about coding, and that is productivity. Creative? Debatable, debatable. But, but you're cool with that? You're cool with, like, me, like, oh, I'm just gonna, like, check in part time on Odd Lots when we have a recording?
Speaker 3: Well, no, of course not. Okay, good. Of course not.
Speaker 2: Yeah, that's the right answer. I want you to, yes, I want you to be really sad. But like a few other people, you know, I have, like, caught the sort of, like, bug of, like, AI coding, and I'm totally blown away. I've, like, played with it before. I started playing around with it last year, but then over the holidays, and I've been writing about this in the newsletter, suddenly, like, my Twitter feed is like Claude Code, Claude Code, Claude Code. I'd used just Cursor before, which I was very impressed by at the time. And so when I got home from vacation, one of the first things I did is, like, figure out how to install Claude Code on my computer. And I was like, oh, I am, like, hooked. I see why my Twitter feed is just, like, people posting about this.
Speaker 4: All right.
Speaker 3: So I have to say I have not tried it, because I only have a work computer and I can't install new software, and I probably definitely cannot install new software that then makes changes to...
Speaker 2: Your existing software.
Speaker 3: I don't think Bloomberg would like that. But I have seen the hype, lots of people talking about it. Have you seen Claude Cowork? Have you heard of it?
Speaker 2: Oh yeah, yeah, yeah, yeah.
Speaker 3: So one of the criticisms of Claude Code was that, you know, okay, it can code, but you still need some background knowledge in coding, because, like, you know, the interface is kind of, like, APIs and all of that, or, like, nineteen-nineties. Cowork apparently, like, goes a step further for normal people in coding. It makes it super, super easy. And the funniest thing is that apparently Claude Code actually coded Cowork.
Speaker 2: So this really relates to my experience last year and then this year, which is that even last year, like, trying to use the AI coding tools, it was an annoying process, because there are various things that you had to do in the actual command line of the computer, and, like, I don't know command line vernacular, and you have to install these libraries and stuff like that. So there was this sort of, like, barrier that existed. But what's really changed in the last year, or with Claude Code, which has actually been around for a while and I should have, like, played with it before, is that because it sits on your computer, it sort of takes away, it de-abstracts it, and so when you talk about...
Speaker 3: Like that it actually does the stuff.
Speaker 2: It does it. It's just like, oh, we're gonna need to install this open source natural language processing library, and it just does it automatically, instead of me trying to figure out, like, what are the right keystrokes to pull that in, or why is this not going into the right file, folder, or whatever.
And so, like, with Cowork, it's like all of these sort of, like, little frictions, these technical things, like command line usage, very rapidly are, like, dissipating. And so then you have something like Cowork, where it's just like, they're taking care of that, and so you get this, like, user interface that's just getting easier and friendlier. There's almost no technical friction at all anymore.
Speaker 3: Also, it feels very iterative, like the code is improving upon itself, which at this point I think was one of Claude's main selling points.
Speaker 2: Well, this is, like, you've seen people talk about, like, oh, is AGI here? And this is part of the debate, because the premise, one of the ideas I guess behind AGI, is like, well, what happens when you have software that can train itself and so forth? And I don't really know if I buy that, but you do just see, like, how fast the iteration cycles are. And I think we want to get into this: in part, they're fast because a bunch of people are suddenly getting excited. So then the human provides this sort of, like... we're sowing the seeds of our own demise, because we're so enthusiastically participating in the evolution. But it's suddenly clear, like, oh, this is going to change, I think, computing. And the other thing is the code works. Like, it creates code that, like, there's no bugs. You know, it works.
Speaker 3: Did you see, speaking of automating yourself, did you see there was a post on Reddit from a lawyer who said he's basically used Claude Code to automate, like, his entire job, and he hasn't told anyone?
Speaker 2: I'm not exactly surprised, because the other thing that I experimented with, and I haven't one hundred percent verified this, but on jobs day last week, I downloaded the full PDF and I just typed into Claude Code, like, find the most interesting details and make some charts based on it. And it did it in, like, a couple of minutes. I have no, like, ability to... I've never, like, built charts myself or whatever, like designing or whatever. And I didn't totally confirm yet that the data was all correct, but I'm pretty sure it was, because everything I spot checked... so I didn't... that crucial step, I didn't tell you. I know I didn't. That's why I didn't want to be like, oh, here's today's jobs report in charts.
Speaker 5: But I did.
Speaker 3: So what application did it actually build the charts in?
Speaker 2: I don't know. I just had a file, like, that's the thing. I had a file on my computer at that point. What kind of file? Like a PNG file, like an image file. That's the crazy thing, I don't know. And so there was just this image that had a bunch of charts, and my spot checks did suggest, like, I didn't see anything off. And people get paid money to, like, build that kind of stuff for, like, analysts and stuff like that.
Speaker 3: Right, so this is the other big question. If everyone can build their own software, what actually happens to software? And I was reading something, I forget who it was by, but someone used Claude Code to create... they wanted a website that would basically make them money for doing nothing, and that was the prompt.
Speaker 2: And did they do it?
Speaker 3: Yeah. So the idea that the model came up with was you can sell prompts, packages of good prompts, and sell them for, like, forty bucks, and you'll make tons of money.
Speaker 3: And I was thinking about that, like, okay, it's possible to make money that way, but also, why wouldn't I just use Claude Code to do the same thing?
Speaker 2: There are many big questions that we, as an economy, are going to have to think about, and I think my main takeaway is we're gonna have to think about these sooner rather than later. But what is Claude Code? Why is everyone so hyped about it? Like, what is it about this particular piece of software versus what exists from OpenAI and Gemini and all this stuff? Like, why has this captured everyone's imagination? We really do have the perfect guest, because it's someone who, unlike me, has been getting their hands on the stuff for longer. One of the few people that I know who was into LLMs before ChatGPT existed, and was actually using them via the API, and was actually talking about their technical capacity to do things like coding even before November of twenty twenty two. So truly the perfect guest. We're going to be speaking with Noah Brier. He is the co-founder of Alephic, which is a consultancy that helps big companies deal with AI stuff. So, uh, Noah, thank you so much for coming on Odd Lots.
Speaker 5: Thank you for having me.
Speaker 2: What is Alephic? What's the deal? How were you, like, using LLMs before ChatGPT existed? I don't know, I know very few people who were doing that.
Speaker 6: I had the good fortune of shutting down a startup in twenty twenty two, and so I had a lot of free time on my hands.
Speaker 2: And then how were you using it, though?
Speaker 3: Like?
Speaker 2: How did you, like... how were you aware that there was this thing that could be of potential use to you?
Speaker 6: So the very first thing I was doing was using GitHub Copilot, which at the time was built into VS Code, and it was autocomplete inside VS Code.
So it was nice, and I pretty immediately realized that there were certain coding tasks that it could just handle completely. Anything that was very pattern based. So if you write code, you write a lot of tests. If you write tests, every test kind of follows the same pattern, and you want it to follow the same pattern. You're looking for that structure, and over time, because it was looking at your code base, it was able to basically autocomplete it. I also started playing with the GPT-3 API, which had come out. I think that came out in November of twenty twenty one, and that was the first time it was publicly available to everybody, and they had a large language model, as we know it today, available to them. So I was just testing and building things, and I pretty immediately realized... the very first thing I did where it just blew my mind was I built a web scraper. So I was just trying to pull pricing data from a website, and I've done a lot of this in my career. It's maybe the most annoying task you have to do in all of coding, because HTML is the most miserable language to have to parse. And I just had this thing where I took the page, I took the content, I took the text, and I gave it to the AI, and I asked it to give me back the pricing table, and it gave me back the pricing table. And I just thought, I'll never do it the other way again.
Speaker 5: That's it.
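[A minimal sketch of the pattern Noah describes here: fetch the raw page text and hand it to a model to extract structure, instead of hand-writing HTML parsing. The URL is a placeholder, and piping the text into Claude Code's non-interactive print mode is an assumption made for illustration; the story above used the GPT-3 API.]
    # Hypothetical sketch: pull a page and let a model do the parsing.
    # example.com/pricing is a placeholder; "claude -p" (print mode reading
    # piped stdin) is assumed here, not quoted from the conversation.
    curl -s https://example.com/pricing \
      | claude -p "Extract the pricing table from this page and return it as CSV"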
Speaker 3: Yeah, that HTML mention just brought up, like, memories of me in, like, the mid-nineties on HTML Goodies. Do you remember that site? I wonder if it's still... is it still up? That would be wild. Does Claude Code, does that count as AGI? This seems to be the debate, right? Is it AGI?
Speaker 6: I try not to wade into what's AGI and what's not. I think my guess on AGI, for what it's worth, is that it's probably going to be a conversation like the Turing test, where everybody thought it was really, really important for a really long time. We thought the Turing test was the biggest thing for seventy years or whatever, and then ChatGPT very clearly passed the Turing test, and now everybody pretends... it's not just that they forgot, they pretend that it never mattered. And so I am kind of guessing that that's what the conversation is going to be like. It's just going to be a sort of forever-moving goalpost, because it turns out that the idea we had for what general intelligence looks like is not quite that. But I also think, you know, the computer scientists and the sort of serious AI researchers would say that much of what's going on inside Claude Code is not the model itself. It's the model paired with a human, and I think that is a pretty important distinction. But I don't know about AGI.
Speaker 2: Well, okay, so you were using GPT to code prior to the release of ChatGPT. So therefore, coding models have been around a long time. So what is, for those who haven't played around with it, what is Claude Code? Because again, coding models have been around for a long time. We'll maybe have heard of Cursor, Copilot, or some of these other harnesses, et cetera. What is Claude Code?
Speaker 6: So if we back up first and we go to Copilot. Copilot was the first sort of commercial application of a large language model, by most accounts. And what Copilot did in its initial instantiation was just auto...
Speaker 2: A Microsoft product.
Speaker 6: It's a Microsoft product. Microsoft owns GitHub, GitHub developed Copilot. Microsoft had the partnership with OpenAI, and so they built it in, and what it was doing was autocomplete.
So if you're writing code, a lot of writing code is boilerplate, or trying to remember the name of a function. And, you know, the reason Stack Overflow existed was because you can never remember the exact name of that function, or the exact regex that you need to use in order to find and replace something, and so you would go search for it. And they realized that you could just build that into the IDE, your code editor, and have it autocomplete for you, and it was pretty amazing. Then, uh, ChatGPT came out, and even before that, I had built a simple chatbot for myself, because I realized that, hey, I could just ask this instead of going and searching Stack Overflow. It was totally capable of answering code questions, and it was capable of writing regexes or doing these things, and it didn't make mistakes. Yes, but, like, there are famous mistakes on Stack Overflow, incorrect regexes that now exist in every code base in the world. And so, you know, I think there were a lot of us just kind of playing with these things and realizing they were a huge boon. And so I think really the next step is Cursor comes out, and the thing Cursor realized that Copilot didn't was that it wasn't good enough to have autocomplete. You also needed the Q and A, because you have these things that you can't just autocomplete. You want to be able to ask the question and answer it. And then ChatGPT came out and everybody was switching between the IDE and the chatbot. And then I think really the next big piece is that Claude Code came out. And what Claude Code did that was so remarkable was they took the same set of models, really, and they took them out of the chatbot, and they really just gave it some very basic functionality to operate within your machine. Right.
And so, you know, if you really look at kind of what exists within Claude Code, you're calling out to a model, and they gave it capability around sort of two big things. One is it can read and write files on your computer, and then two is that it can operate Unix base commands, the bash commands that exist in your environment. And again, because these models were trained on the Internet, and there's no greater source of information on the Internet than how to make the Internet, they know how to use Unix commands incredibly well, right, because Unix has existed for, whatever it is, sixty years, and the way these commands were designed, they're all designed to be very, very simple. There's a find command, and, you know, there's something called grep, and it can search through a code base. And Unix has this sort of beautiful way of tying one command to another, so you can take the output of one command and send it to another. And they kind of just gave the model access to these two or three very simple things, and it kind of turned out that it unlocked a whole bunch of functionality that I don't think even the people who built it fully realized. Like, one example that I think about a lot is just the challenge you have with all of these AI models, which is that they're stateless. So every time you talk to ChatGPT, it's sending your entire conversation history back to ChatGPT, because it has no saved history of that chat, right? And that's fine, it's the way it works, it's just a fact. But it means that, you know, it forgets things. It doesn't know conversation to conversation. And one very easy way to save your state is just write it to a file. And so you give it write access and it can create files, and now all of a sudden you've overcome this, like, probably the single biggest challenge that exists inside these large language models, which is that they're fundamentally stateless.
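[Two tiny shell sketches of the capabilities just described, illustrative only and not Anthropic's implementation: a composable Unix pipeline, and persisting "memory" by writing to and reading from a file. The folder, search term, and file name are hypothetical.]
    # Composability: list files under notes/ that mention a word, count the
    # matches in each file, then sort by that count; each "|" feeds one
    # command's output into the next.
    grep -rl "stateless" notes/ | xargs grep -c "stateless" | sort -t: -k2 -nr

    # The statelessness workaround: anything written to a file survives the
    # session, so it can simply be read back in as context next time.
    echo "- user prefers charts saved as PNG files" >> MEMORY.md
    cat MEMORY.md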
Speaker 3: So Claude writes itself little, like, memory notes, right, to remember the entire context of the conversation, and that's how it solved that problem?
Speaker 6: No, so there's sort of two things going on in Claude Code beneath the hood. There's one thing that works exactly like ChatGPT or any of these other ones, which is it's maintaining a conversation history. So every message you send it and every action it takes, it's recording to a log, which is just one big file. That's really no different than what ChatGPT can do. Where it gets really interesting, though, is it can also write files that it can then read. So whereas that conversation history is all saved off, eventually that conversation gets too long and it needs to do a thing called compaction. And when it compacts it, it tries to sort of just remember the bits, because the total window is large, but I mean, it's like one hundred thousand tokens, I mean...
Speaker 3: Like memory notes, right? It compacts the information into the important stuff, then retrieves it.
Speaker 6: It does that. It only does that at the end, like, once it runs out of space, once it runs out of context window. So it has two hundred thousand tokens, I think, and two hundred thousand tokens in rough terms is probably one hundred and fifty thousand words. It says, okay, it's time for me to compact all of this stuff, and so it still saves your whole history on your computer. You still have the entire message, but for that session, it just compacts it down to this, you know, maybe twenty-five-thousand-token memory of what it was.
Speaker 2: And is this, like, something that was not obvious before as a solution, like this compaction? How important is it for this being, like, okay, as a human, I can work on a project for a long time, like, how much of an unlock was there?
Speaker 6: I'm not sure compaction was the unlock. I think the compaction functionality is helpful. The way ChatGPT does it, for what it's worth, is they don't do compaction, they just forget your messages eventually. So if you're in one chat, eventually your oldest message is going to fall off the back. For coding, that's probably less helpful, but there are trade-offs. Both techniques work. I think, fundamentally, the thing that is special about Claude Code is not the compaction. It's the ability to write and read files on your computer, which means you can always write off memories.
Speaker 2: And then what does that mean, write off memories?
Speaker 6: So you could say, hey, it's really important that I remember this thing for future sessions, I want to always work this way. So in a code base of mine, I have a set of documentation that explains how I like to do things. And Claude Code makes a mistake, and so the next time, I can write a memory. Essentially, it's written as a thing they call a skill, and you can write it off and say, hey, whenever you run into this, I want you to operate in this kind of way. And that existing across every session is really a thing you can only do when you can store it as a file. It's a thing you can't do in quite the same way when you're operating in an environment where it's just going back and forth to the API. So this access to the file system is one really big piece. And then the second is just the Unix commands. I mean, every computer program lives on top of these sort of baseline functions, and the way that the designers of Unix built them is really elegant, and they're very small.
They all do one thing, and they're all composable. And in coding terms, composable means they can be chained together, right? And so you can say, hey, look for files that mention this word, and then from those files, I want you to take this second action, and then from the output of that action, I want you to take a third action. And that's just built into Unix. You literally just put a little pipe in between, and you just pipe them from one to another, and that's it. And so you give it access to write these commands, and all of a sudden it gets these sort of second- and third-order effects that are just incredibly powerful and built over a really long time.
Speaker 3: So how much of Claude Code, the way it's different from other models, how much of that was overcoming technological challenges versus just having a good idea? Because hearing you describe it, I mean, giving it access to a computer seems kind of obvious, like, let's just do that.
Speaker 6: I don't have a good answer to that. I think that it was kind of just a good idea, yeah. I think they did some patterns really well. They're clearly incredibly talented, not just engineers, but kind of thinkers about how to structure it. Like, the primitives inside Claude Code are just smart. And then the thing that they've done, and Boris Cherny, who's the lead developer on Claude Code at Anthropic, he talks about latent demand a lot, right, and latent demand is basically just, hey, look at the ways people are using these systems and then figure out ways to make that a part of the product itself. I think what they've done brilliantly, and this is kind of easy when you have a community of developers who are nerds who want to go talk about all the ways that they're using these things, is... I am amazed at the speed with which... you know, I have a small community of fifteen CTOs who all use this stuff religiously.
And, you know, when we first started that community, it would take a month: I would see it in the chat, and then a month later it would get built into Claude Code. And then, increasingly, it's like a day later. It feels like they're just listening to it. But I think they're not only tapped in, they're really, fundamentally, you know, they're dogfooding it. They use their own products. You know, they talk about the engineering productivity at Anthropic: despite growing at a crazy clip, it continues to go up. And, you know, anybody who's built, who's had to manage large-scale pieces of software, large-scale code bases, knows that's not the norm.
Speaker 2: So VS Code and Cursor, these are IDEs. Claude Code is not an IDE. It's called a CLI?
Speaker 6: A CLI, a command line interface.
Speaker 2: Got it. And the other labs now, they also have CLIs. So why are we all talking about Claude Code and... ChatGPT's is called Codex. I don't know what Gemini's is called.
Speaker 5: I think it's just called Gemini.
Speaker 2: Why are we all talking about Claude Code rather than the other CLIs that kind of have the same thing? Like, what is the difference?
Speaker 6: I think, first and foremost, they were first, okay? And I think they've had a lot more... and, you know, in my very personal opinion, I think they've done something smarter and better as far as the permissioning model. So, you know, one of the really dangerous things is you've got this thing running on your computer; you don't want it to go and delete everything. And they have a very fine-grained permissioning model where you can say, hey, I want to allow this just this one time, or I want to always allow it.
Speaker 2: You know, I always click always allow. I'm living on the edge.
Speaker 6: You can... next time you run it,
you can just do a flag that says dangerously skip permissions, and it'll just... they call it YOLO mode. I think, more fundamentally though, if I look at Codex versus Claude Code, I think it's a difference in philosophy around what you want AI to do. To me, Codex, which is excellent, is very focused on building an agent that you can just give something to and it'll just go do it. So I want to give it that task, I don't want to intercede, I don't want to give it any more feedback. And Claude Code is much more designed to be kind of a pair programmer. And so, you know, in engineering, pair programming has existed for a while. It's a really weird sort of productivity thing where you put two engineers on the same problem, and it turns out that you can get better code, a multiplier, yeah, and it sort of makes up for the fact that, obviously, you know, you're doubling the staff on it. But because of how many fewer bugs there are, because you've both set eyes on it, it has seemed to work out for many folks. Most companies don't practice it, but I think Claude Code fundamentally is much more designed in that way. It's a pair programmer. You know, whenever I start a project, I start in plan mode. So you start in plan mode, you put together a plan. I really... I mean, I spend a lot of time in plan mode. You go through, it gives you a plan back, it asks you how you feel, you can give it a whole bunch of direction, and it's only then that it goes off and gets into it. So, you know, we're working together. And I actually have a whole system now that I've designed where I use a task management system called Linear. So I have Claude Code write tasks off to Linear, and then I've worked with Claude Code to write a document with a set of heuristics to decide when you should assign a task to Codex versus when you should give it to Claude Code. And so if it's
And so if it's 493 00:23:36,119 --> 00:23:38,240 Speaker 6: tightly defined enough and simple enough, I just send it 494 00:23:38,240 --> 00:23:41,200 Speaker 6: off to Codex and it does it totally independently. And 495 00:23:41,240 --> 00:23:44,280 Speaker 6: then if it's complicated enough that I think it requires 496 00:23:44,320 --> 00:23:47,879 Speaker 6: my time and attention, then it saves it for me 497 00:23:48,920 --> 00:23:52,840 Speaker 6: us to do together, and we'll work on it together. 498 00:23:52,880 --> 00:23:55,520 Speaker 6: And so if it's sort of touching a kind of 499 00:23:55,600 --> 00:23:58,520 Speaker 6: important enough if it's changing some part of the data model. 500 00:23:58,880 --> 00:24:01,440 Speaker 6: There's these other kind of you know, fairly basic extet 501 00:24:01,480 --> 00:24:05,080 Speaker 6: of criteria that I use, But that to me is 502 00:24:05,080 --> 00:24:09,480 Speaker 6: the fundamental distinction. And you know, I find cloud code 503 00:24:09,480 --> 00:24:12,800 Speaker 6: in that way to be just it sort of fits 504 00:24:12,920 --> 00:24:15,359 Speaker 6: what I want to do and how I want to 505 00:24:15,400 --> 00:24:17,080 Speaker 6: work much better. 506 00:24:17,640 --> 00:24:21,320 Speaker 3: Talk a little bit more about how it actually impacts 507 00:24:21,440 --> 00:24:24,960 Speaker 3: the workflow of an engineer, because you know, my impression 508 00:24:25,160 --> 00:24:29,760 Speaker 3: was people can code, right, Like, the coding problem is 509 00:24:29,880 --> 00:24:31,919 Speaker 3: kind of solved at this point, and even if you 510 00:24:32,000 --> 00:24:34,760 Speaker 3: can't code, even if you're not a professional engineer, you 511 00:24:34,800 --> 00:24:38,880 Speaker 3: can hire someone from like India or Indonesia or wherever 512 00:24:39,040 --> 00:24:40,920 Speaker 3: to just write you a code. Maybe it'll take them 513 00:24:40,920 --> 00:24:44,120 Speaker 3: a week instead of like two days with claud code. 514 00:24:44,240 --> 00:24:49,040 Speaker 3: But how much does this actually change the workflow for 515 00:24:49,080 --> 00:24:50,399 Speaker 3: an engineer. 516 00:24:50,880 --> 00:24:53,399 Speaker 6: As completely as it could be changed. I mean, I 517 00:24:53,440 --> 00:24:58,879 Speaker 6: would say that over the last three months, I've written, personally, 518 00:24:59,640 --> 00:25:01,679 Speaker 6: I don't know a few hundred lines of code. Like 519 00:25:01,840 --> 00:25:05,920 Speaker 6: I am mostly a manager of a set of agents 520 00:25:05,960 --> 00:25:08,879 Speaker 6: who are writing code on my behalf. And you know, 521 00:25:08,960 --> 00:25:11,680 Speaker 6: increasingly what I think is interesting. I've been thinking about 522 00:25:11,680 --> 00:25:14,400 Speaker 6: this a bunch lately, is like, in some ways, it's 523 00:25:14,400 --> 00:25:16,479 Speaker 6: just bringing me back to the core challenge that has 524 00:25:16,480 --> 00:25:19,639 Speaker 6: always existed in software development, which is how do you 525 00:25:19,720 --> 00:25:23,320 Speaker 6: manage all large scale software. 526 00:25:23,000 --> 00:25:24,680 Speaker 5: Development project acrossation. 
It has become a coordination problem, and I spend a lot of time now sort of designing my Claude Code system to ensure that code goes through all the proper spec checks and that it has all these things. The other thing that, you know, makes code a particularly good place to do this is that code is verifiable in a way that, you know, most other work is not. So with code, you can verify that the build works, right? So you can say, hey, I want to build this, I want to build this package, I want to make sure that it's actually going to build and that there are going to be no failures. That's a very easy check. It's either true or it's not true. There's also linting: coders use linting, and linting is a kind of static code analysis, so it basically tries to find things in your code base that are not going to work, ahead of time, where you can predict that. Obviously, you can't predict... Alan Turing proved that you can't predict with certainty whether code is going to run, but there are certain patterns and things that it can find. It essentially does static pattern analysis. And so, you know, you have it run all these things, and the more kind of opinionated you can be about that, and the more steps you can have it go through... So I find, you know, now I'm kind of the designer, which, honestly, as an entrepreneur and as a CEO of a company, that's kind of always been my job. I have less and less been a person who writes code and more and more been a person who designs a system, in that case a company, with a bunch of people who write code.
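[A minimal illustration of the verifiability point above: build, lint, and test steps each return a clear pass or fail that an agent can check before code moves on. The specific commands are placeholders for whatever a given project defines, not taken from the conversation.]
    # Hypothetical verification gate: each step either passes or fails, and "&&"
    # stops the chain at the first failure.
    npm run lint && npm test && npm run build \
      && echo "all checks passed" \
      || echo "a check failed"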
Speaker 2: One of the funny things, it seems to me, is that, setting aside Claude Code, Claude itself has a reputation for being a nicer chatbot to talk to. People find it... and, you know, ChatGPT seems to really be sycophantic. I know it's improved, but I actually don't know if it's improved or not. People still like the prose style of Claude, and I'm curious, in the pair trading... pair trading, I'm thinking about finance... the pair engineering model, whether there is also an edge there, which is, like, here is a chatbot that is not annoying to talk to while you're iterating, and whether that is, like, a meaningful distinction between, you know, coding with Codex or whatever.
Speaker 6: Yeah, I don't know. It still can be very annoying, I can tell you. And it'll still sometimes be overly, overly effusive with me about a design choice, I mean, or sort of notice something, which I could live without.
Speaker 2: So I work on this project that's doing these linguistic things, and I eventually had to say, like, give it to me straight, how bad is this? And so I said... actually, what I said was, assume for a moment that you are a quantitative linguistics PhD. Give me your honest assessment of where we are with this. And it said, like, you've developed a nice toy, and there's no evidence that it actually does... And I was like, okay, that's nice to hear. I actually, like, you know, I appreciate that. It was very blunt. Not... you know, it's still, like, polite, but it was like, you haven't really shown anything. You haven't really established at all that your software does what it claims to.
Speaker 6: Yeah, I think so. I think stylistically I kind of personally agree. My theory, by the way, on Claude versus OpenAI's ChatGPT models is I think Claude is actually better at sort of reflecting what you give it. And so I think part of why we think it's better is it's better at pretending it's us. Yeah. And so we tend to like that. This is purely speculation, but that's always been my theory.
Speaker 2: So it flatters you in a different way.
Speaker 5: I think it's flattering you in a much more... yeah.
Speaker 1: Yeah.
Speaker 6: Interesting. But for a long time, Anthropic has just been producing the best coding models, you know. I mean, there can be some debate there now, but, you know, there's a great story from Cursor, actually, where Cursor basically wasn't that good, and then Sonnet 3.5 came out, and all of a sudden Cursor was amazing, and Cursor became a tool that everybody started using. But it wasn't until this other model came out and they made that the default model. And, you know, for what it's worth, I think the other takeaway from that, which is a kind of big theme we see in the market, and a thing that the Claude Code team has talked about, is you constantly have to be building ahead with AI, in a way that is very unique in the world of software, where you kind of always want to build things that are working at, like, seventy or eighty percent. Because if you really spend the time to get it up to ninety or one hundred, you're going to lose all the gains you get when the next model comes out. And, you know, with the amount of capex being spent on these models, like, there's a next model that's going to come out that's going to be awesome, and you just kind of want to be downstream from that, and you don't want to waste six months getting an extra three percent when that new model is going to give you an extra seven.
Speaker 3: Yeah, this is the only certainty with AI: there's always going to be a new model.
Speaker 2: The worst model you'll ever use is the one that we're using now.
Speaker 4: That's right. That's right.
Speaker 3: Are we all going to become coding illiterate? Are we just going to forget how to code, if everyone's using, you know, general language to do it?
Speaker 2: Forget? I've never learned. Yeah. Okay, you know what I've been thinking about? You know that
639 00:30:57,680 --> 00:31:01,160 Speaker 2: Alex Karp, the CEO of Palantir, has that line. He's like, 640 00:31:01,400 --> 00:31:03,120 Speaker 2: when I was young, I was too poor to have 641 00:31:03,160 --> 00:31:04,760 Speaker 2: a car, so I 642 00:31:04,800 --> 00:31:06,960 Speaker 2: never learned to drive. And now I'm too rich, so 643 00:31:07,000 --> 00:31:09,240 Speaker 2: I've never learned to drive. I feel like when I 644 00:31:09,280 --> 00:31:11,240 Speaker 2: was young, I was too dumb to learn to code, 645 00:31:11,400 --> 00:31:12,320 Speaker 2: and now. 646 00:31:12,480 --> 00:31:13,280 Speaker 3: You leaped ahead. 647 00:31:13,440 --> 00:31:16,080 Speaker 2: Yeah, now I'm too smart to learn Python or HTML 648 00:31:16,160 --> 00:31:16,560 Speaker 2: or whatever. 649 00:31:17,440 --> 00:31:19,920 Speaker 6: I have a couple of takes on this, one personal. 650 00:31:19,960 --> 00:31:23,880 Speaker 6: So the first one is, I just think this is 651 00:31:23,920 --> 00:31:25,160 Speaker 6: the worry of all technology. 652 00:31:25,240 --> 00:31:25,440 Speaker 5: Ever. 653 00:31:25,600 --> 00:31:28,440 Speaker 6: There was a paper that came out that showed that 654 00:31:28,720 --> 00:31:32,400 Speaker 6: people were, uh, you know, they were forgetting more things 655 00:31:32,480 --> 00:31:34,880 Speaker 6: or something because they were using ChatGPT. But, you know, uh, 656 00:31:35,800 --> 00:31:39,440 Speaker 6: in the Phaedrus, Plato was worried that people were gonna forget 657 00:31:39,480 --> 00:31:42,440 Speaker 6: things because they started writing things down. And, you know, 658 00:31:42,480 --> 00:31:44,160 Speaker 6: I think the trade-off there was pretty good. We 659 00:31:44,240 --> 00:31:47,840 Speaker 6: got the scientific revolution and a couple of other things. So, uh, 660 00:31:48,400 --> 00:31:51,040 Speaker 6: you know, I think that's the sort of natural knee-jerk reaction. 661 00:31:52,720 --> 00:31:57,200 Speaker 6: With that said, it is very strange when 662 00:31:57,240 --> 00:32:00,160 Speaker 6: you have people, you know, the Claude Code team, 663 00:32:00,200 --> 00:32:04,080 Speaker 6: talking about how little code they write. Now, I draw 664 00:32:04,080 --> 00:32:06,800 Speaker 6: a distinction between the sort of vibe coding by the 665 00:32:07,080 --> 00:32:10,479 Speaker 6: kind of amateur people who have never written code, and what professionals are doing. And 666 00:32:10,840 --> 00:32:12,800 Speaker 6: I think that is amazing, by the way. And I 667 00:32:12,800 --> 00:32:15,800 Speaker 6: think there's a lot of software developers who are really 668 00:32:15,800 --> 00:32:19,680 Speaker 6: mad about that, because they claim it's for safety 669 00:32:19,680 --> 00:32:21,800 Speaker 6: reasons or whatever, but I think fundamentally it's just they've 670 00:32:21,800 --> 00:32:25,480 Speaker 6: got people on their turf. But I think that's incredible. 671 00:32:25,480 --> 00:32:29,240 Speaker 6: I mean, my, my nine-year-old vibe 672 00:32:29,240 --> 00:32:33,880 Speaker 6: coded a website for Secret Santa. She's now ten. 673 00:32:34,080 --> 00:32:35,520 Speaker 6: She would get mad at me if I called her nine, 674 00:32:35,560 --> 00:32:37,600 Speaker 6: but I think she vibe coded it when she was nine. 675 00:32:38,040 --> 00:32:40,440 Speaker 6: But that's awesome, right? I don't know, that's amazing.
676 00:32:40,480 --> 00:32:43,000 Speaker 6: That's a way for people to express themselves in a 677 00:32:43,040 --> 00:32:46,440 Speaker 6: way that they couldn't before. You did your linguistics project. 678 00:32:46,480 --> 00:32:51,480 Speaker 6: That's fun and interesting. But yeah, I also think 679 00:32:51,520 --> 00:32:54,720 Speaker 6: the other thing that's happening with professional software 680 00:32:54,720 --> 00:32:56,960 Speaker 6: developers, when you hear from Anthropic or, you know, when 681 00:32:56,960 --> 00:33:00,600 Speaker 6: I'm talking about it, is, you know, the code going through 682 00:33:00,600 --> 00:33:04,360 Speaker 6: this process, and, you know, all the code still gets 683 00:33:04,400 --> 00:33:06,080 Speaker 6: reviewed by people. We're not letting it get out the 684 00:33:06,120 --> 00:33:08,400 Speaker 6: door if it's not at the same level as a human's. 685 00:33:08,560 --> 00:33:11,480 Speaker 6: But what's amazing is I'm running 686 00:33:11,520 --> 00:33:13,400 Speaker 6: five of these sessions at a time, right, and so 687 00:33:13,440 --> 00:33:16,000 Speaker 6: I've got, like, software being developed in parallel in a 688 00:33:16,040 --> 00:33:19,280 Speaker 6: way that is unimaginable. And, you know, the other thing 689 00:33:19,320 --> 00:33:23,080 Speaker 6: is, the best software engineers wrote the least 690 00:33:23,080 --> 00:33:26,800 Speaker 6: code anyway. You know, the sort of classic story of 691 00:33:26,840 --> 00:33:29,640 Speaker 6: the difference between a junior developer and a senior developer is 692 00:33:29,640 --> 00:33:32,720 Speaker 6: that a junior developer gets a problem and they sit 693 00:33:32,760 --> 00:33:34,320 Speaker 6: down and they put their fingers on the keyboard and 694 00:33:34,320 --> 00:33:37,320 Speaker 6: they start writing code. And a senior developer gets a 695 00:33:37,320 --> 00:33:39,920 Speaker 6: problem and sits there for three hours and tries to 696 00:33:39,920 --> 00:33:41,360 Speaker 6: figure out what the best way to solve it is, 697 00:33:41,400 --> 00:33:43,720 Speaker 6: and then spends five minutes writing code to get it done. 698 00:33:43,960 --> 00:33:46,400 Speaker 3: True elegance is restraint. That's what I say. 699 00:33:46,840 --> 00:33:47,280 Speaker 4: What are you. 700 00:33:47,280 --> 00:33:50,440 Speaker 2: Seeing in the companies you're working for? Like, I find 701 00:33:50,480 --> 00:33:53,200 Speaker 2: it hard to believe, and I was maybe skeptical of this, 702 00:33:53,320 --> 00:33:55,800 Speaker 2: but it feels like right now we're at a point with the technology 703 00:33:55,800 --> 00:33:58,640 Speaker 2: where, like I said, 704 00:33:58,640 --> 00:34:00,920 Speaker 2: you can build charts of data in a way that 705 00:34:01,040 --> 00:34:02,800 Speaker 2: used to mean someone would have had to get 706 00:34:02,800 --> 00:34:04,760 Speaker 2: their hands dirty, et cetera. And at the companies that 707 00:34:04,840 --> 00:34:07,720 Speaker 2: you talk to, is that right now having an effect 708 00:34:07,800 --> 00:34:09,920 Speaker 2: on how they think about what positions they're hiring for 709 00:34:09,960 --> 00:34:11,680 Speaker 2: and the skills they're looking for and so forth? 710 00:34:12,200 --> 00:34:17,360 Speaker 6: I think that's hard to answer, right, really. I 711 00:34:17,440 --> 00:34:22,239 Speaker 6: think that, certainly,
I do think, I personally think, if 712 00:34:22,280 --> 00:34:24,440 Speaker 6: I look at the sort of layoffs in the technology 713 00:34:24,480 --> 00:34:26,080 Speaker 6: industry of the last couple of years, I think some 714 00:34:26,200 --> 00:34:28,600 Speaker 6: part of that is just looking at the output of 715 00:34:28,640 --> 00:34:33,000 Speaker 6: these models and saying, hey, these models are able to 716 00:34:33,000 --> 00:34:35,880 Speaker 6: produce, you know, the median, and I have a 717 00:34:35,880 --> 00:34:38,120 Speaker 6: whole bunch of sort of middle managers who are producing 718 00:34:38,160 --> 00:34:41,040 Speaker 6: at the sixty-fifth percentile. And it's like, I can 719 00:34:41,120 --> 00:34:44,440 Speaker 6: produce median for a dollar fifty per million tokens, or 720 00:34:44,480 --> 00:34:48,440 Speaker 6: I can produce sixty-fifth percentile for however many hundreds 721 00:34:48,480 --> 00:34:50,160 Speaker 6: of thousands of dollars a year. It's a sort of 722 00:34:50,160 --> 00:34:53,399 Speaker 6: fairly simple trade-off, I think. So I do think 723 00:34:53,400 --> 00:34:55,000 Speaker 6: there's a lot of downstream effects. I think the other 724 00:34:55,000 --> 00:34:57,480 Speaker 6: thing that's happening is kind of like middle management 725 00:34:57,560 --> 00:35:00,239 Speaker 6: is under threat, because it's the realization that, hey, like, 726 00:35:00,520 --> 00:35:02,880 Speaker 6: part of what these models are amazing at is, 727 00:35:03,360 --> 00:35:05,080 Speaker 6: I think of them as like a fuzzy interface. They 728 00:35:05,120 --> 00:35:07,920 Speaker 6: can sort of turn any data into any other data, right. 729 00:35:07,960 --> 00:35:11,200 Speaker 6: You can sort of transform data from one format to another. 730 00:35:11,360 --> 00:35:13,000 Speaker 6: You can take a PDF and you can turn it 731 00:35:13,040 --> 00:35:15,279 Speaker 6: into charts, right. And there are whole people who exist to do this. Or, 732 00:35:15,400 --> 00:35:17,120 Speaker 6: you know, if you think about what product managers do, 733 00:35:17,160 --> 00:35:19,000 Speaker 6: a lot of what product managers do is they take 734 00:35:19,239 --> 00:35:21,040 Speaker 6: how people are using a product and they try to 735 00:35:21,080 --> 00:35:25,319 Speaker 6: transform it into a format that engineers can then use 736 00:35:25,400 --> 00:35:27,000 Speaker 6: to figure out what to do. And I think a 737 00:35:27,040 --> 00:35:31,080 Speaker 6: lot of those pieces 738 00:35:31,239 --> 00:35:35,880 Speaker 6: that used to just be kind of transferring knowledge... 739 00:35:36,080 --> 00:35:39,520 Speaker 2: I've always said, Tracy, I think one of the most 740 00:35:39,560 --> 00:35:42,920 Speaker 2: important roles in any organization is essentially translation work. And 741 00:35:42,960 --> 00:35:46,120 Speaker 2: you see it in the newsroom, where it's like, here's 742 00:35:46,160 --> 00:35:50,919 Speaker 2: a team specialized in emerging market currencies, and then they 743 00:35:50,960 --> 00:35:53,280 Speaker 2: have to then tell the senior 744 00:35:53,400 --> 00:35:56,040 Speaker 2: editors what they're working on.
But the senior editors, who 745 00:35:56,040 --> 00:35:58,920 Speaker 2: are maybe more generalists, don't really know, like, why, 746 00:35:59,000 --> 00:36:02,360 Speaker 2: you know, some sort of yen carry trade is 747 00:36:02,400 --> 00:36:05,600 Speaker 2: important. And so a really important role within any organization 748 00:36:05,719 --> 00:36:08,680 Speaker 2: is essentially the team that can translate between the generalist 749 00:36:08,680 --> 00:36:11,319 Speaker 2: team and the specialist team. Absolutely. And so that's an 750 00:36:11,320 --> 00:36:14,120 Speaker 2: interesting observation in the sort of engineering world, of like, okay, 751 00:36:14,120 --> 00:36:17,000 Speaker 2: these are, in some sense, translation tools. 752 00:36:17,480 --> 00:36:19,840 Speaker 3: So we talked, I agree completely, by the way, but 753 00:36:20,160 --> 00:36:23,359 Speaker 3: we talked about vibe coding, and Joe has this application 754 00:36:24,160 --> 00:36:26,879 Speaker 3: that I don't think you're looking to monetize. 755 00:36:26,400 --> 00:36:27,960 Speaker 2: No, I'm just trying to make it for the 756 00:36:28,000 --> 00:36:30,160 Speaker 2: good of the world, right. Okay, so when did that 757 00:36:30,160 --> 00:36:33,640 Speaker 2: become a crime? I'm not monetizing it. But. 758 00:36:33,719 --> 00:36:37,400 Speaker 3: Like, this opens up massive questions for software as a service, 759 00:36:37,480 --> 00:36:41,239 Speaker 3: right, SaaS, because if everyone can write their own software, 760 00:36:42,480 --> 00:36:45,320 Speaker 3: you can replicate anything that's out there that is currently 761 00:36:45,480 --> 00:36:47,400 Speaker 3: charging money. What's going to happen to software? 762 00:36:48,200 --> 00:36:50,839 Speaker 6: I think software is pretty screwed. A lot of it, 763 00:36:50,880 --> 00:36:53,120 Speaker 6: at least, not all of it. You know, you still, 764 00:36:53,480 --> 00:36:55,880 Speaker 6: it depends on whether you call the cloud providers software 765 00:36:55,960 --> 00:36:57,640 Speaker 6: or not. You know, you still need to run this 766 00:36:57,640 --> 00:37:00,360 Speaker 6: stuff somewhere. And I think there's a certain kind of 767 00:37:00,400 --> 00:37:02,920 Speaker 6: software that, you know, you just don't really want to 768 00:37:02,960 --> 00:37:06,160 Speaker 6: be in the business of writing. You know, as 769 00:37:06,200 --> 00:37:09,040 Speaker 6: someone who's tried to build a project management system, 770 00:37:09,080 --> 00:37:11,840 Speaker 6: I really don't think anybody should be in that business. 771 00:37:13,400 --> 00:37:15,960 Speaker 6: But I do think, fundamentally, I mean, we see this 772 00:37:16,120 --> 00:37:19,759 Speaker 6: every day inside enterprises, the sort of build-versus-buy 773 00:37:20,840 --> 00:37:23,960 Speaker 6: pendulum has just swung. And, you know, I mean, I 774 00:37:24,120 --> 00:37:26,960 Speaker 6: used to run a SaaS company and we sold to enterprises, 775 00:37:27,000 --> 00:37:29,919 Speaker 6: and, you know, for a long time I think 776 00:37:29,960 --> 00:37:32,520 Speaker 6: that made a lot of sense, right, because, like, hey, 777 00:37:32,880 --> 00:37:34,440 Speaker 6: it just didn't make sense to try to build this 778 00:37:34,480 --> 00:37:37,400 Speaker 6: thing on your own.
And so, but the price of 779 00:37:37,400 --> 00:37:39,960 Speaker 6: that was, you know, one, the price, right, like, and 780 00:37:40,040 --> 00:37:41,759 Speaker 6: it got to be more and more expensive. The other 781 00:37:41,800 --> 00:37:43,320 Speaker 6: price was that you were paying for a lot of 782 00:37:43,360 --> 00:37:45,600 Speaker 6: stuff you didn't need, right, because the whole job of 783 00:37:45,640 --> 00:37:49,719 Speaker 6: building SaaS is you need to generalize problems, and so 784 00:37:49,920 --> 00:37:52,719 Speaker 6: you build things that are going to work for everybody. 785 00:37:53,040 --> 00:37:54,680 Speaker 5: And that means either you. 786 00:37:54,600 --> 00:37:56,279 Speaker 6: Have to sort of adapt, or you have to build 787 00:37:56,320 --> 00:38:00,880 Speaker 6: this sort of very configurable software. And I think, and 788 00:38:01,400 --> 00:38:03,560 Speaker 6: what I see just, you know, firsthand, is that 789 00:38:05,080 --> 00:38:10,440 Speaker 6: inside these organizations you can now solve very specific problems 790 00:38:10,440 --> 00:38:14,680 Speaker 6: that are highly valuable, and not only can you solve 791 00:38:14,680 --> 00:38:18,279 Speaker 6: them better than generic software, but you can actually, in 792 00:38:18,320 --> 00:38:20,360 Speaker 6: a lot of ways, do it for less money, because 793 00:38:20,360 --> 00:38:23,000 Speaker 6: you're trying to tackle less stuff. You didn't need the 794 00:38:23,080 --> 00:38:25,319 Speaker 6: sixteen other features. You bought it for the one that 795 00:38:25,400 --> 00:38:29,480 Speaker 6: you really, really cared about. And so I think that 796 00:38:29,640 --> 00:38:31,400 Speaker 6: part of it, you know, I 797 00:38:31,680 --> 00:38:34,120 Speaker 6: definitely think there are pieces of the software industry that 798 00:38:34,239 --> 00:38:36,200 Speaker 6: are gonna, you know, come out the other side. 799 00:38:36,200 --> 00:38:38,799 Speaker 6: Nobody wants to deal with payroll, right, like, you know, 800 00:38:38,840 --> 00:38:41,480 Speaker 6: you're still gonna buy some payroll software and you're 801 00:38:41,520 --> 00:38:44,000 Speaker 6: still gonna have that. But, you know, I do think 802 00:38:44,000 --> 00:38:46,840 Speaker 6: there are a lot of pieces where the software existed 803 00:38:46,960 --> 00:38:50,000 Speaker 6: essentially as a kind of wrapper around a database, and 804 00:38:50,040 --> 00:38:53,120 Speaker 6: now, you know, with just the database, 805 00:38:53,160 --> 00:38:55,120 Speaker 6: you can do that. And then, you know, the other 806 00:38:55,160 --> 00:38:57,040 Speaker 6: piece I'd say here is this is a kind of 807 00:38:57,160 --> 00:38:59,239 Speaker 6: confluence of circumstances, where it's not 808 00:38:59,360 --> 00:39:03,880 Speaker 6: just the coding, it's also the fact that you have 809 00:39:04,080 --> 00:39:06,200 Speaker 6: AI to do a whole bunch of work. So, you know, 810 00:39:06,200 --> 00:39:08,719 Speaker 6: if we pick on CRM for a second, right, like, 811 00:39:09,120 --> 00:39:13,960 Speaker 6: you know, Salesforce dot com. 812 00:39:14,239 --> 00:39:16,400 Speaker 6: You look at what the interface of that is.
813 00:39:16,440 --> 00:39:20,120 Speaker 6: And essentially it has existed to get salespeople to take 814 00:39:20,239 --> 00:39:22,839 Speaker 6: unstructured data, which is sales meetings, and turn it into 815 00:39:22,880 --> 00:39:25,839 Speaker 6: structured data that can be stored in a database. And 816 00:39:25,920 --> 00:39:29,960 Speaker 6: now you have AI, and AI is very capable of 817 00:39:30,040 --> 00:39:33,680 Speaker 6: taking unstructured data directly from the source, so you have 818 00:39:33,719 --> 00:39:37,160 Speaker 6: people recording meetings, and then it can structure it into 819 00:39:37,200 --> 00:39:39,400 Speaker 6: any data that you want. One of the 820 00:39:39,480 --> 00:39:42,320 Speaker 6: very first sort of mind-blowing moments I had was 821 00:39:42,360 --> 00:39:46,399 Speaker 6: that I could give it a JSON interface. I could 822 00:39:46,400 --> 00:39:49,480 Speaker 6: describe exactly what I wanted the data structure to be, 823 00:39:49,920 --> 00:39:52,480 Speaker 6: and it would give me back that information in that 824 00:39:52,560 --> 00:39:55,080 Speaker 6: data structure. And we've just basically been having a bunch 825 00:39:55,080 --> 00:39:56,799 Speaker 6: of humans do that work for a very long time, 826 00:39:56,840 --> 00:39:58,880 Speaker 6: whether it's in CRM or project management or any of 827 00:39:58,880 --> 00:40:00,800 Speaker 6: these other places. And the ability to just kind of 828 00:40:00,840 --> 00:40:02,640 Speaker 6: get rid of that whole thing, I think it really 829 00:40:02,640 --> 00:40:05,480 Speaker 6: does bring into question the value of a lot of 830 00:40:05,480 --> 00:40:06,400 Speaker 6: these software companies. 831 00:40:06,520 --> 00:40:08,840 Speaker 2: Well, so we have seen, like, a lot of software stocks, 832 00:40:08,880 --> 00:40:11,960 Speaker 2: they look like melting ice cubes right now. Maybe they are. 833 00:40:12,440 --> 00:40:14,799 Speaker 2: So what I want to talk about, I mean, 834 00:40:14,840 --> 00:40:17,000 Speaker 2: this is, like, you know, for our listeners who are investors, 835 00:40:17,080 --> 00:40:19,800 Speaker 2: there's a pretty high-stakes question of, like, what residual 836 00:40:19,960 --> 00:40:22,960 Speaker 2: value there is. But talk a little bit more about Salesforce. 837 00:40:23,000 --> 00:40:24,480 Speaker 2: Maybe now would be a time to learn what Salesforce, 838 00:40:24,840 --> 00:40:28,040 Speaker 2: what it actually does, as it's massively being disrupted. Now 839 00:40:28,040 --> 00:40:30,560 Speaker 2: we get around to learning what Salesforce is. But I 840 00:40:30,600 --> 00:40:32,560 Speaker 2: know it's, like many things, there are apps that people 841 00:40:32,560 --> 00:40:35,279 Speaker 2: built onto Salesforce. But this sounds like we're hitting on 842 00:40:35,360 --> 00:40:37,160 Speaker 2: what I think is probably one of the crucial questions for, 843 00:40:37,239 --> 00:40:39,600 Speaker 2: like, the future of the software industry. So talk a 844 00:40:39,600 --> 00:40:42,120 Speaker 2: little bit more about, like, the current approach and what 845 00:40:42,160 --> 00:40:45,879 Speaker 2: people are buying when they buy a package or subscribe 846 00:40:45,960 --> 00:40:49,000 Speaker 2: to a service from Salesforce, and then what the unlocked 847 00:40:49,000 --> 00:40:53,319 Speaker 2: opportunity is from having AI, like, live in the same 848 00:40:53,320 --> 00:40:54,520 Speaker 2: world as all your files.
849 00:40:55,040 --> 00:40:58,759 Speaker 6: Yeah, so I think if we take CRM as the 850 00:40:58,800 --> 00:41:01,640 Speaker 6: general category, you know, the biggest players there are. 851 00:41:01,640 --> 00:41:02,719 Speaker 2: That's customer relationship. 852 00:41:02,719 --> 00:41:05,920 Speaker 6: Customer relationship management. That's, like, what, you know, Salesforce does it, 853 00:41:05,960 --> 00:41:08,600 Speaker 6: SAP does it, HubSpot does it for the mid-market. 854 00:41:10,160 --> 00:41:12,040 Speaker 6: You know, when I think about that product and I 855 00:41:12,040 --> 00:41:15,600 Speaker 6: think about the way we've used it inside enterprise sales organizations, essentially, 856 00:41:15,680 --> 00:41:18,280 Speaker 6: you know, it's a database of companies, it's a database 857 00:41:18,280 --> 00:41:20,360 Speaker 6: of contacts, it's a database of deals you have in 858 00:41:20,400 --> 00:41:22,799 Speaker 6: the pipeline, and it's a way to track all those 859 00:41:22,840 --> 00:41:25,200 Speaker 6: deals. You guys hit on something before that I think 860 00:41:25,239 --> 00:41:30,680 Speaker 6: is really it, which is, like, inside companies there is 861 00:41:30,719 --> 00:41:33,400 Speaker 6: a huge group of people who exist to answer 862 00:41:33,440 --> 00:41:36,000 Speaker 6: the question from management of what is the status of 863 00:41:36,080 --> 00:41:38,720 Speaker 6: something, right? And, you know, that can be sales management, 864 00:41:38,760 --> 00:41:40,360 Speaker 6: it can be product management, it doesn't matter, right, it 865 00:41:40,400 --> 00:41:43,520 Speaker 6: could be within a newsroom. Somebody wants to know what 866 00:41:43,560 --> 00:41:45,879 Speaker 6: the status is, and somebody else exists to go figure 867 00:41:45,920 --> 00:41:49,200 Speaker 6: out what the answer to that question is. And so, fundamentally, 868 00:41:49,719 --> 00:41:52,880 Speaker 6: I think those CRM tools are bought first and foremost 869 00:41:53,120 --> 00:41:55,760 Speaker 6: to answer, what is the status, right? What does my pipeline 870 00:41:55,760 --> 00:41:58,040 Speaker 6: look like? And to answer what your pipeline looks like, 871 00:41:58,040 --> 00:42:01,319 Speaker 6: you need a bunch of salespeople putting deals in, and 872 00:42:01,360 --> 00:42:03,520 Speaker 6: those deals are associated with contacts and companies, and they 873 00:42:03,520 --> 00:42:07,240 Speaker 6: say when that deal is going to close. And essentially 874 00:42:07,239 --> 00:42:09,880 Speaker 6: you were asking the salespeople to make the updates 875 00:42:09,880 --> 00:42:14,560 Speaker 6: in the system to do that. And just very tactically, 876 00:42:14,680 --> 00:42:17,319 Speaker 6: I mean, you know, I run a company now. We 877 00:42:17,440 --> 00:42:18,880 Speaker 6: talk to a lot of, we have a lot of 878 00:42:18,880 --> 00:42:23,400 Speaker 6: sales calls. We record those calls and they get transcribed, 879 00:42:23,440 --> 00:42:26,440 Speaker 6: and the AI then looks through them and makes decisions 880 00:42:26,480 --> 00:42:30,440 Speaker 6: about where this deal should be in the process. And 881 00:42:31,400 --> 00:42:34,080 Speaker 6: it's much better than having somebody try to go update it, 882 00:42:34,120 --> 00:42:36,600 Speaker 6: because those people never updated it anyway.
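Here is a minimal sketch of the pattern Noah is describing, feeding an unstructured call transcript to a model and getting back the structured record a CRM wants ("give it a JSON interface"), assuming the Anthropic Python SDK; the field names, pipeline stages, and model ID are illustrative assumptions, not anything from his actual system.

```python
# Minimal sketch: turn a raw sales-call transcript into a structured deal
# record by describing the exact JSON shape we want back. Assumes the
# `anthropic` SDK; schema, stages, and model ID are illustrative placeholders.
import json
from anthropic import Anthropic

client = Anthropic()

SCHEMA_HINT = (
    "Respond with only a JSON object with exactly these keys: "
    '"company" (string), "contacts" (list of strings), '
    '"stage" (one of "prospecting", "demo", "negotiation", "closed_won", "closed_lost"), '
    '"expected_close" (ISO date string or null), "summary" (one sentence).'
)

def transcript_to_deal_record(transcript: str) -> dict:
    """Extract a CRM-style record from an unstructured call transcript."""
    message = client.messages.create(
        model="claude-opus-4-5",  # placeholder model ID
        max_tokens=800,
        system="You extract CRM deal records from sales-call transcripts. " + SCHEMA_HINT,
        messages=[{"role": "user", "content": transcript}],
    )
    return json.loads(message.content[0].text)

# Hypothetical usage (file name is a placeholder):
# record = transcript_to_deal_record(open("call_transcript.txt").read())
# print(record["stage"], record["expected_close"])
```

The point is the one made above about the JSON interface: the schema in the prompt does the work a salesperson used to do by filling in form fields by hand.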
The secret of all 883 00:42:36,640 --> 00:42:39,239 Speaker 6: of this enterprise software is that nobody was using it 884 00:42:39,239 --> 00:42:42,520 Speaker 6: the way that anybody wanted to anyway. And so, you know, 885 00:42:43,120 --> 00:42:46,680 Speaker 6: I think that that is sort of, you know, a 886 00:42:46,680 --> 00:42:48,359 Speaker 6: lot of what's happening there. Again, some 887 00:42:48,400 --> 00:42:50,640 Speaker 6: of it's the coding, some of it's just the core capabilities. 888 00:42:50,680 --> 00:42:52,759 Speaker 6: And then, you know, you still need databases, right, so 889 00:42:52,800 --> 00:42:54,320 Speaker 6: it's like, you know, you look at what Databricks 890 00:42:54,320 --> 00:42:56,400 Speaker 6: and companies like that do. You know, I think those folks are 891 00:42:56,440 --> 00:42:58,840 Speaker 6: still sort of genuinely sitting in a pretty good place, 892 00:42:58,880 --> 00:43:01,919 Speaker 6: where, you know, all software has to sit 893 00:43:02,320 --> 00:43:05,000 Speaker 6: on top of some database that you can sort of 894 00:43:05,040 --> 00:43:08,080 Speaker 6: read and write to. But, you know, I think some 895 00:43:08,120 --> 00:43:13,239 Speaker 6: of those categories were specifically focused on kind of 896 00:43:13,320 --> 00:43:16,799 Speaker 6: like human input. Now, of course, you know, Salesforce has 897 00:43:16,800 --> 00:43:18,880 Speaker 6: a whole AI thing and they're saying, hey, you shouldn't 898 00:43:18,880 --> 00:43:22,320 Speaker 6: have humans inputting in Salesforce. And, you know, sales is 899 00:43:22,400 --> 00:43:24,799 Speaker 6: just one small piece. They have a whole customer support thing, 900 00:43:25,080 --> 00:43:28,680 Speaker 6: which obviously also has an interesting implication, where, you know, 901 00:43:28,680 --> 00:43:31,160 Speaker 6: you're doing support with AI agents, and so some of 902 00:43:31,200 --> 00:43:32,880 Speaker 6: it comes back to seats. I mean, you know, it 903 00:43:32,920 --> 00:43:36,000 Speaker 6: gets to be fairly complicated. But I do think 904 00:43:36,000 --> 00:43:40,280 Speaker 6: the fundamental underlying thing is, anybody who buys software 905 00:43:41,080 --> 00:43:44,920 Speaker 6: that is, you know, uh, SaaS, you're always buying for 906 00:43:44,960 --> 00:43:48,480 Speaker 6: a subset of the functionality. Nobody is using one 907 00:43:48,520 --> 00:43:50,839 Speaker 6: hundred percent of the functionality of SaaS, and so there's 908 00:43:50,840 --> 00:43:52,880 Speaker 6: always a trade-off that's happening there, where, you know, 909 00:43:53,000 --> 00:43:55,399 Speaker 6: you're spending more money than you need to because you're 910 00:43:55,440 --> 00:43:58,040 Speaker 6: not using all of these pieces. And so, you know, 911 00:43:58,160 --> 00:44:00,600 Speaker 6: if you can more narrowly focus that, that is where 912 00:44:00,680 --> 00:44:02,560 Speaker 6: you could say, hey, we could solve this kind of 913 00:44:02,560 --> 00:44:04,400 Speaker 6: more narrow problem. And not only can we solve it 914 00:44:04,400 --> 00:44:07,320 Speaker 6: more narrowly, we can solve it way more effectively, because, 915 00:44:07,640 --> 00:44:10,080 Speaker 6: you know, the trick with AI is that the more 916 00:44:10,520 --> 00:44:13,520 Speaker 6: specific you are with it, the better the output is.
Right, 917 00:44:13,560 --> 00:44:15,879 Speaker 6: So it's like, you know, outside of coding, 918 00:44:15,920 --> 00:44:18,480 Speaker 6: if you just ask ChatGPT to write you a story, 919 00:44:18,600 --> 00:44:21,680 Speaker 6: it's going to write you a very, very median story, right, 920 00:44:22,200 --> 00:44:25,600 Speaker 6: sort of exactly the median. But if you work with 921 00:44:25,640 --> 00:44:28,439 Speaker 6: it, then, you know, 922 00:44:28,560 --> 00:44:31,640 Speaker 6: the more of your own expertise you imbue it with, the 923 00:44:32,280 --> 00:44:34,319 Speaker 6: further above the median it's going to be. 924 00:44:34,360 --> 00:44:35,840 Speaker 6: And, you know, of course, that 925 00:44:35,920 --> 00:44:39,399 Speaker 6: also means the line between what's 926 00:44:39,440 --> 00:44:41,640 Speaker 6: AI and what's not AI is going to continue to 927 00:44:41,640 --> 00:44:43,759 Speaker 6: get blurrier. 928 00:44:43,719 --> 00:44:45,760 Speaker 3: Joe, how much does Claude Code actually cost? 929 00:44:46,280 --> 00:44:46,640 Speaker 4: Do you know? 930 00:44:47,280 --> 00:44:47,640 Speaker 5: Well? 931 00:44:48,600 --> 00:44:51,920 Speaker 2: I paid for the two hundred dollars a month version. 932 00:44:52,360 --> 00:44:56,080 Speaker 2: But, like, high roller. Yeah, I know. But, uh, you know, 933 00:44:56,160 --> 00:44:57,640 Speaker 2: I think you can get it with the Pro 934 00:44:57,760 --> 00:44:59,920 Speaker 2: version or whatever, the sub version of that, 935 00:45:00,040 --> 00:45:02,200 Speaker 2: at twenty dollars. But I hit a limit fairly quickly, 936 00:45:02,280 --> 00:45:03,719 Speaker 2: and I was like, I didn't have my website up 937 00:45:03,800 --> 00:45:06,600 Speaker 2: yet, so, like, then I 938 00:45:06,640 --> 00:45:09,520 Speaker 2: paid five dollars for the extra compute, and I was like, 939 00:45:09,560 --> 00:45:10,160 Speaker 2: this is dumb. 940 00:45:10,239 --> 00:45:13,640 Speaker 3: I think, yeah, okay, so we're going out to. 941 00:45:13,800 --> 00:45:16,560 Speaker 2: Two nice dinners a month. That's not, you know, when 942 00:45:16,560 --> 00:45:18,080 Speaker 2: I think about it that way, it doesn't seem that big. 943 00:45:17,920 --> 00:45:19,680 Speaker 3: Of a deal. It's worth it to you. Yeah, okay, 944 00:45:19,719 --> 00:45:21,600 Speaker 3: so I think we can all agree this is like 945 00:45:21,680 --> 00:45:26,319 Speaker 3: a valuable service that Claude Code is providing. But we 946 00:45:26,400 --> 00:45:29,080 Speaker 3: touched on this in the intro. It seems like the 947 00:45:29,160 --> 00:45:33,320 Speaker 3: models just keep replicating themselves really, really quickly. So anything 948 00:45:33,320 --> 00:45:36,520 Speaker 3: that Claude Code can do, I would expect another model 949 00:45:36,560 --> 00:45:39,120 Speaker 3: will come in in like a month, maybe less, and 950 00:45:39,160 --> 00:45:41,880 Speaker 3: do the exact same thing. What does that mean for 951 00:45:41,960 --> 00:45:46,279 Speaker 3: the actual, like, valuations of these companies and the models? Like, 952 00:45:46,360 --> 00:45:48,360 Speaker 3: how are they going to monetize it when it seems 953 00:45:48,440 --> 00:45:53,239 Speaker 3: so difficult to actually differentiate yourself, especially for, like, a 954 00:45:53,280 --> 00:45:55,719 Speaker 3: substantial period of time?
955 00:45:55,800 --> 00:45:58,279 Speaker 6: Yeah, well, so again here I think we have to 956 00:45:58,440 --> 00:46:02,480 Speaker 6: distinguish between Claude Code and the Claude model. So in 957 00:46:02,760 --> 00:46:06,040 Speaker 6: Claude Code's case, if you're using, you know, the latest version, 958 00:46:06,040 --> 00:46:08,680 Speaker 6: you're using Opus four point five, which is the model. 959 00:46:08,760 --> 00:46:11,640 Speaker 6: Opus four point five has a price of something in 960 00:46:11,680 --> 00:46:14,480 Speaker 6: the dollar-fifty to two dollars range for a million input 961 00:46:14,560 --> 00:46:16,880 Speaker 6: tokens, and whatever it is on the output, which is 962 00:46:16,920 --> 00:46:19,640 Speaker 6: like roughly the going rate for cutting-edge models. Gemini 963 00:46:19,800 --> 00:46:25,160 Speaker 6: three Pro is the same price, OpenAI's ChatGPT five 964 00:46:25,200 --> 00:46:27,560 Speaker 6: point two is. They're all the same price. So the 965 00:46:27,560 --> 00:46:29,759 Speaker 6: first thing is you have to differentiate between those. 966 00:46:30,760 --> 00:46:32,960 Speaker 6: And so I think a big part of what Anthropic 967 00:46:33,040 --> 00:46:34,880 Speaker 6: is trying to do is they're trying to lock people 968 00:46:34,880 --> 00:46:39,200 Speaker 6: into Claude Code. In fact, there was just some controversy amongst 969 00:46:39,480 --> 00:46:43,560 Speaker 6: some nerds where OpenCode, which is a competitor to 970 00:46:43,640 --> 00:46:47,879 Speaker 6: Claude Code, used to let you use your Claude Max 971 00:46:47,880 --> 00:46:50,120 Speaker 6: two-hundred-dollar plan. So the trick with the Claude Max 972 00:46:50,160 --> 00:46:53,320 Speaker 6: plan is, if you were just buying that number of tokens, 973 00:46:53,320 --> 00:46:56,000 Speaker 6: it would cost you significantly more than two hundred dollars. 974 00:46:56,040 --> 00:47:00,400 Speaker 6: It is a super, super discounted plan. So, like, you 975 00:47:00,400 --> 00:47:03,560 Speaker 6: probably have the access, I have the 976 00:47:03,600 --> 00:47:06,239 Speaker 6: access to use, I would guess, a thousand or 977 00:47:06,239 --> 00:47:10,640 Speaker 6: two thousand dollars of tokens for my two hundred dollars 978 00:47:10,640 --> 00:47:13,359 Speaker 6: a month. So it's a very, very heavily subsidized plan. 979 00:47:13,640 --> 00:47:15,840 Speaker 6: And OpenCode, which is an open-source version of 980 00:47:16,000 --> 00:47:19,920 Speaker 6: Claude Code, a sort of competitor, they had found a 981 00:47:19,920 --> 00:47:22,600 Speaker 6: way that they would let you use your Claude Max 982 00:47:22,640 --> 00:47:27,680 Speaker 6: plan with OpenCode, and Anthropic last week shut that down, 983 00:47:28,360 --> 00:47:31,520 Speaker 6: and some OpenCode people got very upset, because they said, 984 00:47:31,560 --> 00:47:34,160 Speaker 6: like, this is not what you're supposed to do, or, 985 00:47:34,520 --> 00:47:37,400 Speaker 6: I'm not sure exactly what they said. I never felt 986 00:47:37,440 --> 00:47:39,400 Speaker 6: like I got a particularly good argument out of it.
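To make the subsidy Noah is describing concrete, here is a back-of-the-envelope sketch using only the rough figures from this conversation (roughly a dollar fifty per million input tokens, a two-hundred-dollar-a-month Max plan, and his guess of one to two thousand dollars' worth of usage); these are the episode's ballpark numbers, not published pricing.

```python
# Back-of-the-envelope sketch using the rough numbers cited in the conversation,
# not official pricing.
price_per_million_input_tokens = 1.50    # dollars, low end of the range mentioned
max_plan_price = 200.00                  # dollars per month
estimated_monthly_usage_value = 1500.00  # midpoint of the $1,000-$2,000 guess

# How many input tokens $200 would buy at the quoted per-token rate.
tokens_at_list_price = max_plan_price / price_per_million_input_tokens  # millions of tokens

# How heavily the flat plan is subsidized relative to metered usage.
subsidy_multiple = estimated_monthly_usage_value / max_plan_price

print(f"${max_plan_price:.0f} covers about {tokens_at_list_price:.0f} million input tokens at list price")
print(f"Estimated usage is roughly {subsidy_multiple:.1f}x the plan's monthly price")
```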
987 00:47:40,280 --> 00:47:42,600 Speaker 6: But, you know, I do think part of what they're 988 00:47:42,600 --> 00:47:47,040 Speaker 6: trying to get at is that, you know, at 989 00:47:47,040 --> 00:47:49,839 Speaker 6: the very top, models like these are all amazing. Like, 990 00:47:50,040 --> 00:47:56,040 Speaker 6: Google, OpenAI, and Anthropic, their best models are all 991 00:47:56,080 --> 00:47:58,319 Speaker 6: on par with each other. I mean, I would move 992 00:47:58,320 --> 00:48:00,920 Speaker 6: them around a little bit. I still think Opus four 993 00:48:01,000 --> 00:48:03,120 Speaker 6: point five is the best model out there, but, you know, 994 00:48:03,160 --> 00:48:08,439 Speaker 6: I mean, that might change tomorrow. And that's where 995 00:48:08,480 --> 00:48:10,560 Speaker 6: something like Claude Code is really interesting, because it's a 996 00:48:11,200 --> 00:48:14,080 Speaker 6: product that is, it's just theirs. It's a piece 997 00:48:14,080 --> 00:48:17,120 Speaker 6: of software. It's not an AI model, and so it's 998 00:48:17,160 --> 00:48:21,000 Speaker 6: sort of less able to be disrupted. Now, again, 999 00:48:21,120 --> 00:48:24,080 Speaker 6: I think if somebody else wanted to copy that exactly, 1000 00:48:24,800 --> 00:48:28,319 Speaker 6: they could. Codex has one, Gemini has one. I just 1001 00:48:28,320 --> 00:48:30,439 Speaker 6: think they take a very different tack with it, where 1002 00:48:30,440 --> 00:48:32,360 Speaker 6: it's much less, and so, you know, I think what 1003 00:48:32,360 --> 00:48:34,120 Speaker 6: they're trying to do is get developers like me to 1004 00:48:34,160 --> 00:48:36,440 Speaker 6: feel very comfortable inside that, so that when I 1005 00:48:36,560 --> 00:48:39,640 Speaker 6: open Codex or try Gemini, or, I 1006 00:48:39,719 --> 00:48:42,840 Speaker 6: was playing with OpenCode the other day, it 1007 00:48:42,960 --> 00:48:44,960 Speaker 6: just doesn't feel familiar, in the same way that, you know, 1008 00:48:45,000 --> 00:48:47,120 Speaker 6: if you're trying to move somebody from a PC to 1009 00:48:47,160 --> 00:48:48,480 Speaker 6: a Mac, it doesn't feel familiar. Right. 1010 00:48:48,480 --> 00:48:50,680 Speaker 3: They want to own, like, the ecosystem. 1011 00:48:50,080 --> 00:48:54,160 Speaker 2: The environment, that environment, whatever the word is. Noah, thank 1012 00:48:54,200 --> 00:48:56,200 Speaker 2: you so much for coming on Odd Lots. I was, 1013 00:48:56,239 --> 00:48:58,560 Speaker 2: like, dying to do an episode about this topic. 1014 00:48:58,600 --> 00:48:59,359 Speaker 5: Thanks for having me. 1015 00:49:00,080 --> 00:49:03,160 Speaker 2: Wait, I don't have AI psychosis. I have a Claude complex. 1016 00:49:03,600 --> 00:49:05,239 Speaker 3: Why is everyone making that joke? 1017 00:49:05,360 --> 00:49:06,439 Speaker 2: Wait, which joke? 1018 00:49:07,000 --> 00:49:08,080 Speaker 3: The psychosis joke? 1019 00:49:08,600 --> 00:49:10,160 Speaker 2: I thought you were going to be proud of me 1020 00:49:10,200 --> 00:49:11,480 Speaker 2: for saying Claude complex. 1021 00:49:11,760 --> 00:49:13,040 Speaker 3: Oh, oh, that is very good. 1022 00:49:13,680 --> 00:49:16,400 Speaker 2: I do one pun, finally, for Tracy, and why was 1023 00:49:16,400 --> 00:49:17,239 Speaker 2: everyone making that joke? 1024 00:49:17,320 --> 00:49:18,640 Speaker 5: Well, it was a joke. 1025 00:49:18,840 --> 00:49:20,759 Speaker 2: I was handing it to you.
I finally make a pun, 1026 00:49:20,880 --> 00:49:21,879 Speaker 2: and you just jump right over it. 1027 00:49:22,160 --> 00:49:25,520 Speaker 3: Everyone keeps saying that Claude Code is AI psychosis for 1028 00:49:25,600 --> 00:49:28,960 Speaker 3: smart people, right? Like, how did that become a thing? Yeah? 1029 00:49:29,080 --> 00:49:30,439 Speaker 2: All right, but that's a good pun. 1030 00:49:30,560 --> 00:49:33,160 Speaker 3: Also very bro-coded, I find. You think so? All 1031 00:49:33,160 --> 00:49:34,400 Speaker 3: of AI is bro-coded. 1032 00:49:34,880 --> 00:49:37,279 Speaker 2: Uh, this is true. We should talk more about this, 1033 00:49:37,320 --> 00:49:39,120 Speaker 2: you know, we should have David Shor on, he's been doing 1034 00:49:39,120 --> 00:49:41,319 Speaker 2: a lot of polling about various demographics and how they 1035 00:49:41,320 --> 00:49:45,319 Speaker 2: feel about AI. We should. That'd be some interesting data. Yeah, 1036 00:49:45,320 --> 00:49:47,719 Speaker 2: we should do that. Anyway. Noah, thank you so much 1037 00:49:47,719 --> 00:49:48,480 Speaker 2: for coming on Odd Lots. 1038 00:49:48,520 --> 00:49:49,319 Speaker 5: Thanks for having me. 1039 00:50:01,480 --> 00:50:03,960 Speaker 2: Well, that was fun, Tracy. I really, like, it's 1040 00:50:04,200 --> 00:50:07,399 Speaker 2: obvious to anyone who's been within five minutes, five feet 1041 00:50:07,440 --> 00:50:09,320 Speaker 2: of me for the last two weeks. I'm, like, totally 1042 00:50:09,400 --> 00:50:12,720 Speaker 2: addicted and gone down, I know, gone down the rabbit 1043 00:50:12,760 --> 00:50:16,840 Speaker 2: hole and stuff. But, like, for the first time, unironically, 1044 00:50:17,000 --> 00:50:21,160 Speaker 2: I'm like, okay, this is transformative technology, beyond being very 1045 00:50:21,160 --> 00:50:22,120 Speaker 2: impressive technology. 1046 00:50:22,200 --> 00:50:26,000 Speaker 3: Right. So I've been coming to a conclusion, which is 1047 00:50:26,040 --> 00:50:31,040 Speaker 3: that, you know, AI can be both underhyped and 1048 00:50:31,160 --> 00:50:34,920 Speaker 3: overvalued simultaneously. And I feel like that's kind of 1049 00:50:34,920 --> 00:50:36,480 Speaker 3: where we are at the moment. 1050 00:50:36,239 --> 00:50:37,560 Speaker 2: Wait, are you making your stock call? 1051 00:50:37,840 --> 00:50:42,040 Speaker 3: Yeah, no, but seriously, like, it's a big deal. 1052 00:50:42,360 --> 00:50:45,400 Speaker 3: It's going to change the way we work. But is 1053 00:50:45,400 --> 00:50:48,839 Speaker 3: it monetizable? Can you differentiate the actual models? The better 1054 00:50:48,920 --> 00:50:52,440 Speaker 3: the technology gets, like, the easier it is to just 1055 00:50:52,480 --> 00:50:54,960 Speaker 3: do what everyone else is doing. And also, like, the 1056 00:50:55,040 --> 00:50:57,520 Speaker 3: compute gets cheaper and cheaper. So I just don't know 1057 00:50:57,560 --> 00:50:58,879 Speaker 3: how you monetize this.
1058 00:50:59,040 --> 00:51:01,799 Speaker 2: Well, so his point is very interesting, which is that 1059 00:51:01,840 --> 00:51:05,560 Speaker 2: the tokens are heavily subsidized still, and so 1060 00:51:05,560 --> 00:51:08,080 Speaker 2: if you're paying for that two-hundred-dollar 1061 00:51:08,080 --> 00:51:10,880 Speaker 2: Max program and you actually use it to the limit, 1062 00:51:11,400 --> 00:51:14,640 Speaker 2: Claude is going to lose money on this, and then 1063 00:51:14,680 --> 00:51:18,040 Speaker 2: the prices keep dropping. And I know, like, Claude Code 1064 00:51:18,200 --> 00:51:21,400 Speaker 2: is, okay, they're attempting to create something that resembles a 1065 00:51:21,440 --> 00:51:24,400 Speaker 2: traditional software ecosystem, where you feel as a user that 1066 00:51:24,440 --> 00:51:28,719 Speaker 2: you're locked into it. But so far, in my experience, like, 1067 00:51:29,000 --> 00:51:32,200 Speaker 2: since November twenty twenty-two, when I started playing with AI, 1068 00:51:32,680 --> 00:51:35,920 Speaker 2: it hasn't felt like anyone has established lock-in with anything. 1069 00:51:36,160 --> 00:51:41,120 Speaker 2: And it's very movable, and I suspect, even 1070 00:51:41,160 --> 00:51:43,319 Speaker 2: though I have this project now on my desktop that 1071 00:51:43,400 --> 00:51:47,400 Speaker 2: has a file called CLAUDE dot md that gives instructions, et cetera, 1072 00:51:47,800 --> 00:51:50,320 Speaker 2: I'm certain that if I opened this with Codex 1073 00:51:50,400 --> 00:51:52,800 Speaker 2: or Google's, it could probably just pick it up the same. 1074 00:51:52,920 --> 00:51:55,640 Speaker 3: Yeah. I also think there's a fundamental issue with the 1075 00:51:55,640 --> 00:51:58,840 Speaker 3: lock-in strategy, because when you're talking about technology on 1076 00:51:58,880 --> 00:52:02,719 Speaker 3: the Internet, it just feels very against the grain to 1077 00:52:02,920 --> 00:52:06,600 Speaker 3: try to lock people into anything, and we've seen various 1078 00:52:06,600 --> 00:52:10,319 Speaker 3: projects over the years, and it's a lot harder than 1079 00:52:10,320 --> 00:52:10,840 Speaker 3: it looks. 1080 00:52:11,840 --> 00:52:13,840 Speaker 2: Yeah, I mean, I guess I would say it's a 1081 00:52:13,840 --> 00:52:15,600 Speaker 2: lot harder than it looks. But then we also know 1082 00:52:15,640 --> 00:52:17,680 Speaker 2: the flip side, which is that tons of people are 1083 00:52:17,719 --> 00:52:19,920 Speaker 2: locked into software that they hate, right? Yeah, people are, 1084 00:52:20,200 --> 00:52:22,120 Speaker 2: oh, I hate, how many times have you heard, oh, 1085 00:52:22,320 --> 00:52:26,080 Speaker 2: I hate Outlook, right? Or I hate Microsoft Teams, and 1086 00:52:26,160 --> 00:52:28,040 Speaker 2: I hate this, and I spend money on it every 1087 00:52:28,040 --> 00:52:30,239 Speaker 2: month, and my organization can't move off of it, or 1088 00:52:30,239 --> 00:52:32,200 Speaker 2: we can't migrate off of it. So I do think 1089 00:52:32,239 --> 00:52:35,160 Speaker 2: that cuts both ways.
I do think he offered the 1090 00:52:35,200 --> 00:52:40,520 Speaker 2: best explanation I've heard of why the AI coding models 1091 00:52:40,960 --> 00:52:43,840 Speaker 2: are a threat to a lot of pretty big software businesses, 1092 00:52:44,160 --> 00:52:47,680 Speaker 2: especially the point about how the user never uses 1093 00:52:47,760 --> 00:52:50,680 Speaker 2: all of the features that the software 1094 00:52:50,680 --> 00:52:53,759 Speaker 2: got built for, and therefore maybe the build-versus-buy 1095 00:52:53,920 --> 00:52:57,560 Speaker 2: calculation really starts to shift when they can just design 1096 00:52:57,600 --> 00:52:58,920 Speaker 2: that one feature very quickly. 1097 00:52:59,560 --> 00:53:02,200 Speaker 3: I totally agree on the software side, it seems like 1098 00:53:02,239 --> 00:53:05,720 Speaker 3: an existential threat. But just, like, the locked-in ecosystem 1099 00:53:05,960 --> 00:53:09,480 Speaker 3: of a particular model, I know he said it's not 1100 00:53:09,520 --> 00:53:12,360 Speaker 3: actually a model, but that seems like a bigger issue 1101 00:53:12,360 --> 00:53:14,120 Speaker 3: to me. I don't know. I guess we'll see. 1102 00:53:14,280 --> 00:53:15,799 Speaker 2: We're going to see, and I don't know, I kind 1103 00:53:15,840 --> 00:53:16,880 Speaker 2: of think we're going to see quickly. 1104 00:53:17,200 --> 00:53:20,680 Speaker 3: Yeah, again, that's the only certainty, is, like, stuff 1105 00:53:20,719 --> 00:53:21,160 Speaker 3: is happening. 1106 00:53:21,360 --> 00:53:22,120 Speaker 2: What is happening now? 1107 00:53:22,239 --> 00:53:22,439 Speaker 4: Yeah? 1108 00:53:22,520 --> 00:53:23,520 Speaker 3: Okay, shall we leave it there? 1109 00:53:23,600 --> 00:53:24,359 Speaker 2: Let's leave it there. 1110 00:53:24,480 --> 00:53:26,760 Speaker 3: This has been another episode of the Odd Lots podcast. 1111 00:53:26,840 --> 00:53:30,040 Speaker 3: I'm Tracy Alloway. You can follow me at Tracy Alloway. 1112 00:53:29,680 --> 00:53:31,799 Speaker 2: And I'm Joe Weisenthal. You can follow me at 1113 00:53:31,840 --> 00:53:34,719 Speaker 2: The Stalwart. Follow our guest Noah Brier. He's at Hey 1114 00:53:34,760 --> 00:53:38,600 Speaker 2: It's Noah. Follow our producers Carmen Rodriguez at Carman Arman, Dashiell 1115 00:53:38,600 --> 00:53:41,400 Speaker 2: Bennett at Dashbot, and Kale Brooks at Kale Brooks. And for 1116 00:53:41,440 --> 00:53:43,960 Speaker 2: more Odd Lots content, go to Bloomberg dot com slash 1117 00:53:43,960 --> 00:53:46,640 Speaker 2: odd lots, where we have a daily newsletter and all of our episodes, 1118 00:53:46,800 --> 00:53:48,680 Speaker 2: and you can chat about all of these topics twenty 1119 00:53:48,719 --> 00:53:51,800 Speaker 2: four seven in our Discord, Discord dot gg slash 1120 00:53:51,800 --> 00:53:52,280 Speaker 2: odd lots. 1121 00:53:52,520 --> 00:53:54,560 Speaker 3: And if you enjoy Odd Lots, if you like it 1122 00:53:54,600 --> 00:53:57,719 Speaker 3: when we talk about advances in AI, then please leave 1123 00:53:57,800 --> 00:54:01,400 Speaker 3: us a positive review on your favorite podcast platform. And remember, 1124 00:54:01,440 --> 00:54:03,960 Speaker 3: if you are a Bloomberg subscriber, you can listen to 1125 00:54:04,000 --> 00:54:07,080 Speaker 3: all of our episodes absolutely ad-free.
All you need 1126 00:54:07,120 --> 00:54:09,800 Speaker 3: to do is find the Bloomberg channel on Apple Podcasts 1127 00:54:09,880 --> 00:54:28,400 Speaker 3: and follow the instructions there. Thanks for listening.