Speaker 1: Bloomberg Audio Studios, Podcasts, Radio News.

Speaker 2: Hello and welcome to another episode of the Odd Lots podcast. I'm Joe Weisenthal.

Speaker 3: And I'm Tracy Alloway.

Speaker 2: Tracy, I've always had this idea for the podcast, or a thing that I've wanted to do, conceptually, with podcasts, which is to schedule every guest for two interviews. So you have the opening interview and you ask a bunch of questions, and then it's, oh God, I really wish I had followed up on that, I had more, I was just starting to sort of get my head around this thing. Now I could have asked the good questions. And then, like, have the person come back next week. Also, the audience complains, I wish you'd asked that, and then fill in all those gaps that had been inspired by the previous conversation.

Speaker 3: I don't think it's a bad idea. I think it would double the number of episodes that we put out. But sure, there are topics that come up, usually things that were just kind of new to us and we're trying to learn about, specifically technical things, and one of those has to be AI.

Speaker 2: Right, AI. And also, you know, I really had a great time. I guess last month we were in Chicago. We talked to a bunch of different people, it was a trading-related trip. We interviewed Don Wilson, we interviewed the head of the CME, we had some other chats, all about the world of trading. When it comes to trading, it's like, you know, we talked to long-term investors, portfolio managers and so on. We talked to some people in the hedge fund space who maybe have a holding period of several weeks or whatever.
Speaker 2: I actually really want to learn more about the trading... like, these people who have a holding time of one second or something like that, because that's where a lot of the tech and a lot of the actual action is. And how that world makes money and how they actually deploy technology is very interesting, but still something I don't have my handle on.

Speaker 3: Well, the practical application, right? And also the culture of AI on Wall Street. I find that really interesting, because I remember, I guess it was more than a decade ago, but remember Lloyd Blankfein saying that Goldman Sachs is a technology company? Yeah, and all these bank CEOs saying we're going to install ping-pong tables to get all the coders. And now I see ads at trading firms and it's like, we have a data center full of B200s, or we have a data center full of GB300s, come work for us.

Speaker 2: The only thing besides all their tech that I know is, like, every time you read a profile of any trading company, they love to play backgammon, and in all the articles the chessboards are out, they can be seen playing chess over lunch, et cetera. I get it, okay, they like games, they like whatever. Let's move the ball forward.

Speaker 3: Well, there's also the underlying theme of, is this all hype? Right? Because you do get the sense sometimes that companies are putting out press releases where they just mention AI to tick a box, to be seen to be doing something, and hope that their stock actually goes up. And because so much of this is proprietary, and people kind of have an excuse not to go into detail about it, sometimes you do get the feeling that people are just talking about it and not actually using it.

Speaker 2: Cynics... and I'm not saying this myself...

Speaker 3: I know you're not a cynic.
Speaker 2: Speaking of trading and technology, cynics would say that CME's deal with Google, to move to the cloud, to put trading on the cloud, was hype, that that was a press release. People have said that, people have made that charge, and they don't understand why. You don't have to comment, you don't have to say anything further on that.

Speaker 3: I do have a comment, but I'll hold it for our guest.

Speaker 2: I'm just saying there is this world where people do press releases and cynics go, I don't really understand the point. Anyway, that's a very long lead-up. Let's learn more about the world of trading. Let's learn more about AI and tech, specifically what does it even mean to apply AI within the realm of trading. We're going to be speaking with Iain Dunning. He is the head of AI at Hudson River Trading. He was previously at DeepMind, so his trading and AI bona fides are about as good as it gets.

Speaker 3: You've established them. We've established that.

Speaker 2: Really the perfect guest to answer all our questions. So thank you so much for coming on the podcast.

Speaker 5: Yeah, I'm really happy to be here. I agree with you that the mystique factor is kind of overblown, even if it's understandable why people embrace it sometimes.

Speaker 2: We're going to blow past the mystique. Let's start with some, like, really just rudimentary questions. The first one is: Hudson River Trading, as a company, how does it make money?

Speaker 5: Yeah, so we are a sort of quantitative, automated, proprietary trading firm, which is a lot of words. But I guess the way I see it is, we are a service provider to markets. The clearest example is market making. There is a sort of utility to the world in being able to reliably buy or sell any product, anytime, anywhere, and for us that means stocks, futures, options, crypto, bonds.
Speaker 5: And if you could, say, build a magical machine to quote a price to buy or sell any instrument, and you would want it to be the best possible price, the tightest price, people would trade with you. They would be happy because there's a counterparty for their trade and they get a pretty good price, a low spread. And we're happy because we essentially pick up a penny in front of a steamroller. We are making money from that spread, and we can pick up the pennies in front of the steamroller if we have a really magical device which tells us what the price of everything should be.

Speaker 2: When the steamroller is coming.

Speaker 5: Yeah, it tells us when the steamroller is coming. And so I think that's kind of it. We're a very, very sophisticated sort of middleman in some sense. And it's the same with Amazon. Amazon doesn't make stuff, but it's a very valuable, profitable company. It provides a service, people get value out of it. Same thing: we're moving stocks and bonds through time and space between different counterparties.

Speaker 3: We will ask you about the steamroller in a few minutes. But before we do that, how does AI, or the way you're using AI, actually differ from the algorithmic or quant trading of old? Because I guess one of the questions is, is this, you know, a sort of evolutionary change, maybe a marginal improvement on what already exists? Or is this something seismic, a step change, a big shift in the way trading actually works?

Speaker 5: Yeah, I mean, I don't want to overstate ourselves in some sense, because this space, as you mentioned before, is very opaque about what different firms of this class are doing.
Speaker 5: I can certainly speak to our own experience, which is, we've been doing this type of trading for twenty-plus years, and much like everyone who was doing this, the way it kind of worked was, you handcraft features based on human intuition. Oh, I don't know, if the order book looks imbalanced, there are more people wanting to buy than sell, the price is going to go up soon, or something like that. And maybe you get a bunch of very smart people and they think very hard. It's almost like making a very fancy watch: you artisanally craft all these pieces, and then maybe you use relatively simple mathematical techniques like linear regression to combine those predictors. And I've been going to conferences and things and recruiting for a long time, and even today, if you go on the internet, you'll see people say things like, oh, that's all you can do in finance. For some reason they'll say this. They'll say something like, oh, it's too noisy, or markets are too nonstationary, or things like this, and so that's all you can do. And I guess that belief isn't really backed up by anything, in my opinion, or by lived experience. And so we sort of viewed it, for a long time, more as: here is everything that's happening in the world, and ideally you would put this into kind of a machine that does not have any human biases. I don't know how to trade stocks myself. I buy broad market ETFs, what do I know? But if you could put all the data into a box and it could learn all about the data, it would find things that you would never be able to find with this handcrafted approach.
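As a concrete illustration of the older workflow described above, here is a minimal sketch: one handcrafted predictor, top-of-book imbalance, combined with a future return by ordinary least squares. Everything here, the data, the planted signal, the coefficient, is invented for illustration and is not HRT's model.

    # A toy version of "handcraft a feature, combine with linear regression".
    # All data is synthetic; the 0.0001 "signal" is planted so the fit
    # has something to find.
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical order-book snapshots: resting size at best bid and best ask.
    bid_size = rng.integers(1, 500, size=10_000).astype(float)
    ask_size = rng.integers(1, 500, size=10_000).astype(float)

    # Handcrafted feature: imbalance in [-1, 1]; positive means more resting
    # buy interest than sell interest at the top of the book.
    imbalance = (bid_size - ask_size) / (bid_size + ask_size)

    # Pretend the future return carries a faint trace of the feature plus noise.
    future_return = 0.0001 * imbalance + rng.normal(0.0, 0.001, size=imbalance.size)

    # The "relatively simple mathematical technique": ordinary least squares
    # of return on the feature, with an intercept.
    X = np.column_stack([np.ones_like(imbalance), imbalance])
    beta, *_ = np.linalg.lstsq(X, future_return, rcond=None)
    print(f"intercept={beta[0]:.2e}, imbalance coefficient={beta[1]:.2e}")

The contrast drawn next is that the modern approach skips the handcrafted imbalance step entirely and lets a large model learn its own features from the raw event stream.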
Speaker 5: And we started doing that relatively early, in the twenty thirteen, twenty fourteen period. And over time, over the last decade or so, much like in other contexts that are not finance, there has been sort of a hockey stick, and you can measure it by the sizes of the models, the compute deployed. And over time, that way of modeling the markets, initially it was like a hybrid with the traditional way, and it basically kind of just overtook it entirely. And so now our trading is entirely driven by this magical machine that consumes all of the data. I keep saying "this magical machine that consumes all of the data" for a reason, which is that this is how ChatGPT is trained. It consumes all the data of the Internet, kind of scraped and collected into one place. When you train a model, it kind of takes it all in and something emergent comes from it. And that's why, and I'm kind of being a bit leading here, but that's why I'm talking about it in that sense. And I think that is materially different from the "I'm using my intuition of the markets to construct a predictive model."

Speaker 3: So just to be clear, how much of the usefulness of AI here is about execution and the fact that you can crunch a lot of data really quickly with hundreds or thousands of GPUs, versus spotting sophisticated patterns or discrepancies that you can exploit?

Speaker 5: I think it's both. I think one of the things that people sort of missed with the whole "do a linear regression" type thing is, when you really think about it, how much data there is generated in financial markets. And when I say data, I think it's important to think of it as every event that happens in the market.
Speaker 5: It's not the sort of time series of prices, but the actual low-level substrate: people are quoting, trading, retracting quotes. That low-level stuff is internet-scale data set sizes. And one of the sort of bitter-lesson-type things of AI was, you know, you shouldn't think too hard about how to feature-engineer this and pre-process it; you should kind of throw it into a form of computation that can make use of internet-scale data. In the twenty-tens, computer vision people used to make detectors for edges of images and things, and they would combine them. Same thing. That was a good approach, but, you know, it's completely dominated by the idea of getting a very large number of GPUs and a pretty generic neural network form and powering through it. As for how it is finding things that other methods could not, it's very hard to say. Our models are not very interpretable, and I think that's fine, because, as Joe mentioned, our sort of trading style and holding times, a lot of it is like minutes, hours, maybe low single-digit days for the most part. And I guess in my mind it's unreasonable to expect the models to be interpretable, because, I don't know, if I looked at the order book data for Tesla or something, am I really going to be able to tell you, better than random, what the price of Tesla will be in a minute's time? And so I kind of think of it like that: if you have something that's clearly superhuman already, what level of interpretability should you expect? This is very different, right, to normal AI.

Speaker 2: Right. This gets into some areas that I'm very interested in. But just to establish what we're talking about: you're trading a stock, like a Tesla, an Nvidia, et cetera, with your magic machine. We had another episode where that was the money box.

Speaker 3: That's the magic box. Different, that's a different one.
Speaker 2: With this AI machine, it is sort of, arguably, grown, right? It's sort of grown in a lab more than it is programmed, much like a chatbot. I know it's very different technology. Like, what is the price of Nvidia going to be tomorrow? Or what is the price of Nvidia going to be this afternoon? What you're saying is, with your technology, you have a better chance of getting that right. That you actually might be able to make an informed prediction about the future in a way that you couldn't have done, say, ten years ago. And people who talked about this, they would come up with reasons: oh, the stock market, it's not like chess or Go, and therefore you can't really do predictions the same way. But what you're saying is that with these models, which are different than LLMs, there is some, at least on a short time scale, predictive capacity.

Speaker 5: Yes. I find this still a little bit hard to believe myself. You get this kind of efficient markets hypothesis stuff jumping into your head. If someone is saying they can predict the price of a stock in an hour, your instinctual reaction is incredulity. It just sounds like you're kind of bluffing or making it up. But no, these models can predict this. And I think the way to reconcile the seeming contradiction is that the predictions are very bad in some sense. We don't normally talk about it as accuracy, but I think the way to think about it is, the accuracy is like fifty point one percent, type thing. They're only a little bit better than random.

Speaker 3: But I suppose an extra one percent, like, blows up your profits if you're doing it...

Speaker 5: Doing it at scale, doing it enough times. And over time you kind of realize the biased coin flip.
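To see why "fifty point one percent" is worth anything, a back-of-the-envelope calculation helps. The numbers below are assumptions for illustration, not HRT's statistics: each trade is modeled as a unit bet that wins with probability 0.501, so the edge grows linearly with the number of trades while the noise grows only with its square root.

    # Why a 50.1% win rate behaves like a biased coin flip at scale.
    # p and n are illustrative assumptions, not real trading figures.
    import math

    p = 0.501            # assumed per-trade win probability
    n = 1_000_000        # assumed number of independent unit bets
    edge = 2 * p - 1     # expected profit per bet: +0.002 units

    mean = n * edge                       # expected total P&L: 2,000 units
    std = math.sqrt(n * (1 - edge ** 2))  # noise in total P&L: ~1,000 units
    print(f"expected={mean:.0f}, std={std:.0f}, edge-to-noise={mean / std:.1f}")

Under these toy assumptions the expected profit is about two standard deviations above zero after a million bets, and the ratio keeps improving like the square root of n: the biased coin reveals itself if you can flip it often enough.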
Speaker 5: And as for why it might be possible to do this without invoking magic: markets are this very beautiful interaction of many different parties, all with different kinds of utilities and risk preferences and things, and the only way you really see what people are doing is by the actions they take in markets. And you're kind of sucking up all that signal, that micro signal, and extrapolating.

Speaker 2: The cynicism, or the skepticism, about the possibility of machines that could predict the price of stocks is a little strange, right? Because machines ingest data and then, whatever, maybe they see a pattern: more likely than not, this constellation of data means tomorrow will be green. Humans do this all the time. What else do we have besides data, right? You have an analyst, and they put out a note: Tesla, or whatever, Nvidia, is going to go to five hundred dollars a share.

Speaker 3: How dare you insinuate I'm not smarter than a computer, Joe?

Speaker 2: We're like, all humans have this data, and much less data, and yet humans are making predictions all the time. There's a whole industry of it. So the idea that, for some reason, a computer couldn't do this with much more data than analysts ever have... I understand why the cynicism comes off as a little strange.

Speaker 3: I think some of the doubt stems from this idea that a lot of these models tend to be backward-looking, right? And some of them occasionally are pretty bad at spotting or reacting to big regime breaks. And I guess the thinking, again, sometimes is that maybe humans are more flexible, maybe more adaptive in their thinking, and they can spot these big cultural shifts. How do you actually, I guess, prepare for those big pattern changes?

Speaker 5: Yeah, I was at HRT for COVID, and I thought that was kind of like the most... that was a pattern break, that was a big pattern break, and things went totally fine.
Speaker 5: Actually, it was more of an engineering crisis in some ways. Stock market volumes exploded, and every system was just screaming, trying to keep up with the volume of activity. But in terms of the predictions, they stayed quite good. And I had to reconcile this in my head as well. I guess it is a matter of horizon, like how far in the future we are talking. Intraday, I think a lot of the price movement is driven by just observing the flows. It's hard for us as humans to observe, but it's the relative patterns of buyers and sellers in the markets. And yes, during COVID, volatility was massive and prices were moving up and down a lot, but they were still going up and down during, say, March twenty twenty, and so for these models... it was sort of out of domain for a human, but I don't think out of domain, in some sense, for the models. But I guess I also don't know how you would apply this thinking if you were trying to make month-ahead predictions. I often get people being like, oh, everyone knows hedge funds, which we're not, a hedge fund, are just flipping coins, and it's some survivorship bias thing. And, you know, I genuinely don't know about months-out prediction stuff. That is not a data-rich environment.

Speaker 2: Just by definition, there have been more days than months, so for prediction on a day basis you're afforded a lot more data.

Speaker 5: That rule of thumb is basically very useful, and it extends all the way down to seconds, and we see that empirically all the time. And so, yeah, all the things I'm saying do have this heavy caveat: it does rely on there being a certain level of signal to noise. I definitely cannot make reasonable claims about the price of things in, like, a month using the same kind of AI hammer.
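A quick calendar count makes the "more days than months" point concrete. The figures below are rough conventions (252 US trading days a year, a six-and-a-half-hour session), used only to show the orders of magnitude:

    # Non-overlapping samples available at different horizons over 20 years.
    # 252 days/year and a 6.5-hour session are standard rough conventions.
    YEARS = 20
    TRADING_DAYS_PER_YEAR = 252
    SECONDS_PER_SESSION = int(6.5 * 3600)  # 23,400 seconds

    samples = {
        "monthly": YEARS * 12,
        "daily": YEARS * TRADING_DAYS_PER_YEAR,
        "one-second": YEARS * TRADING_DAYS_PER_YEAR * SECONDS_PER_SESSION,
    }
    for horizon, n in samples.items():
        print(f"{horizon:>10}: {n:,} samples")
    # monthly:    240
    # daily:      5,040
    # one-second: 117,936,000

Two hundred forty monthly observations versus roughly 118 million one-second ones: the same span of history is vastly more data-rich at short horizons, which is why the "AI hammer" has more to swing at there.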
Speaker 5: I guess, also to be specific, I'm talking a lot about using market data to make these predictions, and that's because, on the sort of intraday timescale, that is the most important thing. It's all about flows and things bouncing back and forth. If you're thinking about things on a month's timescale, I think that's fundamentals. And can AI be used for that? I don't know, to be honest, and it's definitely outside my wheelhouse. I guess people have various opinions about that, and maybe some people very much would like to claim that they can, and, you know, others maybe don't. But it's definitely outside of my area of expertise.

Speaker 3: Wait, talk to us about the data that you're using, or talk more about it, because this is another area where people tend to talk in PR speak sometimes: we have access to all this data, unusual data, alternative data, and that's going to enable us to use AI better. What are you actually looking at, and what have you found, I guess, most useful?

Speaker 5: Well, I think the thing that I found most counterintuitive when I started was that, when you're thinking about predicting the price of anything a minute, an hour out, by far the most useful thing is just market data. This is the market data feeds you can buy from the exchanges for a pretty reasonable price. People often think of some sort of competitive moat, but the prices of these exchanges' data feeds are not particularly high. I mean, crypto, you know, is just a Wild West. But everyone can collect these feeds, and so that is the most useful raw ingredient. It is the most true expression of everyone's intents, right? They're going to the market, they're quoting, buying, selling. That is the primary ingredient. People get kind of caught up on the whole "do you have a Twitter feed" type of thing.
Speaker 5: And Bloomberg sells a Twitter feed through its data products, and every now and then, obviously, something happens: news happens during market hours that moves the price, dislocates the price. But if you really coldly rationalize it, that is a relatively infrequent thing compared to the overall mass of markets. So thinking intraday, I think it's these market data feeds. It's literally little events: someone quoted at this price and this size. It's all anonymous. Market data feeds are anonymous. And so that is the raw stuff, and it is vast. There are just millions and millions of events per day, per stock, per future. When you get to the day or days timescale, that's where the alternative data, quote unquote, really comes in, as an alternative to market data: the SEC filings, the news feeds, balance sheets, brokers' reports, things like this. That's where that comes in, a vast sea of data offerings that people try and sell. In that kind of situation, you know, it's a very low-Sharpe environment you start getting into, and it can be hard to attribute the extra Sharpe to these things. But in some sense it's also very democratized. Maybe people are collecting very secret data sets, but my inbox, and I'm not even the person in charge of buying these alternative data sets, is often full of people trying to sell me the latest alternative data set. And I think a lot of them don't necessarily have much predictive value, but clearly there's a market for it.

Speaker 3: What's the craziest one you've seen?

Speaker 5: Huh, can I remember? I mean, people definitely reacted very strongly to the WallStreetBets era and tried to create a bunch of Reddit-extracted things, to go beyond just raw captures of Reddit and try to distill it into something. Even just thinking about it...
Speaker 5: The meme stock thing is talked about more after it happens than it happens before, and so, like, I don't know.

Speaker 2: Sort of a sideways question. You mentioned interpretability, and it gets at something I've been wondering about AI for a while, not even in the finance realm specifically. You were at DeepMind, which of course produced a great Go player, better than the greatest grandmaster in the world. I play chess. We know that chess engines are much better than any human. On the other hand, as far as I can tell, there is no good AI chess tutor. So in other words, the chess engines crush everyone, but I've never been able to get a thing where it's, okay, you did this move, but you know what, you're closing this rook file, and down the line... Because it doesn't do that. The chess.com human talk is very rudimentary, et cetera. Can you talk a little bit about why there are these problems where some version of AI or machine learning or whatever can do fantastically well, but the actual explanation of what it's doing, which I think is kind of what interpretability is, can't be articulated in plain English, why it's able to do what it does?

Speaker 5: I think it's just because these neural networks are just a big old blob of numbers, and what we're aiming to do in training these models is to almost free ourselves from almost all structure, and they might learn things in a way that is nothing at all like how we learn things. And so my best guess for why it's hard is that they might be reasoning, in some sense, internally. And people use these words like reasoning, it kind of makes me wince. I've seen "imagination" and things used about neural networks. I don't know, this kind of anthropomorphization of them is kind of dangerous, because they are essentially processing things internally in a way that I think is inherently not like how we do.
Speaker 5: And that is my best sort of guess. There are some interesting counterexamples. One of my favorite things of the past few years was Golden Gate Claude, where Anthropic made the model basically get very interested in the Golden Gate Bridge. Every question you asked would come back to the Golden Gate Bridge. So they're not completely impenetrable, but it's clear that, I guess, we haven't been able to map this back to how we think. It's very tempting to, and exciting too, especially for AI safety applications, which is not really relevant to me so much, but I think it's worth attempting to try.

Speaker 1: Yeah.

Speaker 2: No, it strikes me that if you could solve that, in many jobs you could actually make a lot of productivity gains. But I do think that's an important hurdle. When you're training your models... so your models are different than large language models, et cetera, but what they have in common is there's an incredible amount of data, an incredible amount of compute demanded. How applicable, if someone had worked on LLMs, would your training process be to them? How could they move from that environment to yours? Are there enough similarities in the basic notions and compute requirements to train a model such as yours versus what people are doing at the major labs?

Speaker 5: I would say now, in twenty twenty-five, absolutely, but I would not have said that in twenty twenty. And this is something that kind of caught me by surprise, having done this for a while now. Our problems are kind of defined by long sequential strings of information, in some sense, and extrapolating from that. If I think back to the past of AI, it was like, is it a hot dog or not, the sort of image classifier test. Then there was some stuff with audio and things. I was a little bit more familiar with robotics, eh.
Speaker 5: But when we got to the sort of LLM era, it got very interesting, because suddenly the problems were very similar, in that you want to think back over long histories, long contexts. Okay, that sounds good. You've got a lot of data and you want to churn through it in as efficient a way as possible. You also have to serve this model. It has to run at a relatively reasonable speed, especially for the LLMs: there are a million people typing into chatgpt.com and they want to hear a response in a relatively prompt manner. Of course, for us also, the models have to make their predictions in a prompt manner, otherwise the predictions aren't useful. So all these things mean that our sort of way of thinking about it has become very similar to the frontier LLM things. It's just a very different modality. They're operating on, I guess, primarily text, and we're operating on this far less interpretable but still sequential stream of tokens, except our tokens are market events. And so it's a lot of fun, because, you know, in terms of the research that does still get published, you can look at it for inspiration and draw comparisons. But it's also very much its own problem, which keeps me interested every day, because it's its own unique thing. It's different.

Speaker 3: I want to go back to the point you made about data and, I guess, democratizing finance in many ways. And maybe this is a weird question, but I'm thinking back to the twenty-tens, and we used to talk about the big investment banks as flow monsters. They see all these orders, they get all these orders, they see all the flow, and that allows them to optimize on funding costs and other expenses. Is the idea that data and AI can kind of replicate that advantage, so that everyone, or not everyone, but Hudson at least, becomes its own little flow monster?
522 00:24:22,280 --> 00:24:26,400 Speaker 5: Yeah, I think there's still some trends and markets that 523 00:24:26,720 --> 00:24:28,720 Speaker 5: worry me a little bit in terms of I guess 524 00:24:28,800 --> 00:24:32,240 Speaker 5: our platonic ideal market structure is probably like everyone trades 525 00:24:32,280 --> 00:24:34,760 Speaker 5: on exchange in a centralized place, but that is not 526 00:24:34,960 --> 00:24:38,200 Speaker 5: really how things seem to be going. And there's a 527 00:24:38,280 --> 00:24:43,560 Speaker 5: huge amount of like off exchange dark quasi dark volume, 528 00:24:43,840 --> 00:24:46,800 Speaker 5: and I think there's still a lot of qunits of 529 00:24:46,800 --> 00:24:50,760 Speaker 5: the trading world where like being in the room is 530 00:24:50,880 --> 00:24:52,840 Speaker 5: kind of like this big advantage. And this is a 531 00:24:52,960 --> 00:24:55,720 Speaker 5: very much anti AI play in some senses. Data is 532 00:24:55,800 --> 00:24:57,720 Speaker 5: hidden the data, the flow data is hidden it and 533 00:24:57,800 --> 00:25:00,000 Speaker 5: it's not something that you can feed into a machine. 534 00:25:00,240 --> 00:25:01,760 Speaker 4: This very spas amounts of. 535 00:25:01,720 --> 00:25:05,000 Speaker 5: It, So that's kind of an interesting trend a lot 536 00:25:05,000 --> 00:25:07,040 Speaker 5: of us did get sales get reported in a centralized 537 00:25:07,040 --> 00:25:10,800 Speaker 5: place later, but it's not prompt enough to be useful, 538 00:25:11,400 --> 00:25:14,200 Speaker 5: and so to the AI thrives on data, this is 539 00:25:14,359 --> 00:25:16,200 Speaker 5: in some sense like an issue for the long run, 540 00:25:16,359 --> 00:25:18,560 Speaker 5: you need to kind of be in the rooms where 541 00:25:18,840 --> 00:25:20,520 Speaker 5: the sort of trading is happening. 542 00:25:20,800 --> 00:25:23,719 Speaker 2: I'm glad you brought that up, because that's specifically what 543 00:25:23,760 --> 00:25:27,879 Speaker 2: I'm curious about from the sort of physical infrastructure side, 544 00:25:27,960 --> 00:25:31,199 Speaker 2: Like if I have a queria to chat GPT, I 545 00:25:31,200 --> 00:25:33,960 Speaker 2: don't care if the model is like trained in like Eblene, 546 00:25:34,040 --> 00:25:38,080 Speaker 2: Texas or wherever it gets back to me and whatever. 547 00:25:38,400 --> 00:25:41,960 Speaker 2: But I know that for high frequency trading, at least 548 00:25:42,000 --> 00:25:44,359 Speaker 2: on the execution side, there are certain parts that you 549 00:25:44,440 --> 00:25:47,119 Speaker 2: want to be literally co located, and you want to 550 00:25:47,160 --> 00:25:50,439 Speaker 2: have the shortest possible wire, and however short it is, 551 00:25:51,200 --> 00:25:53,600 Speaker 2: ideally you would like it to be shorter. Can you 552 00:25:53,640 --> 00:25:58,359 Speaker 2: talk about the differences and similarities between essentially your physical 553 00:25:58,400 --> 00:26:02,879 Speaker 2: hardware stack verse is what would be required at a 554 00:26:02,960 --> 00:26:04,640 Speaker 2: large language model frontier lab. 555 00:26:04,960 --> 00:26:05,600 Speaker 4: Yeah. 556 00:26:05,640 --> 00:26:07,760 Speaker 5: I think at a bulk level there was actually some 557 00:26:07,800 --> 00:26:10,040 Speaker 5: pretty similar things. 
Speaker 5: So I think about it as latency and throughput: latency being the time to react, and throughput being how much thinking you can do in a certain period of time. And you're right that this space demands low latency. Early in the twenty-tens there was the sort of Flash Boys book perception, where it was really kind of about arbitraging latency. I'm happy to report that, in some sense, all the latency has been arbitraged, for the most part.

Speaker 2: There's no more margin in shortening the wire.

Speaker 5: There's probably, like, a little bit, but it's relatively small. And I think if you look at the big quant trading firms, the need to make the wires as short as they possibly can is done, or no longer relevant, which is great, because I find that stuff pretty boring, personally. I think about it more as: for a given speed of response, you should be the smartest. It feels like this curve. If you're going to take a second to come up with your trading decision, it had better be a really, really good decision, and then it doesn't matter so much that it took a second. And if you're going to take a microsecond, well, you probably can't do too much in a microsecond, but, you know, it should still be the best response possible in a microsecond.

Speaker 2: It could be a little worse. It can be a little worse than the one-second decision.

Speaker 5: Yeah, for sure. And so, essentially, for our training, we use the cloud, and we have our own training data centers that we've built ourselves. That is basically the same, although at a much, much smaller scale than the Googles and so on. I don't know, it blows my mind, the spending on stuff like this. We are, I think, big if you're not comparing us to a Google or a Meta, but it's not, like, bajillions of dollars. So training is kind of the same.
Speaker 5: For inference, we need to put devices close to the exchanges, and we need to think very hard about the power usage and the latency. But we have hardware teams. We make our own FPGAs, we make our own chips, and we use off-the-shelf GPUs. And what we try to do is make sure that, for any given speed of response, we're making the smartest possible decision we can.

Speaker 2: FPGA being field-programmable gate array?

Speaker 5: Oh yeah, sorry, yeah. Basically, all these different devices have different latencies and throughputs. GPUs have very high throughput; that's what they're useful for, right? But the problem with markets is they're kind of narrow. The amount of traffic flowing into these LLMs from everyone typing into their web browsers is massive, and they also do some clever things to batch up requests and process them together. We don't really have that luxury. The markets are going to happen at the speed they happen. We can't duck out for a while and catch up; we need to stay in the game. So we have all sorts of interesting design challenges around how we use GPUs, which are relatively high latency, they take a while to give back a result, but they can process the whole stock market on one GPU, type of thing, versus the fast response. And so we have whole teams dedicated to thinking about, okay, I've got this intelligent blob, how do I get answers out of it in different ways at different speeds? And that, I think, is where a lot of the smarts are going in this world these days, rather than, how do I make sure my microwave towers are slightly better aligned somewhere in rural Pennsylvania. Which is a cool challenge in its own right, but it's done, I think. I think people have found the straightest line from New Jersey to Chicago.
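The batching tradeoff sketched above can be made concrete with a toy cost model. The constants below are invented, not real GPU or exchange figures: each inference call pays a fixed launch overhead plus a per-event cost, so batching amortizes the overhead but delays every event in the batch.

    # Toy latency/throughput model: batching raises throughput but adds delay.
    # Both constants are illustrative assumptions, not measured hardware figures.
    FIXED_OVERHEAD_US = 50.0  # assumed per-call overhead, microseconds
    PER_EVENT_US = 0.5        # assumed marginal cost per market event

    for batch in (1, 10, 100, 1000):
        call_time = FIXED_OVERHEAD_US + PER_EVENT_US * batch   # one call, us
        throughput = batch / call_time                         # events per us
        print(f"batch={batch:>4}: latency={call_time:7.1f} us, "
              f"throughput={throughput:.2f} events/us")

Throughput climbs toward two events per microsecond as the batch grows, but the last event in a thousand-event batch waited 550 microseconds for its answer. An LLM service can accept that wait; a model quoting into a live order book often cannot, which is why the "catch up later" option isn't available.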
630 00:29:30,360 --> 00:29:33,720 Speaker 3: Joe brought up some of the cynicism around CME's cloud 631 00:29:33,720 --> 00:29:36,640 Speaker 3: deal with Google, and this came up — speaking of a 632 00:29:36,680 --> 00:29:39,400 Speaker 3: specific cynic who went on the record in one of 633 00:29:39,400 --> 00:29:43,080 Speaker 3: our episodes. Don Wilson basically made the argument that matching 634 00:29:43,120 --> 00:29:46,320 Speaker 3: on a cloud doesn't necessarily make sense, because you might 635 00:29:46,360 --> 00:29:49,040 Speaker 3: put in two orders and you're not really sure which order 636 00:29:49,080 --> 00:29:51,760 Speaker 3: gets filled first. I guess you're kind of back in 637 00:29:51,800 --> 00:29:54,560 Speaker 3: that black box environment, or maybe it's a latency issue. 638 00:29:54,600 --> 00:29:56,720 Speaker 3: I don't know. Is that a problem that you're seeing? 639 00:29:57,000 --> 00:29:59,720 Speaker 5: That's something that I worry about. A general philosophy is 640 00:29:59,760 --> 00:30:02,560 Speaker 5: markets should be very transparent and as fair as possible. 641 00:30:02,600 --> 00:30:05,360 Speaker 5: So equalizing access is a good thing, in the sense 642 00:30:05,400 --> 00:30:07,880 Speaker 5: that participants shouldn't be able to basically pull weird 643 00:30:07,960 --> 00:30:10,760 Speaker 5: tricks to be faster. On the other hand, I think 644 00:30:10,760 --> 00:30:13,600 Speaker 5: you want reliability. So this concept of orders 645 00:30:13,640 --> 00:30:15,760 Speaker 5: arriving at different times and being filled in a different order 646 00:30:15,840 --> 00:30:18,000 Speaker 5: just doesn't seem like a very sensible way to run 647 00:30:18,040 --> 00:30:20,800 Speaker 5: a market. It's something that requires a lot of effort 648 00:30:21,000 --> 00:30:24,840 Speaker 5: to engineer around, and it's just not good market design 649 00:30:25,000 --> 00:30:27,680 Speaker 5: to have. It is very widespread, though, in existing 650 00:30:27,720 --> 00:30:30,280 Speaker 5: exchanges across the world. We've traded in a vast 651 00:30:30,360 --> 00:30:33,760 Speaker 5: number of countries, and some of the exchanges have such 652 00:30:33,840 --> 00:30:37,160 Speaker 5: amazing hardware that, if two orders are sent within 653 00:30:37,360 --> 00:30:39,760 Speaker 5: a nanosecond of each other, the exchange will never 654 00:30:39,960 --> 00:30:41,720 Speaker 5: process them in the wrong order, even if it's one 655 00:30:41,760 --> 00:30:44,120 Speaker 5: hundred different network ports and they're all connected. They have 656 00:30:44,200 --> 00:30:46,840 Speaker 5: this amazing timestamping stuff. On the other hand, you 657 00:30:46,920 --> 00:30:50,400 Speaker 5: might have a crypto exchange where it kind of 658 00:30:50,440 --> 00:30:54,160 Speaker 5: feels like a kid learned JavaScript and set up 659 00:30:54,160 --> 00:30:56,239 Speaker 5: a website, and you send an 660 00:30:56,320 --> 00:30:58,520 Speaker 5: order and you may or may not be confirmed that they 661 00:30:58,560 --> 00:31:00,640 Speaker 5: even received it, and then you kind of have to 662 00:31:00,640 --> 00:31:03,080 Speaker 5: refresh your account balance page like five. 663 00:31:02,880 --> 00:31:04,600 Speaker 4: Minutes later to see if there's money in it or not.
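A toy version of the timestamp-ordered matching he describes — my sketch, not any real exchange's code: every order is stamped on ingress at its network port, and the matcher drains strictly in stamp order, so which port an order arrived on never matters.

import heapq

class Sequencer:
    def __init__(self):
        self._heap = []  # entries: (ingress_timestamp_ns, arrival_seq, order)
        self._seq = 0    # tie-breaker for truly identical hardware stamps

    def ingress(self, timestamp_ns: int, order: dict):
        heapq.heappush(self._heap, (timestamp_ns, self._seq, order))
        self._seq += 1

    def drain(self):
        """Yield orders strictly oldest-stamp-first."""
        while self._heap:
            yield heapq.heappop(self._heap)[2]

seq = Sequencer()
seq.ingress(1_000_000_001, {"side": "buy"})   # arrived via port A
seq.ingress(1_000_000_000, {"side": "sell"})  # arrived 1ns earlier, port B
print([o["side"] for o in seq.drain()])       # ['sell', 'buy']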
664 00:31:04,720 --> 00:31:07,320 Speaker 5: And we kind of — we'll deal with it 665 00:31:07,440 --> 00:31:09,080 Speaker 5: as it is, but certainly we have a preference for 666 00:31:09,160 --> 00:31:12,640 Speaker 5: kind of equalized access but sort of predictable outcomes, and 667 00:31:12,680 --> 00:31:15,040 Speaker 5: I think that kind of leads to people spending 668 00:31:15,040 --> 00:31:18,320 Speaker 5: effort elsewhere. I think it's not honestly a very great thing for 669 00:31:18,440 --> 00:31:20,520 Speaker 5: society for people to be stressing very hard about 670 00:31:20,520 --> 00:31:20,960 Speaker 5: wire length. 671 00:31:21,320 --> 00:31:24,560 Speaker 2: Yeah, no, probably. I'm glad — I'm glad that you report 672 00:31:24,600 --> 00:31:27,560 Speaker 2: that we've moved on a little bit since then. Where 673 00:31:27,600 --> 00:31:30,280 Speaker 2: are your constraints? You know, when you talk to LLM people, 674 00:31:30,320 --> 00:31:32,840 Speaker 2: there's debates about, right now, is it electricity? Is that 675 00:31:32,880 --> 00:31:36,000 Speaker 2: the big constraint? Is it that there just aren't enough GPUs? 676 00:31:36,320 --> 00:31:40,000 Speaker 2: Is it talent? Is it whatever? When you think about 677 00:31:40,040 --> 00:31:42,719 Speaker 2: where you are now versus the optimal version of where you'd be — 678 00:31:42,800 --> 00:31:44,240 Speaker 2: or is it, I mean, data is the other big 679 00:31:44,280 --> 00:31:46,479 Speaker 2: one, because there's all this concern that LLMs are going 680 00:31:46,520 --> 00:31:49,000 Speaker 2: to run out of training data, et cetera. Where is 681 00:31:49,040 --> 00:31:51,400 Speaker 2: the big constraint for you that you feel like you're 682 00:31:51,400 --> 00:31:52,560 Speaker 2: solving for right now? 683 00:31:52,720 --> 00:31:55,440 Speaker 5: I think in terms of really long term strategic planning, 684 00:31:55,480 --> 00:31:59,840 Speaker 5: electricity is quite clearly a very binding consideration. When 685 00:31:59,840 --> 00:32:04,560 Speaker 5: we think about spinning up new GPU-based training data centers, 686 00:32:05,200 --> 00:32:08,440 Speaker 5: it really comes down to: is there electricity? Finding a 687 00:32:08,440 --> 00:32:10,720 Speaker 5: piece of land to put a building on — there's a 688 00:32:10,720 --> 00:32:11,240 Speaker 5: lot of land. 689 00:32:11,480 --> 00:32:14,239 Speaker 4: Yeah, it's the electricity negotiation. 690 00:32:13,800 --> 00:32:17,040 Speaker 2: That's an issue at HRT, even for you? 691 00:32:17,080 --> 00:32:18,760 Speaker 5: You know, because we have a sort of hybrid mix 692 00:32:18,840 --> 00:32:21,080 Speaker 5: of using cloud providers and building our own data centers — 693 00:32:21,720 --> 00:32:25,200 Speaker 5: and yeah, the negotiations and thinking about power constraints. We 694 00:32:25,240 --> 00:32:28,160 Speaker 5: have an existing data center in a very cold place 695 00:32:28,720 --> 00:32:31,680 Speaker 5: and we want to make it bigger. And the data 696 00:32:31,680 --> 00:32:35,160 Speaker 5: center people are fantastic to work with, but they're saying, well, 697 00:32:35,160 --> 00:32:36,920 Speaker 5: we need to go talk to the power grid 698 00:32:37,320 --> 00:32:40,080 Speaker 5: and negotiate this next tranche, and so on. And it's 699 00:32:40,120 --> 00:32:42,680 Speaker 5: just — it often feels like that is the bottleneck.
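For rough scale — illustrative arithmetic, not HRT's figures — a steady 10 megawatts of draw is 10 MW x 8,760 hours, or about 87.6 GWh per year; at a typical US household's roughly 11 MWh per year, that is on the order of 8,000 homes, which is why even a shop thinking in tens of megawatts ends up negotiating with the grid the way a small town would.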
And 700 00:32:43,120 --> 00:32:45,280 Speaker 5: in terms of GPU availability, it definitely was 701 00:32:45,320 --> 00:32:47,800 Speaker 5: a crunch at some point in the past, but I 702 00:32:47,840 --> 00:32:48,760 Speaker 5: don't feel like that. 703 00:32:48,760 --> 00:32:52,560 Speaker 2: Is the case anymore. Say a 704 00:32:52,560 --> 00:32:55,080 Speaker 2: little bit more about how you perceive the GPU market. 705 00:32:54,880 --> 00:32:55,240 Speaker 4: Right, I think. 706 00:32:55,280 --> 00:32:57,400 Speaker 5: I think if we ask for GPUs, we will get 707 00:32:57,440 --> 00:33:00,320 Speaker 5: them delivered in a prompt manner — not necessarily 708 00:33:00,360 --> 00:33:02,680 Speaker 5: like next day, but I don't feel like that is 709 00:33:02,720 --> 00:33:04,840 Speaker 5: the thing that's the long pole in spinning 710 00:33:04,920 --> 00:33:05,320 Speaker 5: up more. 711 00:33:05,840 --> 00:33:07,920 Speaker 2: When was the worst of the crunch? 712 00:33:08,680 --> 00:33:11,640 Speaker 5: I guess twenty twenty-three — late twenty twenty-three felt 713 00:33:11,640 --> 00:33:13,000 Speaker 5: pretty bad. 714 00:33:13,080 --> 00:33:13,320 Speaker 4: I was. 715 00:33:13,320 --> 00:33:16,120 Speaker 5: I guess that was the Nvidia Hopper generation. And 716 00:33:16,640 --> 00:33:20,120 Speaker 5: I also saw a number in Bloomberg yesterday — I 717 00:33:20,120 --> 00:33:21,880 Speaker 5: think it was the Nvidia conference yesterday — and it said 718 00:33:21,920 --> 00:33:25,120 Speaker 5: something like one million Hopper-class GPUs 719 00:33:25,400 --> 00:33:28,880 Speaker 5: have been made, but already like four million Blackwell-class 720 00:33:28,920 --> 00:33:30,840 Speaker 5: GPUs have been made. So I think there's been a 721 00:33:30,920 --> 00:33:33,000 Speaker 5: ramp-up of supply. But I don't think they're 722 00:33:33,040 --> 00:33:36,240 Speaker 5: sitting on unsold inventory either. I think it is being consumed. 723 00:33:36,240 --> 00:33:38,280 Speaker 5: But yeah, in terms of what is the hard thing, 724 00:33:38,360 --> 00:33:42,440 Speaker 5: I think electricity. And it's insane. As a 725 00:33:42,640 --> 00:33:45,040 Speaker 5: very millennial person — I guess climate change was a big 726 00:33:45,080 --> 00:33:47,200 Speaker 5: thing growing up; in college there was a lot of discussion 727 00:33:47,440 --> 00:33:49,520 Speaker 5: about climate change — to see people spinning up data 728 00:33:49,520 --> 00:33:53,880 Speaker 5: centers very fast by basically buying as many gas turbines 729 00:33:53,920 --> 00:33:58,160 Speaker 5: as they can and putting them outside, I'm like, whoa, yeah, 730 00:33:58,200 --> 00:34:00,280 Speaker 5: what are we doing? It's wild. But that's the only 731 00:34:00,320 --> 00:34:02,520 Speaker 5: way to get electricity promptly: you just have to throw 732 00:34:02,560 --> 00:34:05,400 Speaker 5: gas turbines outside the building and turn them on. It's 733 00:34:05,440 --> 00:34:06,479 Speaker 5: pretty radical stuff. 734 00:34:06,520 --> 00:34:09,400 Speaker 4: And I don't know how all the numbers. 735 00:34:09,000 --> 00:34:11,719 Speaker 5: That people are talking about for future data center expansion kind 736 00:34:11,760 --> 00:34:14,720 Speaker 5: of math out, because you just back-of-the-envelope 737 00:34:14,760 --> 00:34:16,360 Speaker 5: the power usage and things.
738 00:34:16,120 --> 00:34:18,399 Speaker 4: And I know that the Sam Altmans of the world 739 00:34:18,440 --> 00:34:19,560 Speaker 4: have thought about this, have talked about this. 740 00:34:19,560 --> 00:34:20,960 Speaker 5: So, oh, we need to be building this much new 741 00:34:20,960 --> 00:34:25,320 Speaker 5: power generation per unit of time — but these are such daunting numbers. 742 00:34:25,320 --> 00:34:27,120 Speaker 5: I just don't know how that is all going to 743 00:34:27,239 --> 00:34:30,280 Speaker 5: work out. But yeah, even for us — in the grand 744 00:34:30,280 --> 00:34:33,120 Speaker 5: scheme of things a much smaller player in terms 745 00:34:33,160 --> 00:34:36,000 Speaker 5: of power consumption; we think in terms of tens 746 00:34:36,000 --> 00:34:40,040 Speaker 5: of megawatts, not gigawatts, which is more than most towns 747 00:34:40,080 --> 00:34:41,239 Speaker 5: and cities and things. 748 00:34:41,239 --> 00:34:43,040 Speaker 4: But still, we. 749 00:34:42,960 --> 00:34:45,360 Speaker 5: Find it a challenge to find electricity at a 750 00:34:45,400 --> 00:34:46,160 Speaker 5: reasonable price. 751 00:34:46,600 --> 00:34:48,759 Speaker 3: On this note, can you talk to us a little 752 00:34:48,760 --> 00:34:51,880 Speaker 3: bit more about where competitive advantage actually comes from in 753 00:34:51,960 --> 00:34:55,840 Speaker 3: this space? Because if the GPU crunch is somewhat solved, 754 00:34:55,880 --> 00:34:58,520 Speaker 3: and if latency isn't as big an issue as it 755 00:34:58,640 --> 00:35:01,080 Speaker 3: used to be, where are people actually getting their edge from? 756 00:35:01,280 --> 00:35:04,359 Speaker 4: Right. I mean, people — talent is one of the other things. 757 00:35:04,400 --> 00:35:06,520 Speaker 5: You asked about that as a constraint, and it is. It is a 758 00:35:06,680 --> 00:35:12,000 Speaker 5: very competitive people market. We're essentially asking for people to 759 00:35:12,200 --> 00:35:15,400 Speaker 5: know a lot of things — to be both good researchers and 760 00:35:15,440 --> 00:35:18,399 Speaker 5: good engineers — because, I don't know, in this AI era 761 00:35:18,440 --> 00:35:20,840 Speaker 5: the distinction is pretty blurry. It's not something you 762 00:35:20,880 --> 00:35:23,759 Speaker 5: can just whiteboard and then the coding is a little 763 00:35:23,760 --> 00:35:26,160 Speaker 5: bit afterwards. Any kind of research idea you have is 764 00:35:26,200 --> 00:35:29,320 Speaker 5: intimately connected with how you implement it. So that's already 765 00:35:29,360 --> 00:35:32,440 Speaker 5: a tough ask. So people are a constraint — people that 766 00:35:32,480 --> 00:35:34,560 Speaker 5: we would like to find — and we pay well 767 00:35:34,640 --> 00:35:37,000 Speaker 5: for those people as a result, and it is competitive. 768 00:35:37,640 --> 00:35:40,319 Speaker 5: But I think the more subtle edge is almost like 769 00:35:41,040 --> 00:35:45,160 Speaker 5: putting it all together. Do you have people who can — 770 00:35:45,480 --> 00:35:48,080 Speaker 5: an engineering team that can collect all the data recorded, 771 00:35:48,640 --> 00:35:51,640 Speaker 5: make it available to the GPU training data center?
This 772 00:35:51,760 --> 00:35:55,520 Speaker 5: is — I guess it's petabyte-scale data sets — 773 00:35:56,080 --> 00:35:59,960 Speaker 5: and just storing that much data, streaming it from wherever 774 00:36:00,000 --> 00:36:02,040 Speaker 5: it's stored to wherever in the world the training data 775 00:36:02,080 --> 00:36:06,680 Speaker 5: center is, reliably — these training runs are very expensive — and 776 00:36:06,719 --> 00:36:09,120 Speaker 5: then once you've got that model, serving it. So it 777 00:36:09,280 --> 00:36:11,239 Speaker 5: kind of extends to everything, and maybe that's kind of 778 00:36:11,239 --> 00:36:13,359 Speaker 5: a lame answer, but it really is: I think 779 00:36:13,400 --> 00:36:17,759 Speaker 5: you need to be just optimizing the whole stack. And 780 00:36:17,840 --> 00:36:20,759 Speaker 5: so my team is the AI team, 781 00:36:20,960 --> 00:36:22,800 Speaker 5: but what that really means in practice is we're focused 782 00:36:22,840 --> 00:36:25,200 Speaker 5: on training the models, which is an important but not 783 00:36:25,280 --> 00:36:28,160 Speaker 5: sufficient part of the whole stack, because we would be 784 00:36:28,239 --> 00:36:30,200 Speaker 5: kind of dead in the water without the teams at 785 00:36:30,320 --> 00:36:32,480 Speaker 5: HRT who think about how to actually kind of get 786 00:36:32,600 --> 00:36:35,279 Speaker 5: the data into these systems, and then the decisions 787 00:36:35,280 --> 00:36:38,280 Speaker 5: out to the markets, and keep up when things get busy — 788 00:36:38,360 --> 00:36:41,160 Speaker 5: all these things. So when I think about our competitors, 789 00:36:42,000 --> 00:36:44,320 Speaker 5: I think there is a benefit to scale. I can't 790 00:36:44,400 --> 00:36:47,800 Speaker 5: imagine how you would start a new company like HRT 791 00:36:48,520 --> 00:36:52,279 Speaker 5: in the year twenty twenty-five, because of the huge 792 00:36:52,280 --> 00:36:55,239 Speaker 5: initial lift to kind of build enough engineering scale to 793 00:36:56,040 --> 00:36:58,120 Speaker 5: achieve this sort of thing. And so I think our 794 00:36:58,160 --> 00:37:02,080 Speaker 5: sort of peer companies have also invested very heavily in engineering, 795 00:37:02,160 --> 00:37:04,120 Speaker 5: and will continue to do so. And there was an 796 00:37:04,160 --> 00:37:06,239 Speaker 5: article in the FT a week 797 00:37:06,320 --> 00:37:09,920 Speaker 5: or two ago about how firms like HRT are kind 798 00:37:09,920 --> 00:37:13,160 Speaker 5: of extending themselves more into slower trading, and there are 799 00:37:13,160 --> 00:37:15,439 Speaker 5: firms — you know, those slower firms — 800 00:37:15,480 --> 00:37:16,760 Speaker 5: that are trying to kind of go faster. 801 00:37:17,160 --> 00:37:19,279 Speaker 2: And yeah, I was just gonna ask, just, like, 802 00:37:19,360 --> 00:37:23,640 Speaker 2: from the prediction standpoint — okay, maybe you could predict, 803 00:37:23,680 --> 00:37:26,960 Speaker 2: with some reasonable confidence, what's gonna happen in the next hour. Sometimes, 804 00:37:26,960 --> 00:37:29,799 Speaker 2: if you're lucky, maybe a day. Like, a month 805 00:37:29,840 --> 00:37:32,719 Speaker 2: is just ridiculous. But in your work, 806 00:37:32,760 --> 00:37:34,520 Speaker 2: that horizon — has it broadened? 807 00:37:34,600 --> 00:37:35,399 Speaker 4: It has. Yeah.
808 00:37:35,440 --> 00:37:37,480 Speaker 5: I think one of the things — for people who are 809 00:37:37,520 --> 00:37:39,600 Speaker 5: aware of HRT at all, I think there is still 810 00:37:39,640 --> 00:37:42,520 Speaker 5: a perception, sort of a pre-twenty-twenty perception, 811 00:37:42,560 --> 00:37:44,640 Speaker 5: that we are purely a high frequency trading firm. But 812 00:37:44,680 --> 00:37:47,160 Speaker 5: we would say we are both a high frequency and a medium 813 00:37:47,200 --> 00:37:49,520 Speaker 5: frequency trading firm, and it's a big part of 814 00:37:49,520 --> 00:37:51,919 Speaker 5: our business. One way to think about it, I think, 815 00:37:51,960 --> 00:37:53,560 Speaker 5: is that if I really have a view on 816 00:37:53,600 --> 00:37:55,719 Speaker 5: what a stock should be in, like, five days' time — 817 00:37:56,360 --> 00:37:58,120 Speaker 5: let's say I want to buy that stock — I'm going 818 00:37:58,200 --> 00:38:01,879 Speaker 5: to acquire that stock over time. And maybe it's, what's 819 00:38:01,880 --> 00:38:03,960 Speaker 5: the best time to buy that stock over the five- 820 00:38:04,040 --> 00:38:05,560 Speaker 5: day period? Well, I have a model that tells me 821 00:38:05,640 --> 00:38:08,720 Speaker 5: the best price in an hour. So maybe the shorter- 822 00:38:08,840 --> 00:38:11,640 Speaker 5: term model should inform the longer-term trade, cascading 823 00:38:11,640 --> 00:38:12,319 Speaker 5: all the way down.
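A sketch of that cascade — my illustration under invented assumptions, not HRT's models: the slow, five-day view sets a target position, and hypothetical faster predictors (pred_1h, pred_1s) decide how urgently to work each slice of the trade.

def child_order_size(target_5d: float, position: float,
                     pred_1h: float, pred_1s: float) -> float:
    """Size the next slice of a five-day trade using faster signals.

    target_5d: desired position from the slow, five-day model
    pred_1h/pred_1s: predicted returns from faster models (sign matters)
    """
    remaining = target_5d - position
    if remaining == 0:
        return 0.0
    urgency = 0.1  # baseline participation per decision step
    # If the faster models say the price is about to move away from the
    # remaining order (up when buying, down when selling), trade sooner.
    if pred_1h * remaining > 0:
        urgency += 0.2
    if pred_1s * remaining > 0:
        urgency += 0.2
    return remaining * min(urgency, 1.0)

print(child_order_size(target_5d=10_000, position=2_000,
                       pred_1h=0.003, pred_1s=0.0001))  # trade aggressively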
824 00:38:12,640 --> 00:38:16,040 Speaker 2: When you're doing this sort of slightly longer term, or 825 00:38:16,080 --> 00:38:20,439 Speaker 2: slightly slower, frequency trading, is the fundamental job still the same — 826 00:38:20,480 --> 00:38:23,880 Speaker 2: which is, you're in the liquidity provision service business, just 827 00:38:23,960 --> 00:38:27,560 Speaker 2: over a longer warehousing horizon — or 828 00:38:27,600 --> 00:38:29,600 Speaker 2: does it — because when I think of a fund, 829 00:38:29,840 --> 00:38:31,879 Speaker 2: when I think of a hedge fund, I certainly don't 830 00:38:31,880 --> 00:38:34,600 Speaker 2: think of — maybe to some extent some of their strategies 831 00:38:34,640 --> 00:38:37,480 Speaker 2: might be sort of liquidity provision; it's more directional. Is 832 00:38:37,520 --> 00:38:40,680 Speaker 2: it still that, or does the fundamental reason why you 833 00:38:40,719 --> 00:38:43,600 Speaker 2: make money — the service you provide — by definition 834 00:38:43,760 --> 00:38:44,880 Speaker 2: change over that horizon? 835 00:38:44,880 --> 00:38:47,759 Speaker 5: I think the market-making service provision does break down. 836 00:38:47,760 --> 00:38:49,719 Speaker 5: I think that stretches the analogy too far. I think you 837 00:38:49,760 --> 00:38:51,759 Speaker 5: have to think of it as liquidity taking, which 838 00:38:51,840 --> 00:38:56,239 Speaker 5: somehow seems more aggressive or something. But we're 839 00:38:56,280 --> 00:38:59,279 Speaker 5: trading against orders resting on the book. Someone was like, 840 00:38:59,320 --> 00:39:00,759 Speaker 5: I want to sell this stock, and we're like, we 841 00:39:00,760 --> 00:39:03,000 Speaker 5: will buy it from you, because we think that in 842 00:39:03,040 --> 00:39:04,680 Speaker 5: the long run it will be worth doing. And so 843 00:39:04,719 --> 00:39:07,160 Speaker 5: we do cross the spread and we do pay that 844 00:39:07,280 --> 00:39:09,600 Speaker 5: transaction cost. Sometimes, you know, you can also kind of 845 00:39:09,760 --> 00:39:13,640 Speaker 5: acquire a position by market making, but with a tilt. So really, 846 00:39:13,640 --> 00:39:15,560 Speaker 5: at the longer horizons, I think the sort of market- 847 00:39:15,560 --> 00:39:18,400 Speaker 5: making service analogy does break down. But in some sense 848 00:39:18,560 --> 00:39:20,719 Speaker 5: there's always a counterparty, and they wanted to trade for 849 00:39:20,800 --> 00:39:23,640 Speaker 5: a reason. And I think a mental model that — I 850 00:39:23,680 --> 00:39:25,440 Speaker 5: don't know, you tell me if this sounds too 851 00:39:25,600 --> 00:39:26,680 Speaker 5: wishy-washy. 852 00:39:26,360 --> 00:39:28,360 Speaker 2: Oh, we love a mental model. 853 00:39:28,400 --> 00:39:31,839 Speaker 5: Yeah. You mentioned Go, chess, right? So the thing about 854 00:39:31,880 --> 00:39:34,839 Speaker 5: those is that they're zero-sum games — there's only 855 00:39:34,880 --> 00:39:38,360 Speaker 5: one winner. Truly, someone's unhappy: 856 00:39:38,400 --> 00:39:41,360 Speaker 5: someone wins, and the loser is maybe equally unhappy — plus one, minus one. 857 00:39:42,040 --> 00:39:44,719 Speaker 5: I think the reason that trading works is because it 858 00:39:44,760 --> 00:39:48,239 Speaker 5: is in some sense positive sum. You know, money is conserved — 859 00:39:48,400 --> 00:39:50,400 Speaker 5: and I guess a little fee goes to the exchange, so 860 00:39:50,440 --> 00:39:52,160 Speaker 5: in some sense money at that moment of a 861 00:39:52,239 --> 00:39:57,680 Speaker 5: trade is actually negative a little — but utility, people's general 862 00:39:57,760 --> 00:40:00,560 Speaker 5: happiness, isn't. I don't know — my paycheck goes into my four-oh- 863 00:40:00,600 --> 00:40:03,920 Speaker 5: one-k provider and it buys some ETFs. I'm relatively 864 00:40:03,960 --> 00:40:06,440 Speaker 5: insensitive to how exactly that happens. I'm just 865 00:40:06,440 --> 00:40:09,040 Speaker 5: not gonna look at it for, I don't know, forty years, right? No. 866 00:40:10,280 --> 00:40:11,879 Speaker 4: I try not to look at it, especially lately. 867 00:40:12,160 --> 00:40:14,360 Speaker 5: But yeah, the utility — my utility is at a 868 00:40:14,440 --> 00:40:16,440 Speaker 5: very long horizon, and so if someone sells it to me 869 00:40:16,520 --> 00:40:19,080 Speaker 5: at, like, one cent different, I don't really care. 870 00:40:19,400 --> 00:40:21,080 Speaker 5: The person who made the cent is happy, and 871 00:40:21,120 --> 00:40:23,120 Speaker 5: I'm happy because I got good liquidity and didn't cross a 872 00:40:23,200 --> 00:40:27,280 Speaker 5: huge spread. So that is kind of why I think 873 00:40:27,760 --> 00:40:29,319 Speaker 5: it all kind of makes sense, and why people are 874 00:40:29,320 --> 00:40:32,000 Speaker 5: trading together. But it's also why thinking about markets 875 00:40:32,040 --> 00:40:34,080 Speaker 5: in an AlphaGo sense doesn't make sense — it 876 00:40:34,120 --> 00:40:37,080 Speaker 5: kind of doesn't really apply. If you thought of markets 877 00:40:37,120 --> 00:40:40,200 Speaker 5: as HRT and all our competitors all in 878 00:40:40,239 --> 00:40:43,000 Speaker 5: some sort of deathmatch — who's the smartest, who's trying 879 00:40:43,000 --> 00:40:45,359 Speaker 5: to pick each other off — then markets would be kind 880 00:40:45,360 --> 00:40:47,200 Speaker 5: of like this giant standoff where no one would be trading.
881 00:40:47,200 --> 00:40:49,440 Speaker 5: Everyone would kind of be waiting. But obviously markets 882 00:40:49,440 --> 00:40:51,880 Speaker 5: are very vibrant. I think it's because even when we're 883 00:40:51,960 --> 00:40:54,600 Speaker 5: crossing the spread, we're crossing against someone who 884 00:40:54,600 --> 00:40:57,680 Speaker 5: wanted to sell for whatever reason. If we're right, I 885 00:40:57,680 --> 00:41:00,479 Speaker 5: guess in five days' time they might be less happy, but. 886 00:41:00,400 --> 00:41:02,280 Speaker 4: Maybe they weren't, actually — maybe they were just hedging 887 00:41:02,320 --> 00:41:03,600 Speaker 4: a position. They don't care what. 888 00:41:03,560 --> 00:41:06,080 Speaker 5: The stock's price is in five days. They just wanted to 889 00:41:06,120 --> 00:41:07,960 Speaker 5: hedge their position, and we traded with them. So 890 00:41:08,280 --> 00:41:10,480 Speaker 5: that's the way I reconcile it in my head. 891 00:41:10,480 --> 00:41:12,760 Speaker 5: But it can still be a sort of service 892 00:41:12,800 --> 00:41:16,000 Speaker 5: provision: we make money only because someone else wants to trade. 893 00:41:16,200 --> 00:41:18,320 Speaker 4: If no one was trading, we wouldn't exist, right. 894 00:41:18,440 --> 00:41:21,840 Speaker 3: And different market participants with different motivations and goals and aims. 895 00:41:22,120 --> 00:41:24,439 Speaker 3: I want to go back to the talent question — yeah — 896 00:41:24,480 --> 00:41:27,279 Speaker 3: for a second. I get the sense that engineers 897 00:41:27,640 --> 00:41:31,640 Speaker 3: like open source, and they like contributing to the research 898 00:41:31,880 --> 00:41:34,480 Speaker 3: ecosystem on AI. And then I get the sense that 899 00:41:34,520 --> 00:41:38,280 Speaker 3: trading firms probably do not like open source, and they're 900 00:41:38,440 --> 00:41:42,680 Speaker 3: much more into protecting their proprietary models or data or whatever. 901 00:41:43,160 --> 00:41:45,920 Speaker 3: How does a company like HRT — how do you actually 902 00:41:45,920 --> 00:41:46,920 Speaker 3: balance that tension? 903 00:41:47,280 --> 00:41:47,520 Speaker 4: Yeah. 904 00:41:47,560 --> 00:41:49,279 Speaker 5: I mean, the really 905 00:41:49,360 --> 00:41:52,799 Speaker 5: honest answer is that, many years ago, this was a 906 00:41:52,880 --> 00:41:56,640 Speaker 5: relative, comparative disadvantage for us in recruiting. We often 907 00:41:56,640 --> 00:41:59,560 Speaker 5: had conversations with, maybe especially, PhDs who were graduating, and 908 00:41:59,600 --> 00:42:01,520 Speaker 5: they would say, well, I can go to Google and 909 00:42:01,560 --> 00:42:03,560 Speaker 5: I can still publish my research, and that kind of 910 00:42:03,560 --> 00:42:04,480 Speaker 5: gives me optionality. 911 00:42:04,520 --> 00:42:05,719 Speaker 4: People will know who I am. 912 00:42:06,239 --> 00:42:09,040 Speaker 5: If I go into an HRT-like firm, I 913 00:42:09,080 --> 00:42:12,319 Speaker 5: essentially go behind this veil and I never emerge, and 914 00:42:12,400 --> 00:42:13,759 Speaker 5: people just have to kind of take it on faith that 915 00:42:13,760 --> 00:42:16,399 Speaker 5: I did smart things for many years. And I would 916 00:42:16,400 --> 00:42:19,799 Speaker 5: have basically no strong counterargument, apart from the fact 917 00:42:19,840 --> 00:42:22,399 Speaker 5: that actually writing papers is kind of overrated.
I've been there, 918 00:42:22,600 --> 00:42:25,200 Speaker 5: done that; when you get older you will not care. 919 00:42:26,239 --> 00:42:29,879 Speaker 5: Now, though, there's this interesting situation where that golden era — 920 00:42:30,000 --> 00:42:31,360 Speaker 5: maybe — of being able to work at 921 00:42:31,360 --> 00:42:33,160 Speaker 5: a big tech company and be paid for public research 922 00:42:33,239 --> 00:42:36,000 Speaker 5: is very much over. The papers that do come out 923 00:42:36,040 --> 00:42:38,359 Speaker 5: of the big AI labs are essentially either 924 00:42:38,360 --> 00:42:42,560 Speaker 5: very stale or not important, and if you're working 925 00:42:42,640 --> 00:42:44,959 Speaker 5: on the most important cutting-edge things, you can't share 926 00:42:45,000 --> 00:42:47,600 Speaker 5: what you're doing, and it's very secretive. So in some sense 927 00:42:47,600 --> 00:42:50,080 Speaker 5: the problem solved itself a little bit for me, and 928 00:42:50,120 --> 00:42:53,279 Speaker 5: people now recognize that IP should be protected. I've even 929 00:42:53,360 --> 00:42:56,399 Speaker 5: seen some of the sort of AI lab people think 930 00:42:56,400 --> 00:43:00,040 Speaker 5: out loud about non-competes in public — tweeting 931 00:43:00,040 --> 00:43:01,200 Speaker 5: about non-competes. 932 00:43:00,880 --> 00:43:03,120 Speaker 4: And things, which is an amazing turn of events, because 933 00:43:03,120 --> 00:43:03,600 Speaker 4: I feel like. 934 00:43:03,640 --> 00:43:04,520 Speaker 2: That was very antithetical. 935 00:43:05,040 --> 00:43:07,680 Speaker 5: I mean, they're literally, effectively banned in the state 936 00:43:07,719 --> 00:43:10,120 Speaker 5: of California, and I think people were almost proud 937 00:43:10,160 --> 00:43:12,640 Speaker 5: of this fact, and would kind of hold it 938 00:43:12,680 --> 00:43:15,600 Speaker 5: against the New York sort of trading world, being like, oh, 939 00:43:15,600 --> 00:43:17,719 Speaker 5: look at these people with their non-competes and things. 940 00:43:18,080 --> 00:43:20,680 Speaker 5: And then someone comes along and pays one hundred million 941 00:43:20,680 --> 00:43:24,719 Speaker 5: dollars or whatever for your researchers, and a lot 942 00:43:24,719 --> 00:43:26,879 Speaker 5: of that money is being paid for talent, but it's 943 00:43:26,920 --> 00:43:29,759 Speaker 5: also in some sense paying for intellectual property. 944 00:43:30,200 --> 00:43:31,360 Speaker 4: And like, those people. 945 00:43:31,200 --> 00:43:34,960 Speaker 5: Know how the soup is made, and they are not 946 00:43:35,000 --> 00:43:38,400 Speaker 5: writing it down and not committing any explicit sort of 947 00:43:38,480 --> 00:43:41,440 Speaker 5: IP theft. But if you hire five people who've been 948 00:43:41,440 --> 00:43:43,920 Speaker 5: making the soup, they know — they know a lot of 949 00:43:43,960 --> 00:43:47,960 Speaker 5: process knowledge, and you might suddenly feel a little differently 950 00:43:48,040 --> 00:43:51,000 Speaker 5: about protecting that. We spend a lot of time training 951 00:43:51,000 --> 00:43:53,600 Speaker 5: our employees; it takes a long time for them to be productive. 952 00:43:54,600 --> 00:43:56,200 Speaker 5: In some sense, it would be a shame if people 953 00:43:56,200 --> 00:43:58,360 Speaker 5: could just take that knowledge and immediately leave. 954 00:43:58,600 --> 00:44:02,239 Speaker 4: And so, yeah, just.
955 00:44:02,200 --> 00:44:05,680 Speaker 3: going back to the steamroller — I promised, I promised we would. 956 00:44:05,719 --> 00:44:09,319 Speaker 3: When I hear AI in trading, or — I know people 957 00:44:09,360 --> 00:44:13,319 Speaker 3: are very excited about agent-based AI nowadays — part of 958 00:44:13,320 --> 00:44:17,200 Speaker 3: me thinks back to one of the more amusing events 959 00:44:17,239 --> 00:44:20,120 Speaker 3: in financial history, which is — Joe, I'm sure you remember 960 00:44:20,120 --> 00:44:22,960 Speaker 3: at the time — when one of Knight Capital's algos. 961 00:44:23,200 --> 00:44:24,880 Speaker 2: They would not find that to be amusing. Yeah, it 962 00:44:24,920 --> 00:44:27,960 Speaker 2: was the worst nightmare possible for them — but amusing for 963 00:44:28,000 --> 00:44:29,960 Speaker 2: the peanut gallery, right. 964 00:44:29,480 --> 00:44:32,880 Speaker 3: Right — schadenfreude. So this algo went rogue and bought like 965 00:44:32,960 --> 00:44:37,319 Speaker 3: seven billion dollars' worth of stuff. Yeah, exactly. What are 966 00:44:37,320 --> 00:44:40,799 Speaker 3: the guardrails that you put in place to avoid the 967 00:44:40,840 --> 00:44:42,240 Speaker 3: destiny of Knight Capital? 968 00:44:42,440 --> 00:44:45,759 Speaker 5: So every training cycle we have a talk about the 969 00:44:45,840 --> 00:44:48,440 Speaker 5: Knightmare, with a K, and we have multiple ex- 970 00:44:48,560 --> 00:44:51,239 Speaker 5: Knight employees at HRT, as you might expect, just from 971 00:44:51,280 --> 00:44:53,920 Speaker 5: the lineage of a successful trading firm that ended in 972 00:44:54,040 --> 00:44:56,279 Speaker 5: a kind of unhappy way. We have many people 973 00:44:56,320 --> 00:44:57,200 Speaker 5: who were at Knight. 974 00:44:57,239 --> 00:44:59,439 Speaker 2: That story is crazy — a successful trading firm that ended 975 00:44:59,440 --> 00:45:00,719 Speaker 2: in about fifteen minutes. Yeah. 976 00:45:00,800 --> 00:45:04,720 Speaker 5: Yeah. So it's fair to say that that stuff haunts us, 977 00:45:04,880 --> 00:45:07,359 Speaker 5: and we try and take as many lessons away from 978 00:45:07,360 --> 00:45:11,120 Speaker 5: that as possible. Defense in layers. So I think one 979 00:45:11,160 --> 00:45:12,840 Speaker 5: of the things that I'd like to emphasize with the 980 00:45:12,840 --> 00:45:15,040 Speaker 5: AI stuff in particular is that it is not like 981 00:45:15,120 --> 00:45:18,680 Speaker 5: there's some neural network directly sending orders to NYSE. It 982 00:45:18,719 --> 00:45:23,320 Speaker 5: is in some sense providing a plan, and then traditional, human, 983 00:45:23,600 --> 00:45:28,040 Speaker 5: heavily audited, risk-checked layers take the actions, and that's 984 00:45:28,160 --> 00:45:30,319 Speaker 5: just kind of how it has to be. And so 985 00:45:30,920 --> 00:45:33,759 Speaker 5: for us, on an operational, day- 986 00:45:33,760 --> 00:45:36,160 Speaker 5: to-day basis, it's just many, many layers of sanity 987 00:45:36,239 --> 00:45:38,520 Speaker 5: checking throughout the day. And then at a sort of 988 00:45:38,600 --> 00:45:42,239 Speaker 5: high level it's a very careful process, including processes to specifically 989 00:45:42,320 --> 00:45:45,879 Speaker 5: avoid the KCG-type scenario: how are you even 990 00:45:46,000 --> 00:45:49,239 Speaker 5: releasing new versions, and what pre-release checks do you run?
991 00:45:49,320 --> 00:45:52,759 Speaker 5: And audits. And even during the day we have 992 00:45:52,840 --> 00:45:54,360 Speaker 5: some — I don't know, I guess you'd call them 993 00:45:54,480 --> 00:45:57,840 Speaker 5: sanity checks of the neural networks — to make sure that 994 00:45:57,880 --> 00:45:59,879 Speaker 5: they are producing the values that we expected they would 995 00:45:59,880 --> 00:46:02,600 Speaker 5: be producing. And those sorts of checking processes run kind 996 00:46:02,600 --> 00:46:04,680 Speaker 5: of a little bit behind, because they can't keep up 997 00:46:04,719 --> 00:46:06,920 Speaker 5: with the full flow, but they're fast enough to kind 998 00:46:06,920 --> 00:46:10,440 Speaker 5: of, again, check the numeric stability 999 00:46:10,480 --> 00:46:13,279 Speaker 5: of the models and things. It's not — it's not 1000 00:46:13,320 --> 00:46:15,279 Speaker 5: about losing money or making money on the day, because it's 1001 00:46:15,280 --> 00:46:18,000 Speaker 5: not, like, oh, risk in the kind of financial sense — 1002 00:46:18,000 --> 00:46:21,000 Speaker 5: it's operational risk. But the paranoia is deep, and that's 1003 00:46:21,000 --> 00:46:23,920 Speaker 5: probably something that's still very different, I think, in this 1004 00:46:24,080 --> 00:46:27,360 Speaker 5: market from the sort of other AI world, where I 1005 00:46:27,360 --> 00:46:29,920 Speaker 5: guess anything goes and failure rates are kind of 1006 00:46:29,960 --> 00:46:32,399 Speaker 5: just priced in. Yeah, you could 1007 00:46:32,440 --> 00:46:35,120 Speaker 5: imagine just ruining everything. And I guess we worry about 1008 00:46:35,160 --> 00:46:38,600 Speaker 5: losing money, but I think we worry more about taking 1009 00:46:38,600 --> 00:46:41,680 Speaker 5: an action that a regulator would not want us to take, 1010 00:46:42,280 --> 00:46:44,879 Speaker 5: because if you lose that trust of regulators, you lose 1011 00:46:44,880 --> 00:46:46,719 Speaker 5: it for a very long time. And we trade in 1012 00:46:46,760 --> 00:46:49,320 Speaker 5: a lot of markets and we pay very close attention — 1013 00:46:49,440 --> 00:46:52,040 Speaker 5: I have deep respect for the regulators and their 1014 00:46:52,040 --> 00:46:54,480 Speaker 5: decisions in all those markets — and the rules are sometimes 1015 00:46:54,560 --> 00:46:56,920 Speaker 5: very complex, and man, do we watch that stuff like 1016 00:46:56,920 --> 00:46:58,359 Speaker 5: a hawk, because, you know, you don't want to be kicked out 1017 00:46:58,400 --> 00:47:01,319 Speaker 5: of a country for making an operational error. And there 1018 00:47:01,440 --> 00:47:04,200 Speaker 5: is a very low-tolerance culture from regulators in terms 1019 00:47:04,200 --> 00:47:07,200 Speaker 5: of making mistakes. So we stress it a lot, and 1020 00:47:07,400 --> 00:47:09,400 Speaker 5: I think we should, because it's the profit 1021 00:47:09,440 --> 00:47:11,560 Speaker 5: you make in ten years by still being in the 1022 00:47:11,600 --> 00:47:13,879 Speaker 5: game versus move fast and break things. It's not move 1023 00:47:13,960 --> 00:47:16,040 Speaker 5: fast and break things — but we still want to move fast.
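A minimal sketch of that kind of intraday sanity check — my illustration, not HRT's tooling: trail the live model's outputs, and trip a kill switch on numeric trouble or on a running mean that drifts too far from what validation predicted.

import math
from collections import deque

class OutputMonitor:
    def __init__(self, expected_mean: float, expected_std: float,
                 window: int = 10_000, z_limit: float = 6.0):
        self.mu, self.sigma, self.z_limit = expected_mean, expected_std, z_limit
        self.recent = deque(maxlen=window)

    def ok(self, value: float) -> bool:
        """Return False to halt trading on this model's output stream."""
        if not math.isfinite(value):  # NaN/inf: numeric instability
            return False
        self.recent.append(value)
        n = len(self.recent)
        running_mean = sum(self.recent) / n
        # z-score of the running mean against the validated distribution
        z = abs(running_mean - self.mu) / (self.sigma / math.sqrt(n))
        return z < self.z_limit

As he notes, a checker like this can lag the live flow slightly; it is guarding against operational breakage, not making trading decisions.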
1024 00:47:16,280 --> 00:47:19,120 Speaker 2: I have like a million more questions, but for the 1025 00:47:19,160 --> 00:47:21,480 Speaker 2: sake of time, I'll just ask one more. And I 1026 00:47:21,520 --> 00:47:24,760 Speaker 2: don't even know whether it's something you're in a 1027 00:47:24,840 --> 00:47:26,640 Speaker 2: great position to answer. It's something I actually want 1028 00:47:26,680 --> 00:47:29,160 Speaker 2: to do an entire episode about at some point. But, 1029 00:47:30,120 --> 00:47:33,920 Speaker 2: as you would characterize it, what happens in the second 1030 00:47:34,040 --> 00:47:36,839 Speaker 2: after a jobs report is released? And what I'm talking 1031 00:47:36,880 --> 00:47:40,160 Speaker 2: about specifically is: numbers either flash on the screen or 1032 00:47:40,200 --> 00:47:42,759 Speaker 2: a piece of text appears on a website, and 1033 00:47:42,920 --> 00:47:46,560 Speaker 2: markets move around a lot, all that. And there's people — 1034 00:47:46,880 --> 00:47:49,120 Speaker 2: then suddenly it's, actually, the jobs report was good, and 1035 00:47:49,160 --> 00:47:50,600 Speaker 2: if you actually look at the wage number and then 1036 00:47:50,600 --> 00:47:53,160 Speaker 2: the six — but in that instant, in that first micro- 1037 00:47:53,280 --> 00:47:56,520 Speaker 2: second after the release, markets are already moving, certainly before 1038 00:47:56,560 --> 00:47:59,479 Speaker 2: any human has had a chance to read the thing 1039 00:47:59,719 --> 00:48:02,800 Speaker 2: or review it. So what I assume is that there's 1040 00:48:02,920 --> 00:48:05,800 Speaker 2: training on, here's the text and here are the things, 1041 00:48:05,840 --> 00:48:08,120 Speaker 2: and whatever. But as you would put it, or 1042 00:48:08,120 --> 00:48:11,520 Speaker 2: from the perspective of HRT, what happens in the 1043 00:48:11,520 --> 00:48:13,120 Speaker 2: millisecond after an event? 1044 00:48:13,480 --> 00:48:16,080 Speaker 5: Yeah, so, I mean, we have a 1045 00:48:16,120 --> 00:48:19,200 Speaker 5: Bloomberg headlines feed that's pretty low latency, and 1046 00:48:19,239 --> 00:48:21,920 Speaker 5: if it's an important article there's like a star in 1047 00:48:21,960 --> 00:48:24,040 Speaker 5: the feed, things like this, right. And you can do 1048 00:48:24,080 --> 00:48:26,799 Speaker 5: everything from having kind of hand-crafted logic to 1049 00:48:26,800 --> 00:48:30,040 Speaker 5: look for keywords, through to putting it through an 1050 00:48:30,080 --> 00:48:33,720 Speaker 5: AI model. One of the things that I still can't 1051 00:48:34,840 --> 00:48:37,080 Speaker 5: quite wrap my head around is — I guess, 1052 00:48:37,080 --> 00:48:40,480 Speaker 5: without saying specific company names — there are options trading firms 1053 00:48:41,040 --> 00:48:46,600 Speaker 5: that have thousands of people that are essentially cyborg-trading options. 1054 00:48:47,200 --> 00:48:50,080 Speaker 5: They have maybe ten people trading options for a 1055 00:48:50,160 --> 00:48:54,719 Speaker 5: single big stock, like Nvidia, and they are humans 1056 00:48:54,760 --> 00:48:57,640 Speaker 5: staring at the feeds for these things and clicking buttons, 1057 00:48:57,719 --> 00:48:59,520 Speaker 5: and they have user interfaces that are set up for 1058 00:48:59,600 --> 00:49:01,120 Speaker 5: them to hit the green button 1059 00:49:00,840 --> 00:49:04,360 Speaker 4: or the red button, essentially, very fast. It's weird. We 1060 00:49:04,400 --> 00:49:06,840 Speaker 4: actually, for a hackathon.
1061 00:49:06,880 --> 00:49:09,399 Speaker 5: We got a PlayStation controller and kind of gave people 1062 00:49:09,480 --> 00:49:12,359 Speaker 5: a chance to try and practice reacting to events. It's 1063 00:49:12,360 --> 00:49:16,520 Speaker 5: really tough, but it's a learnable skill. I think, in 1064 00:49:16,560 --> 00:49:20,200 Speaker 5: an efficient-market sense, this should be AI-able. It 1065 00:49:20,239 --> 00:49:22,480 Speaker 5: is challenging, though, because if you imagine kind of 1066 00:49:22,560 --> 00:49:25,160 Speaker 5: plumbing it into ChatGPT, it would be too slow — 1067 00:49:25,520 --> 00:49:27,120 Speaker 5: the latency would probably be sufficiently high. 1068 00:49:27,120 --> 00:49:29,279 Speaker 4: I mean, it's not that fast, right. 1069 00:49:29,280 --> 00:49:31,239 Speaker 5: It's fast for any normal day-to-day thing, but 1070 00:49:31,239 --> 00:49:34,040 Speaker 5: for markets it's kind of slow. Also — and this is 1071 00:49:34,080 --> 00:49:36,600 Speaker 5: a very interesting research challenge — you can't 1072 00:49:36,960 --> 00:49:40,240 Speaker 5: literally use ChatGPT to backtest anything. It knows 1073 00:49:40,320 --> 00:49:44,319 Speaker 5: every Jerome Powell speech, and knows what happened afterwards, because 1074 00:49:44,320 --> 00:49:46,640 Speaker 5: it's trained on the whole internet. So how do you 1075 00:49:46,719 --> 00:49:49,960 Speaker 5: really get confidence that for the next Federal Reserve speech 1076 00:49:50,000 --> 00:49:53,600 Speaker 5: it's going to do the right thing? Traditionally in finance 1077 00:49:53,640 --> 00:49:55,000 Speaker 5: you backtest things to see how you'd have done 1078 00:49:55,000 --> 00:49:57,200 Speaker 5: in the past. But in this case it's all 1079 00:49:57,280 --> 00:50:00,279 Speaker 5: kind of in-sample — it's seen it all before. And 1080 00:50:00,360 --> 00:50:02,640 Speaker 5: I've seen academic finance papers that try to 1081 00:50:02,760 --> 00:50:04,920 Speaker 5: grapple with this, and they say it still works — they 1082 00:50:04,920 --> 00:50:06,960 Speaker 5: try and account for this — but, I don't know, this 1083 00:50:06,960 --> 00:50:09,200 Speaker 5: stuff really is that smart. The whole kind of 1084 00:50:09,239 --> 00:50:12,640 Speaker 5: thesis is that it's memorized everything it's been trained on, 1085 00:50:13,320 --> 00:50:14,960 Speaker 5: so why would it be reliable? 1086 00:50:15,200 --> 00:50:16,799 Speaker 4: And so if you see someone being like. 1087 00:50:16,719 --> 00:50:20,000 Speaker 5: Oh, I ran every Federal Reserve speech through ChatGPT 1088 00:50:20,080 --> 00:50:22,080 Speaker 5: and it got it right like nine out of ten times — 1089 00:50:22,120 --> 00:50:24,719 Speaker 5: it's like, only nine out of ten times? Why 1090 00:50:24,719 --> 00:50:28,279 Speaker 5: not one hundred percent?
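A sketch of the look-ahead problem he's describing — the dates and cutoff below are hypothetical: the only evaluation events that prove anything are the ones strictly after the model's training cutoff, and for a model trained on "the whole internet" that excludes essentially every historical Fed speech.

from datetime import date

TRAINING_CUTOFF = date(2024, 6, 1)  # hypothetical LLM training cutoff

def clean_eval_set(events: list[dict]) -> list[dict]:
    """Keep only events the model cannot simply have memorized."""
    return [e for e in events if e["date"] > TRAINING_CUTOFF]

events = [
    {"date": date(2020, 3, 15), "desc": "emergency rate cut"},  # in-sample
    {"date": date(2025, 1, 29), "desc": "FOMC statement"},      # out-of-sample
]
print(clean_eval_set(events))  # only the post-cutoff event survives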
So I do 1091 00:50:28,320 --> 00:50:30,120 Speaker 5: think that it is interesting how many 1092 00:50:30,200 --> 00:50:34,000 Speaker 5: humans are still involved in relatively high-speed trading. There 1093 00:50:34,040 --> 00:50:37,200 Speaker 5: are a lot of people still doing this, in sort 1094 00:50:37,200 --> 00:50:39,640 Speaker 5: of niche products. And it's presumably because it's very hard 1095 00:50:39,680 --> 00:50:43,480 Speaker 5: to integrate all the information. Is it AGI in — I don't 1096 00:50:43,520 --> 00:50:45,600 Speaker 5: know — twenty twenty-eight, twenty thirty? I don't know, there's 1097 00:50:45,600 --> 00:50:48,160 Speaker 5: still a lot of humans trading stocks and options, and, 1098 00:50:48,200 --> 00:50:49,640 Speaker 5: like, I don't know how to reconcile that, but 1099 00:50:49,760 --> 00:50:50,560 Speaker 5: I think about that. 1100 00:50:50,840 --> 00:50:53,200 Speaker 4: When I read about it. 1101 00:50:53,040 --> 00:50:55,720 Speaker 2: Ian Dunning, that was fantastic. There really are like hours 1102 00:50:55,760 --> 00:51:00,279 Speaker 2: more of conversation, so we can have you back next week, for next 1103 00:51:00,280 --> 00:51:02,719 Speaker 2: week's episode. But no, that was great. Thank you for 1104 00:51:02,719 --> 00:51:03,680 Speaker 2: coming on — really appreciate it. 1105 00:51:03,760 --> 00:51:17,880 Speaker 6: Yeah, a pleasure. Thank you, Tracy. 1106 00:51:17,960 --> 00:51:20,400 Speaker 2: I thought that was really great. I like this idea, 1107 00:51:20,480 --> 00:51:22,560 Speaker 2: this sort of anti-cynicism, because you do hear a 1108 00:51:22,600 --> 00:51:25,360 Speaker 2: lot of people say, oh no, AI could solve 1109 00:51:25,440 --> 00:51:28,200 Speaker 2: things like chess or whatever, but the stock market is 1110 00:51:28,239 --> 00:51:31,759 Speaker 2: fundamentally different. And I've never been totally satisfied with some 1111 00:51:31,800 --> 00:51:34,960 Speaker 2: of the theories for why. And, like, I get, stocks 1112 00:51:35,000 --> 00:51:37,839 Speaker 2: are not necessarily a solvable problem in quite 1113 00:51:37,840 --> 00:51:40,520 Speaker 2: the same way. But humans make money in the market 1114 00:51:40,640 --> 00:51:44,400 Speaker 2: by matching patterns. Why can't smart silicon brains do the 1115 00:51:44,400 --> 00:51:44,799 Speaker 2: same thing? 1116 00:51:45,400 --> 00:51:48,400 Speaker 3: Well, there's also history. Now we have many years of 1117 00:51:48,560 --> 00:51:51,640 Speaker 3: HFT and, yeah, algorithmically driven trading where people have 1118 00:51:51,760 --> 00:51:54,080 Speaker 3: made a lot of money. So it seems to be working. 1119 00:51:54,560 --> 00:51:57,720 Speaker 3: The light bulb moment for me was where Ian talked 1120 00:51:57,760 --> 00:52:00,560 Speaker 3: about the timeframe and the importance of the timeframe, 1121 00:52:00,640 --> 00:52:04,120 Speaker 3: and I think that's really the key. In many ways, 1122 00:52:04,160 --> 00:52:07,880 Speaker 3: it's adapting what you're doing with AI to the data 1123 00:52:07,920 --> 00:52:11,040 Speaker 3: that's available, and the data on markets — most of it 1124 00:52:11,080 --> 00:52:13,880 Speaker 3: is going to be very short term: more seconds 1125 00:52:13,880 --> 00:52:16,759 Speaker 3: than minutes, more minutes than days, et cetera, et cetera. 1126 00:52:17,440 --> 00:52:19,759 Speaker 3: And a lot of the data is also biased to 1127 00:52:20,040 --> 00:52:24,879 Speaker 3: immediacy versus past analysis, which he spoke about as well. 1128 00:52:25,360 --> 00:52:28,480 Speaker 2: It is always funny in finance — people are like, oh, 1129 00:52:28,719 --> 00:52:33,160 Speaker 2: seventeen out of nineteen times there's been this death cross 1130 00:52:33,160 --> 00:52:35,279 Speaker 2: of the S&P five hundred, stocks went down. 1131 00:52:35,360 --> 00:52:38,720 Speaker 2: It's like, any serious data scientist will spit at that sample. 1132 00:52:38,719 --> 00:52:41,960 Speaker 2: It's beyond a joke to talk about a 1133 00:52:42,040 --> 00:52:43,360 Speaker 2: sample size of nineteen.
1134 00:52:43,840 --> 00:52:47,920 Speaker 3: Yeah, but death cross in a headline — it's so tempting. 1135 00:52:48,040 --> 00:52:50,920 Speaker 2: That's true. My advice to journalists: never 1136 00:52:51,000 --> 00:52:53,360 Speaker 2: pass up a chance to put death cross in a headline. I 1137 00:52:53,440 --> 00:52:56,640 Speaker 2: thought a few things were interesting. One is, 1138 00:52:56,880 --> 00:52:59,719 Speaker 2: I was glad to hear that the wire-length problem 1139 00:52:59,800 --> 00:53:01,719 Speaker 2: is no more — yeah, it's not just a race to get 1140 00:53:01,800 --> 00:53:02,480 Speaker 2: closer to the exchange. 1141 00:53:02,560 --> 00:53:04,359 Speaker 3: It was kind of boring when people were talking about 1142 00:53:04,400 --> 00:53:06,680 Speaker 3: the cold war in HFT and all of that. 1143 00:53:06,960 --> 00:53:11,000 Speaker 2: It's interesting that the GPU market has eased versus where 1144 00:53:11,000 --> 00:53:12,600 Speaker 2: it may have been a couple of years ago. And 1145 00:53:12,600 --> 00:53:15,600 Speaker 2: it's interesting that even at the scale of a good trading shop, 1146 00:53:16,040 --> 00:53:19,279 Speaker 2: electricity is proving to be a main constraint, which 1147 00:53:19,360 --> 00:53:22,160 Speaker 2: does raise questions about, are we just going to hit 1148 00:53:22,239 --> 00:53:25,680 Speaker 2: up against a wall, given some of the AI plans 1149 00:53:25,760 --> 00:53:28,320 Speaker 2: that so many people are banking on for the chatbots. 1150 00:53:28,400 --> 00:53:31,640 Speaker 3: Yeah. I thought also, I guess, the cultural shift in 1151 00:53:31,680 --> 00:53:33,759 Speaker 3: some of the labs — yeah, it was really interesting, this 1152 00:53:33,840 --> 00:53:38,760 Speaker 3: idea that they've become more proprietary, and perhaps more mysterious 1153 00:53:38,960 --> 00:53:43,399 Speaker 3: in some ways, rather than the trading firms becoming more open. Yeah. 1154 00:53:43,560 --> 00:53:47,719 Speaker 2: Lots of great conversation. Answered some questions — yeah — and plenty more to go. 1155 00:53:48,040 --> 00:53:50,560 Speaker 3: That was helpful, and I'm sure we'll talk to him again — 1156 00:53:50,719 --> 00:53:53,960 Speaker 3: maybe not next week, but soon, next year. All right, 1157 00:53:54,000 --> 00:53:55,719 Speaker 3: shall we leave it there? Let's leave it there. This 1158 00:53:55,760 --> 00:53:58,200 Speaker 3: has been another episode of the Odd Lots podcast. I'm 1159 00:53:58,239 --> 00:54:01,120 Speaker 3: Tracy Alloway. You can follow me at Tracy Alloway. 1160 00:54:01,000 --> 00:54:03,760 Speaker 2: And I'm Joe Weisenthal. You can follow me at The Stalwart. 1161 00:54:03,920 --> 00:54:07,000 Speaker 2: Follow our guest Ian Dunning — he's at Ian Dunning. Follow 1162 00:54:07,040 --> 00:54:10,440 Speaker 2: our producers: Carmen Rodriguez at Carmen Arman, Dashiell Bennett 1163 00:54:10,440 --> 00:54:13,760 Speaker 2: at Dashbot, and Kail Brooks at Kail Brooks. For more Odd Lots content, 1164 00:54:13,800 --> 00:54:15,840 Speaker 2: go to Bloomberg dot com slash odd lots, with the 1165 00:54:15,920 --> 00:54:18,440 Speaker 2: daily newsletter and all of our episodes, and you can 1166 00:54:18,520 --> 00:54:20,719 Speaker 2: chat about all of these topics twenty-four seven in 1167 00:54:20,880 --> 00:54:24,120 Speaker 2: our Discord: discord dot gg slash odd lots.
1168 00:54:24,239 --> 00:54:26,680 Speaker 3: And if you enjoy Odd Lots — if you like it 1169 00:54:26,760 --> 00:54:29,720 Speaker 3: when we dive into how companies are actually using AI — 1170 00:54:29,880 --> 00:54:32,360 Speaker 3: then please leave us a positive review on your favorite 1171 00:54:32,360 --> 00:54:36,080 Speaker 3: podcast platform. And remember, if you are a Bloomberg subscriber, 1172 00:54:36,120 --> 00:54:39,360 Speaker 3: you can listen to all of our episodes absolutely ad free. 1173 00:54:39,600 --> 00:54:41,680 Speaker 3: All you need to do is find the Bloomberg channel 1174 00:54:41,719 --> 00:54:44,880 Speaker 3: on Apple Podcasts and follow the instructions there. Thanks for 1175 00:54:44,920 --> 00:54:45,280 Speaker 3: listening.