Mike: Hello, and welcome to What Goes Up, a weekly markets podcast. My name is Mike Regan. I'm a senior editor at Bloomberg.

Vildana: And I'm Vildana Hajric, a cross-asset reporter with Bloomberg.

Mike: This week on the show: is the proverbial Fed put dead, or is the strike price just so low that it may as well be? That's the question on everyone's mind this week after Jerome Powell signaled that the recent volatility in the stock market will not be enough to force the central bank to amend its plans to raise interest rates and shrink its balance sheet. What does it mean for markets? We'll get into it with the chief investment officer of a firm that focuses on artificial intelligence. But first, the two things I have to tell you. First of all, my crazy thing this week is custom-made for you, so I expect big things from you on this. Secondly, condolences to you and Jeffrey Gundlach and all the other Buffalo Bills fans out there. I know this is a bad week, so bad that it's Vildana's turn to hide under a blanket during the podcast this week.
Mike: And Vildana, I think if there were a pure-play company that makes folding tables, I would have shorted them this week. What do you think?

Vildana: Yeah, it's a bad time for Buffalo Bills fans, for all of western New York. I was depressed on Sunday and on Monday. I'm depressed today. I don't know how I'm going to get through this, but thank you, I appreciate it. At least everybody watched the game. Record numbers of people watched it, and now everybody's discussing overtime rules for the NFL, which is kind of cool.

Mike: Yeah, but by everyone, you mean the fans of the team that lost.

Vildana: Hey, I saw Larry David was talking about it too. And they've already amended it once, so we'll see. One can hope. Maybe they'll let them play that game over.

Mike: But I'm excited about this guest this week. We haven't talked to him in a while. He's got a new firm he's involved with. Tell us about him. Bring him into the show. Let's get this going.

Vildana: Yeah, it's Max Gokhman.
Vildana: He's the chief investment officer at AlphaTrAI. And Max, I want to welcome you back to the show.

Max: Thanks so much for having me. It's awesome to be back.

Mike: Max, it's been a while, and I don't think I've talked to you since you started this new gig. And I'm going to confess something here, Max. Whenever I hear someone talk about artificial intelligence in the investing world, what I do is nod my head and stroke my chin as if I know what's going on. But just between us, don't let this get out, I really have no idea what they're talking about. I'm hoping you can give us all your takes on the market in the show, but first let's get into that notion. What are you guys up to? What is going on with AI-focused investing at the moment? What are the applications, and where do you see it going in the future? And if you don't mind, the for-dummies version would probably be best, at least for me. I don't know about Vildana.

Max: Sure. Well, I'm sure Vildana would understand the intricacies of things like recursive neural networks and, you know, transformers. But Mike, for you, I'll dumb it down.

Mike: Yeah, thank you. I appreciate that.

Max: So, first of all, it's really important to differentiate artificial intelligence from the broader quant ecosystem, because a lot of folks know about quant funds. Systematic strategies have trillions of dollars under management. AI, as in true artificial intelligence, is very rare, and in fact, that's why I left a pretty strong position to join this plucky startup to lead the new frontier. True artificial intelligence is really based on the concept of a neural network, and a neural network is kind of what it sounds like. It consists of neurons, just like your brain, and those neurons fire in different ways to process information, kind of the way the human mind processes it, but very differently from how a traditional systematic strategy processes it.
Max: The best way to think about it is this: your typical quantitative strategy, even if it uses machine learning, which is kind of part of that AI phraseology but not quite what we consider authentic AI, has a formula, and that formula is designed to systematize a way of thinking or an economic theory. That formula has a bunch of factors, and those factors have coefficients, and the machine learning part will, at a high level, adjust the coefficients. So if we think value is going to do well, it will get a positive coefficient; if we think value is going to do not so well, it could get a negative coefficient. But you're still basically making a decision, in that example, to go long or short the value factor. An unsupervised deep learning network can actually create its own features. It can create its own view on what a factor is. It does not have to be this preconceived notion of value, size, momentum, quality, et cetera. And that's a key thing about true AI strategies: they can adapt on their own without human intervention.
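The contrast Max draws can be sketched in a few lines of code. This is a hypothetical illustration, not AlphaTrAI's actual approach: the first function is the classic quant setup, a hand-built formula whose coefficients a learner merely re-tunes, while the second lets a tiny tied-weights autoencoder derive its own features from raw returns. All names, dimensions, and numbers are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
returns = rng.normal(0, 0.01, size=(250, 20))   # 250 days x 20 assets (synthetic)

def factor_score(exposures, coefficients):
    """Classic quant: a fixed formula (score = exposures @ coefficients).
    Machine learning only re-tunes the coefficients, e.g. flipping the
    sign on 'value' when it is expected to underperform."""
    return exposures @ coefficients

exposures = rng.normal(size=(20, 3))            # preconceived factors: value, size, momentum
coeffs = np.array([1.0, 0.5, -0.3])             # human-chosen structure, learner-tuned weights
scores = factor_score(exposures, coeffs)

def learned_features(x, n_hidden=3, steps=500, lr=0.01):
    """'Authentic AI' flavor: an unsupervised autoencoder squeezes the raw
    return matrix through a bottleneck, inventing its own 'factors'
    instead of starting from value/size/momentum."""
    w = rng.normal(0, 0.1, size=(x.shape[1], n_hidden))
    for _ in range(steps):
        h = np.tanh(x @ w)                      # encode
        recon = h @ w.T                         # decode (tied weights)
        grad = x.T @ ((recon - x) @ w * (1 - h**2)) + (recon - x).T @ h
        w -= lr * grad / len(x)                 # gradient step on reconstruction error
    return np.tanh(x @ w)                       # machine-derived features

features = learned_features(returns)
print(scores.shape, features.shape)             # (20,) (250, 3)
```

The point of the sketch is only the structural difference: in the first block a human picked the factors and the machine tuned three numbers; in the second the machine chose the features themselves.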
Max: They're also going to be unbiased. And this is something where, I think, a lot of times when people think about quant funds, they say, well, quant funds don't have bias, and that's not quite true. They don't have emotion: a formula doesn't really care if the market's down fifty or up. But it will have a bias, because it's based on, again, systematizing a way of thinking, and that thinking is inherently biased because, as I said, it was created by a human. An AI model actually does not have that bias in it, if it's an authentic AI model. And that's something that I think is going to be even more important as we get into these markets, which we'll of course discuss, where things are changing quite rapidly and we're going to see patterns emerge that we haven't seen before. You're going to need something that can adapt on its own, and adapt quite quickly, and that is to me a key hallmark of authentic artificial intelligence.
Vildana: So before we get to talking about the market, Max, I always really like talking with you because you're up for talking about lots of different things, including cryptocurrencies. And I believe you guys are working on an AI program for a crypto fund, if you can tell us a little bit about that.

Max: Yeah. So this is currently kind of our cutting-edge R&D effort. We think that AI and digital assets, and I want to go beyond just saying cryptocurrency, because I think that actually is a narrower universe, and I know you like things like Pudgy Penguins and NFTs, so those are all tradable assets within the digital asset ecosystem, and we think that AI is a natural way to analyze those assets and trade them, in part because it's a brand-new asset class. This is something that a lot of folks don't realize. I've seen a lot of traders who were good in currencies, or good in cross-asset, kind of jump into the crypto world, but they bring in those same concepts, and they don't necessarily work in the crypto space.
Max: For instance, a lot of resistance and support bands in the currency markets are based on central bank purchases or companies that are hedging their cost of goods. That doesn't exist in crypto. So when you see a support or resistance level, it's actually driven by different factors. When we think about AI as well, it really feeds on data, of course. That's something I probably should have said earlier: AI's source of brain food is data. The higher the quality of the data, the smarter the AI will be able to be, and data on the blockchain is very clean and therefore highly nutritious to an artificial intelligence engine. In addition to that, you have really low efficiencies in the market. So high-quality data, low efficiency, and then, sprinkled on top of that, really high volatility: that is kind of the most fertile soil, if you will, for AI to derive alpha from. In addition to that, you've got really unique parts within the crypto space, such as the actual code bases that a lot of these protocols are developed in, and those are things where one bit of code reading another bit of code, making a decision on it and analyzing it, is actually something that opens up even more doors. So we think there's a lot of really amazing opportunities. I can't talk too much about the specifics, but next time I'm on, hopefully we'll be able to share a lot more. But we do think that AI and digital assets are a perfect marriage.

Mike: Max, let's bring it into the present tense a little bit with what we saw this week in the markets. I'm curious how AI would work with sort of an event-risk type of situation like we saw this week with the Fed. I mean, I know there's been a lot of work done on natural language processing, trying to have computers listen to or read text and make a decision based on that. Where is that in the big scheme of things with AI? Is that still a little too tricky to do?
Is 173 00:09:56,160 --> 00:09:58,480 Speaker 1: it something that they need sort of the uh, you know, 174 00:09:58,600 --> 00:10:00,760 Speaker 1: the the organic intelligence of a guy like you to 175 00:10:02,040 --> 00:10:04,360 Speaker 1: handle that end of it. I mean, you know, can 176 00:10:04,480 --> 00:10:07,400 Speaker 1: a I listened to Jerome palell and and pick up 177 00:10:07,400 --> 00:10:10,719 Speaker 1: the nuances and and that or is it just you know, 178 00:10:10,800 --> 00:10:13,520 Speaker 1: a matter of reacting to the market signals as he's speaking. 179 00:10:14,679 --> 00:10:16,920 Speaker 1: It's a great question. There's going to be different answers 180 00:10:16,960 --> 00:10:18,640 Speaker 1: depending on who you talk to. I know, on on 181 00:10:18,840 --> 00:10:22,880 Speaker 1: one end, there's one fund that thinks they can actually 182 00:10:23,040 --> 00:10:26,120 Speaker 1: train a camera. Well, they basically put a camera on 183 00:10:26,400 --> 00:10:31,160 Speaker 1: Powell live and they try to to construct the nuance 184 00:10:31,400 --> 00:10:35,360 Speaker 1: of his physical body language and the nu once of 185 00:10:35,480 --> 00:10:39,880 Speaker 1: his speech inflections to figure out what he's doing. Yeah, now, 186 00:10:40,000 --> 00:10:42,840 Speaker 1: I they when I asked him how well about work? 187 00:10:42,920 --> 00:10:46,280 Speaker 1: They kind of got quiet, so I'll just leave it 188 00:10:46,360 --> 00:10:49,040 Speaker 1: at that. Well, if Pal's got a pretty good poker face, 189 00:10:49,120 --> 00:10:51,360 Speaker 1: so he might not be the best, he's he's gotten 190 00:10:51,400 --> 00:10:54,920 Speaker 1: better since let's let's let's let's let's let's put it there. 
191 00:10:55,200 --> 00:10:57,080 Speaker 1: I think I think Marrio track he might have been 192 00:10:57,080 --> 00:10:59,240 Speaker 1: a better test case so that you know a little 193 00:10:59,280 --> 00:11:01,480 Speaker 1: more out of me, Yeah, a little more. You know 194 00:11:01,600 --> 00:11:05,640 Speaker 1: his Italian. So there's there's that. But I do think 195 00:11:05,720 --> 00:11:07,360 Speaker 1: when it comes to things like n LP, it's a 196 00:11:07,440 --> 00:11:12,040 Speaker 1: question of the decay and what I mean by decays 197 00:11:12,640 --> 00:11:16,400 Speaker 1: between when the word is uttered and transcripted to when 198 00:11:16,440 --> 00:11:19,000 Speaker 1: the market reacts. What's the lag and can you actually 199 00:11:19,080 --> 00:11:22,480 Speaker 1: capture that? A lot of times the answer is no, 200 00:11:23,200 --> 00:11:25,360 Speaker 1: So we actually haven't found a lot of utility and 201 00:11:25,760 --> 00:11:29,480 Speaker 1: kind of real time n LP because a real trader, 202 00:11:29,800 --> 00:11:32,240 Speaker 1: who is you know, able to hit buy or sell 203 00:11:32,320 --> 00:11:35,439 Speaker 1: as soon as they here even like half of the 204 00:11:35,520 --> 00:11:38,360 Speaker 1: word is going to be a little bit quicker than 205 00:11:39,240 --> 00:11:43,640 Speaker 1: you know a program will be. It's fascinating area of 206 00:11:43,800 --> 00:11:46,560 Speaker 1: research because I feel like, you know, you could read 207 00:11:46,600 --> 00:11:50,320 Speaker 1: the text of a speech and think one thing where 208 00:11:50,400 --> 00:11:53,920 Speaker 1: you could see someone recite the same speech with different 209 00:11:53,920 --> 00:11:56,679 Speaker 1: inflection and sort of different emotions and and come away 210 00:11:56,720 --> 00:11:59,360 Speaker 1: with something something different. 
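The decay Max describes can be made concrete with a toy model. This is a sketch with invented numbers, not any fund's actual data: assume a headline-driven move gets absorbed along an exponential path, so the return still capturable shrinks with your transcription-plus-inference latency.

```python
def capturable_move(total_move_bps, half_life_s, latency_s):
    """Fraction of a headline-driven price move (in basis points) left to
    capture after `latency_s` seconds, assuming the market absorbs the
    move with the given half-life. All parameters are illustrative."""
    remaining = 0.5 ** (latency_s / half_life_s)
    return total_move_bps * remaining

# Suppose a hawkish phrase moves a rate future 20 bps with a 2-second half-life.
human = capturable_move(20, 2.0, 0.5)     # trader firing on half a word
pipeline = capturable_move(20, 2.0, 4.0)  # transcribe -> NLP model -> order
print(round(human, 1), round(pipeline, 1))  # 16.8 5.0
```

Under these made-up parameters the human keeps most of the move while the NLP pipeline arrives after three quarters of it is gone, which is the "haven't found a lot of utility in real-time NLP" point in miniature.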
Mike: So it's a fascinating area of research, I think.

Max: Yeah, and I think there are a lot of words that especially central bankers use that are almost a dead vocabulary; it's not a normal person's vocabulary. And as Fed watchers, as economists, we kind of know what they mean when they say words like "we're going to be humble," or "we're going to be persistent," for instance. In different contexts those could be very hawkish or very dovish. And a traditional NLP model is just going to say this is positive or negative sentiment. It's not going to be quite as good unless you train, and this is actually where we get into some interesting stuff, and I have seen this work, where you train a model on a specific person. You can do that with central bankers. I actually had this idea originally back in, I think... and at the time I wasn't able to really figure out how to do it. But what I was going to do back then was train a model on Bernanke and Draghi and say, okay, can we figure out, based on their specific vocabulary and their specific mannerisms, if they're being hawkish or dovish? Back then it definitely didn't work. Perhaps in the future, as compute increases, we can really isolate that, and that will be a very interesting situation, where you have algorithms making those split-second decisions. But we're not quite there yet.

Vildana: So speaking about this week, I was hoping you could just go over briefly and very quickly your takeaways from the Fed meeting, and I wanted to ask you if you think the market is correctly interpreting what happened with Powell this week.

Max: I think the market has been having a really hard time getting its sea legs this January, right? I mean, just look at the action we experienced right after the meeting: we had the really big drop, then overnight we started dropping further, and then came back up. I don't think Powell said anything really dramatically new.
Max: In fact, you and I were talking about that right after the meeting. To me, it seemed like just the hope was gone, right? That was the initial reaction: okay, the Powell put is out of the money, it may be completely off the table. And initially that was an "oh crap, the training wheels are off" situation for investors. But then overnight, all of a sudden, we got some newfound hope, or newfound confidence, that the markets can do well on their own. So to stick with the training-wheels example, it's almost like you take your kid and send them down the hill, and they start really wobbly and think they're going to fall, and then eventually they start pedaling and they're like, oh, look at me, I'm going. But at no point did Powell say anything that, to me, was different, that was unexpected. They've done a really good job of telegraphing what they were going to do.
Max: And the thing that I always watch for is the Fed speakers on the fringes, either the most hawkish or the most dovish, to see if they start changing their tune. And they did: the most dovish Fed speakers and FOMC members got a lot more hawkish in the weeks coming up to this meeting. So it shouldn't have been a surprise that the Fed said, yep, we're going to hike in March, we're going to end QE, and we're going to look at tightening. That's pretty much what I think everyone should have expected.

Mike: Yeah, and maybe they did, and that's why we saw such volatility before the meeting; everyone kind of priced it in. But Max, how are you thinking about the rest of the year now, especially in the context of, well, if Treasuries and stocks are selling off together, that sixty-forty portfolio maybe is not as diversified as it once was.
Mike: You know, I think this is an issue people have been talking about for a year or more now, but it certainly seems to be an urgent topic now. Is it possible we'll see sort of simultaneous weakness in both the bond and the stock market, and how do you play that?

Max: Yeah. So first of all, the answer is yes, I think that's likely. That's actually my base case: that we will see weakness in stocks for at least some part of this year. I do think we'll eventually be a little bit higher than where we started, but we will likely see a bull-market correction, and we'll also see rates go up. And you know, the last time rates were in a secular uptrend, I think Vildana and I weren't even born yet. So Mike, Mike had been around for decades. But full disclosure, I was asked to give Mike some zingers before this show started. I'll try to get them in where I can.

Mike: I remember putting my money in the bank and earning something on it. Those were the days.

Max: What's a bank? Is that like a physical Ethereum wallet?

Mike: I would roll up the quarters and nickels and dimes in the little paper wrappers from my paper route, ride my bike down, and fill out a little paper deposit slip, and the interest was good. Those were the days. Bring back those five percent savings deposit accounts.

Vildana: Max, he's talking about the early...

Max: Yeah, I think it's kind of a Dickensian tale that Mike was spinning, a little Oliver Twist. Artificial intelligence back then was, like, if you got the cheat code to Donkey Kong. That was about as far as it went. I do think, with that, you may want to be careful what you wish for, because I think we could get those five percent interest rates, but that also, I believe, came with something like a twelve percent mortgage.

Mike: That's fair. Fair point. Those of us who locked in low mortgages are pretty happy at this point.

Max: Probably. I do think we are going to see this secular rising-rate environment come back, and that really poses a challenge not only for sixty-forty, but for the thing that kind of supplanted sixty-forty in a lot of institutional portfolios, which is risk parity. And while risk-parity folks will tell you it's very sophisticated, having run risk-parity strategies before, I can tell you that, generally speaking, it still relies on the crucial concept that bonds go up when stocks go down, and duration risk diversifies equity risk. If that is no longer the case, you need to create something different. I think it's going to be really important to be more dynamic, and what I mean by dynamic is in terms of asset classes. The concept of a balanced portfolio is really important. If we break it down to its basic building blocks, it's a risk asset that can go up and produce capital gains, and then a diversifying asset that maybe produces a little bit of income and steady returns but primarily is there to hedge the risky asset.
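The assumption Max says risk parity leans on is easy to monitor. A minimal sketch with synthetic return series, not real market data: a rolling stock-bond correlation flags when the "bonds rally as stocks fall" regime stops holding, which is the condition under which both sixty-forty and risk parity lose their hedge.

```python
import numpy as np

rng = np.random.default_rng(1)

def rolling_corr(a, b, window):
    """Trailing correlation of two return series over `window` observations."""
    out = []
    for i in range(window, len(a) + 1):
        out.append(np.corrcoef(a[i - window:i], b[i - window:i])[0, 1])
    return np.array(out)

n = 500
stocks = rng.normal(0.0004, 0.01, n)
# First half: bonds move against stocks (the classic hedge).
# Second half: bonds move with stocks (the regime Max is warning about).
bonds = np.concatenate([-0.4 * stocks[: n // 2], 0.4 * stocks[n // 2:]])
bonds += rng.normal(0, 0.003, n)

corr = rolling_corr(stocks, bonds, window=60)
print(corr[0] < 0, corr[-1] > 0)  # True True
```

On this contrived series the rolling correlation starts deeply negative and ends positive; a portfolio that only re-checks the stock-bond relationship annually would discover the regime change well after its "diversifier" had stopped diversifying.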
So 341 00:19:09,200 --> 00:19:12,359 Speaker 1: what those two components are, I think that's gonna be 342 00:19:12,400 --> 00:19:16,600 Speaker 1: more dynamic going forward, and the new All Weather strategies 343 00:19:16,640 --> 00:19:18,800 Speaker 1: are going to be playing with those concepts. So they 344 00:19:18,840 --> 00:19:23,760 Speaker 1: may hold equities and bonds like current strategies do. They 345 00:19:23,760 --> 00:19:28,200 Speaker 1: may also hold some amount of, um, commodities like some 346 00:19:28,320 --> 00:19:31,679 Speaker 1: risk parity strategies, but they may at various points hold totally 347 00:19:31,720 --> 00:19:34,400 Speaker 1: different things. They may actually hold some amount of crypto 348 00:19:34,560 --> 00:19:37,760 Speaker 1: and some amount of, you know, loans, and some amount 349 00:19:38,000 --> 00:19:42,639 Speaker 1: of stocks, and those asset classes will have to keep varying. 350 00:19:42,640 --> 00:19:44,560 Speaker 1: It's gonna be a bit of a musical chairs strategy. 351 00:19:45,000 --> 00:19:46,959 Speaker 1: And I know it sounds a lot more complex than 352 00:19:47,000 --> 00:19:51,600 Speaker 1: it is, but the unfortunate byproduct of our reality is 353 00:19:51,680 --> 00:19:54,480 Speaker 1: things do get more complex with time, and if you 354 00:19:54,680 --> 00:19:58,960 Speaker 1: stick to, you know, your traditional approaches, I think you're 355 00:19:59,119 --> 00:20:03,520 Speaker 1: more liable to actually suffer long term and not achieve 356 00:20:03,560 --> 00:20:06,560 Speaker 1: your objectives as an institutional investor. Do, do, uh, 357 00:20:06,720 --> 00:20:10,080 Speaker 1: commodities play a role in in diversification these days, or has 358 00:20:10,119 --> 00:20:11,960 Speaker 1: that ship already sailed, did you already miss 359 00:20:12,000 --> 00:20:15,560 Speaker 1: that boat? I I think they do. But again it's tricky. 360 00:20:15,880 --> 00:20:17,520 Speaker 1: You know.
I think a lot of times people say, oh, 361 00:20:17,520 --> 00:20:21,200 Speaker 1: commodities are a good inflation hedge, and in my research 362 00:20:21,280 --> 00:20:24,359 Speaker 1: I found that commodities are good as a hedge against 363 00:20:24,400 --> 00:20:30,760 Speaker 1: inflation shocks, but they're not necessarily a hedge against steadily 364 00:20:30,960 --> 00:20:33,920 Speaker 1: rising inflation. And there's a lot of factors that figure 365 00:20:33,960 --> 00:20:37,000 Speaker 1: into commodities, especially once you start actively trading around them, 366 00:20:37,800 --> 00:20:42,240 Speaker 1: that are very idiosyncratic. So, you know, OPEC doesn't necessarily 367 00:20:42,320 --> 00:20:44,760 Speaker 1: care about your inflation hedge, your your income story, when 368 00:20:44,800 --> 00:20:48,040 Speaker 1: they decide, however, to raise or lower supplies. And, you know, 369 00:20:48,119 --> 00:20:52,359 Speaker 1: when people in OPEC cheat and actually ignore the supply constraints, 370 00:20:52,440 --> 00:20:54,639 Speaker 1: then, you know, it creates a whole different situation. And 371 00:20:55,000 --> 00:20:56,720 Speaker 1: you know, then we have things like, well, if we 372 00:20:56,840 --> 00:20:59,840 Speaker 1: get the infrastructure bill passed, that will certainly send some 373 00:21:00,000 --> 00:21:05,359 Speaker 1: commodities up higher. But conversely, if people stop, you know, 374 00:21:05,480 --> 00:21:08,280 Speaker 1: spending as much on restaurants, that will send other commodities down. 375 00:21:08,359 --> 00:21:12,720 Speaker 1: So so it's a more complex ecosystem, and I think 376 00:21:12,920 --> 00:21:14,800 Speaker 1: looking at it at a very high level, like as 377 00:21:14,880 --> 00:21:18,000 Speaker 1: just a single asset class, is not a good idea. 378 00:21:18,000 --> 00:21:22,080 Speaker 1: I think commodities are pretty idiosyncratic.
So talking about the 379 00:21:22,359 --> 00:21:25,200 Speaker 1: high level, I wanted to ask you about some maybe 380 00:21:25,280 --> 00:21:29,520 Speaker 1: positive catalysts for the stock market, because the idea is, uh, 381 00:21:29,640 --> 00:21:32,800 Speaker 1: sort of that Powell and the Fed can move sooner 382 00:21:32,960 --> 00:21:35,639 Speaker 1: and quicker in terms of hikes this year because the 383 00:21:35,800 --> 00:21:39,760 Speaker 1: economic backdrop is strong, right. And then the second 384 00:21:39,800 --> 00:21:41,800 Speaker 1: part of this is, and I've heard Gina Martin Adams 385 00:21:41,840 --> 00:21:44,280 Speaker 1: talking about this on different interviews that she was giving 386 00:21:44,359 --> 00:21:47,680 Speaker 1: this week, is earnings and how that might sort of 387 00:21:47,800 --> 00:21:53,240 Speaker 1: help support stocks at least in the near term. Yeah, 388 00:21:53,440 --> 00:21:58,040 Speaker 1: I think that the Fed starting earlier is good. There's 389 00:21:58,160 --> 00:22:01,440 Speaker 1: obviously some folks who think that the Fed is starting late, 390 00:22:02,760 --> 00:22:05,119 Speaker 1: and I I personally don't think they're starting late. I 391 00:22:05,160 --> 00:22:07,119 Speaker 1: think they're starting at the right time, and I think 392 00:22:07,160 --> 00:22:11,119 Speaker 1: they're signaling it well. So kudos to, uh, 393 00:22:11,480 --> 00:22:15,600 Speaker 1: to Chair Powell on, I think, navigating this about as 394 00:22:15,680 --> 00:22:20,040 Speaker 1: well as possible. I do think that earnings are also 395 00:22:20,119 --> 00:22:22,119 Speaker 1: going to be a very important catalyst, and it's a 396 00:22:22,200 --> 00:22:25,399 Speaker 1: question of not so much what the Street expects, but 397 00:22:25,840 --> 00:22:29,000 Speaker 1: what investors expected. We've seen a bit of a divergence there.
398 00:22:29,119 --> 00:22:32,240 Speaker 1: So if we look at the most recent kind of, 399 00:22:32,359 --> 00:22:34,920 Speaker 1: you know, Q four earnings that have been released so far, 400 00:22:35,560 --> 00:22:39,520 Speaker 1: I think, um, generally speaking, beats were punished a little 401 00:22:39,520 --> 00:22:43,879 Speaker 1: bit and losses were punished severely. And that's an important concept, right. 402 00:22:43,920 --> 00:22:46,520 Speaker 1: So that means even the beats on average were not 403 00:22:47,320 --> 00:22:50,160 Speaker 1: quite what investors expected. And that was my fear going 404 00:22:50,240 --> 00:22:53,879 Speaker 1: into this reporting season, is that the actual investor expectations 405 00:22:53,920 --> 00:22:58,000 Speaker 1: were higher than what we were going to see. So 406 00:22:58,160 --> 00:23:00,439 Speaker 1: then the question is, as we go further and as 407 00:23:00,440 --> 00:23:03,640 Speaker 1: we get into, you know, like April, are Q one 408 00:23:03,840 --> 00:23:07,399 Speaker 1: expectations going to be more in line. That's gonna be, 409 00:23:07,560 --> 00:23:10,240 Speaker 1: again using that training wheels analogy, the market has to 410 00:23:10,320 --> 00:23:12,800 Speaker 1: pedal on its own now. The Fed is not going 411 00:23:12,920 --> 00:23:16,240 Speaker 1: to come in. I also think it's important to note 412 00:23:16,359 --> 00:23:19,159 Speaker 1: that FAIT works both ways, and I'm talking about 413 00:23:19,320 --> 00:23:22,880 Speaker 1: flexible average inflation targeting. So just as the Fed was comfortable 414 00:23:22,880 --> 00:23:26,359 Speaker 1: with inflation running over the two percent line for a 415 00:23:26,400 --> 00:23:29,000 Speaker 1: while, they will actually be comfortable with inflation running 416 00:23:29,040 --> 00:23:33,000 Speaker 1: below that.
So Powell made it very clear when he 417 00:23:33,080 --> 00:23:36,919 Speaker 1: spoke this week that he cares about the labor market 418 00:23:37,720 --> 00:23:43,400 Speaker 1: and not asset prices, asset prices inflating upwards being a byproduct 419 00:23:43,840 --> 00:23:46,000 Speaker 1: of the stimulus that needed to be done to help 420 00:23:46,040 --> 00:23:48,120 Speaker 1: the labor market. And I think investors need to really 421 00:23:48,200 --> 00:23:51,040 Speaker 1: internalize that, because I don't think that's a bluff. 422 00:24:05,720 --> 00:24:08,520 Speaker 1: All right, Max, you know, Vildana has a blanket on 423 00:24:08,560 --> 00:24:11,520 Speaker 1: her head for this podcast. I'm gonna put my tinfoil 424 00:24:11,600 --> 00:24:14,359 Speaker 1: hat on my head for a minute and uh and 425 00:24:14,440 --> 00:24:17,160 Speaker 1: tell me what you think of this, uh, conspiracy theory 426 00:24:17,200 --> 00:24:19,280 Speaker 1: of mine. Actually it's not really a conspiracy theory. I 427 00:24:19,359 --> 00:24:21,880 Speaker 1: don't really believe this is a conspiracy, but I could 428 00:24:22,000 --> 00:24:28,400 Speaker 1: see sort of the monetary and political outcomes, uh, coming 429 00:24:28,440 --> 00:24:32,000 Speaker 1: out like this. And that is, um, Jerome Powell comes 430 00:24:32,040 --> 00:24:35,639 Speaker 1: out this week. He's very hawkish. Um, we've obviously had 431 00:24:35,680 --> 00:24:38,520 Speaker 1: this correction in the stock market. The year goes on, 432 00:24:38,680 --> 00:24:41,920 Speaker 1: we get into the summer, we start lapping the comparisons 433 00:24:42,000 --> 00:24:44,800 Speaker 1: to last year, and we see that maybe inflation has 434 00:24:44,840 --> 00:24:47,440 Speaker 1: cooled off, at least come off the boil, off the 435 00:24:47,520 --> 00:24:51,760 Speaker 1: seven percent.
That allows Powell to get a little more dovish, 436 00:24:51,880 --> 00:24:54,800 Speaker 1: you know, a little less hawkish perhaps, uh, in the 437 00:24:54,920 --> 00:25:00,880 Speaker 1: summer, early fall. Markets rallying again. Uh, by the time 438 00:25:00,920 --> 00:25:05,280 Speaker 1: the midterm election comes around, everyone's forgotten about this 439 00:25:05,480 --> 00:25:09,720 Speaker 1: ugly spell in the markets, and politically it's beneficial to 440 00:25:09,960 --> 00:25:13,240 Speaker 1: Biden and the Democrats. In theory. Again, I don't really 441 00:25:13,280 --> 00:25:15,200 Speaker 1: believe that's a conspiracy theory, but I could see it 442 00:25:15,280 --> 00:25:18,480 Speaker 1: playing out along those lines. But I'm just curious, you know, 443 00:25:18,880 --> 00:25:20,520 Speaker 1: what do you think of that scenario, and what do 444 00:25:20,600 --> 00:25:22,720 Speaker 1: you think about the midterms in general? I know 445 00:25:22,840 --> 00:25:26,640 Speaker 1: they're traditionally, uh, kind of a weak year for equities. Um, 446 00:25:27,200 --> 00:25:29,240 Speaker 1: you know, is that going to be the case this year? 447 00:25:29,320 --> 00:25:31,399 Speaker 1: Is there is there political risk that we're gonna have 448 00:25:31,440 --> 00:25:34,640 Speaker 1: to deal with later in the year or what? Well, 449 00:25:34,760 --> 00:25:37,639 Speaker 1: I think the second part of your question is the easiest. Yes, well, 450 00:25:37,680 --> 00:25:40,800 Speaker 1: you will have to deal with political risk with midterms around. 451 00:25:41,240 --> 00:25:44,080 Speaker 1: They are very important midterms.
I don't want to 452 00:25:44,119 --> 00:25:46,479 Speaker 1: be that guy who comes on, you know, every time 453 00:25:46,560 --> 00:25:48,399 Speaker 1: there's elections, says this is the most important election of 454 00:25:48,640 --> 00:25:51,399 Speaker 1: our time, because, you know, it's always easy to make 455 00:25:51,480 --> 00:25:53,359 Speaker 1: that comment. I don't think this is this is the 456 00:25:53,400 --> 00:25:56,080 Speaker 1: most important midterm election of our time. I do think 457 00:25:56,520 --> 00:26:00,240 Speaker 1: it's an interesting one, as someone, you know, who's 458 00:26:00,240 --> 00:26:03,040 Speaker 1: a bit of a political wonk and also a market participant 459 00:26:03,080 --> 00:26:05,720 Speaker 1: who trades on that political information. I think there's some 460 00:26:05,960 --> 00:26:08,680 Speaker 1: unique features which are well known, which is, one, you 461 00:26:08,760 --> 00:26:12,880 Speaker 1: obviously have a very tight margin, and that margin needs 462 00:26:12,920 --> 00:26:16,359 Speaker 1: to be maintained by the Democrats.
And I think with 463 00:26:16,600 --> 00:26:18,840 Speaker 1: that, you also have these, you know, two outliers in 464 00:26:18,920 --> 00:26:22,400 Speaker 1: the Senate who need to come in line at some point, 465 00:26:22,520 --> 00:26:25,119 Speaker 1: and I think they will, because I think at the 466 00:26:25,240 --> 00:26:27,840 Speaker 1: end of the day, they are going to choose their 467 00:26:28,000 --> 00:26:31,760 Speaker 1: party and they're gonna want to keep that majority, because 468 00:26:31,760 --> 00:26:33,480 Speaker 1: they know they don't have friends on the other side, 469 00:26:33,520 --> 00:26:37,159 Speaker 1: even though they've skewed pretty close to it. And so, 470 00:26:37,560 --> 00:26:40,359 Speaker 1: you know, I think Manchin and Sinema are going to 471 00:26:40,560 --> 00:26:43,840 Speaker 1: eventually support Build Back Better, and I think the timing 472 00:26:43,920 --> 00:26:45,440 Speaker 1: of that is going to be important. We know that 473 00:26:45,560 --> 00:26:49,200 Speaker 1: in Washington a lot of times the theatrics are geared 474 00:26:49,280 --> 00:26:53,359 Speaker 1: towards giving you that, like, last minute conclusion. I think 475 00:26:53,400 --> 00:26:55,280 Speaker 1: that's probably what's gonna happen. And I actually still think 476 00:26:55,320 --> 00:26:58,040 Speaker 1: Build Back Better passes. That does, by the way, create 477 00:26:58,080 --> 00:27:01,520 Speaker 1: a little bit of a fiscal put in the markets 478 00:27:01,800 --> 00:27:05,119 Speaker 1: that I don't think is totally priced in yet, but 479 00:27:05,960 --> 00:27:09,800 Speaker 1: just how much that gives us is a bit of 480 00:27:09,840 --> 00:27:12,040 Speaker 1: an open question.
I think there we do want to 481 00:27:12,080 --> 00:27:15,960 Speaker 1: be careful, you know. I wouldn't start overweighting industrials 482 00:27:15,960 --> 00:27:20,200 Speaker 1: and materials and financials just yet, but I do think 483 00:27:20,760 --> 00:27:24,240 Speaker 1: those are the sectors that can do better from from that. 484 00:27:24,440 --> 00:27:26,800 Speaker 1: And and beyond that, you know, there will be other 485 00:27:27,359 --> 00:27:30,840 Speaker 1: related midterm theatrics that are are going to start happening. 486 00:27:31,000 --> 00:27:33,080 Speaker 1: I think we're going to start seeing that more 487 00:27:33,160 --> 00:27:38,600 Speaker 1: towards, you know, kind of early third quarter, um, is 488 00:27:38,680 --> 00:27:43,440 Speaker 1: when things will really start kind of getting interesting from 489 00:27:43,480 --> 00:27:47,240 Speaker 1: a political standpoint. The drama builds. The drama builds. Max, 490 00:27:47,320 --> 00:27:50,240 Speaker 1: I wanted to ask you about some geopolitical risks that 491 00:27:50,320 --> 00:27:53,080 Speaker 1: you're thinking about, because obviously we have a bunch of 492 00:27:53,080 --> 00:27:56,760 Speaker 1: stuff going on around Russia and Ukraine, and so how should 493 00:27:56,800 --> 00:27:59,240 Speaker 1: investors be thinking about that? Do you think that some 494 00:27:59,359 --> 00:28:03,480 Speaker 1: of those risks are properly priced in? And what potentially 495 00:28:03,560 --> 00:28:06,919 Speaker 1: would would an escalation mean for the price of oil 496 00:28:07,000 --> 00:28:10,560 Speaker 1: and inflation and so on?
Sure. So, so, one thing 497 00:28:10,920 --> 00:28:14,000 Speaker 1: that we know about Russia is they tend to be the 498 00:28:14,080 --> 00:28:18,159 Speaker 1: most aggressive in wintertime, because they control the heating 499 00:28:18,240 --> 00:28:22,440 Speaker 1: power for Europe. And when Putin goes on, and, you know, 500 00:28:22,520 --> 00:28:24,680 Speaker 1: this is one one area where it's nice to have 501 00:28:25,119 --> 00:28:27,280 Speaker 1: been born and raised in the country, because I can 502 00:28:27,320 --> 00:28:31,679 Speaker 1: actually listen to Putin speak in Russian to the Russian public, 503 00:28:31,840 --> 00:28:34,960 Speaker 1: and he uses very colorful language, let me just put 504 00:28:35,000 --> 00:28:37,320 Speaker 1: it that way, one that I I'm not going to 505 00:28:37,400 --> 00:28:40,600 Speaker 1: say on the podcast. But but he does talk about 506 00:28:40,600 --> 00:28:45,840 Speaker 1: how he can basically freeze all of the people in Europe, 507 00:28:45,920 --> 00:28:48,360 Speaker 1: and there's some truth to it. That's why you never 508 00:28:48,440 --> 00:28:53,000 Speaker 1: see very meaningful sanctions come out. Now, the biggest concern 509 00:28:53,080 --> 00:28:55,760 Speaker 1: for Putin, of course, is that Ukraine somehow becomes part 510 00:28:55,800 --> 00:28:59,480 Speaker 1: of NATO. So that's the that's really the gamble and 511 00:28:59,520 --> 00:29:03,560 Speaker 1: the risk. I do think that a conflict there is likely, 512 00:29:03,640 --> 00:29:07,400 Speaker 1: unfortunately, in terms of a kinetic conflict, because you just 513 00:29:07,520 --> 00:29:10,760 Speaker 1: have this powder keg on both sides, and even though 514 00:29:10,800 --> 00:29:12,840 Speaker 1: it's cold, it's very dry, in the sense that 515 00:29:12,920 --> 00:29:15,800 Speaker 1: anything could spark it off.
I think for the markets 516 00:29:15,880 --> 00:29:17,560 Speaker 1: that's not going to be as big of a risk 517 00:29:18,320 --> 00:29:21,920 Speaker 1: as what's going on further out east, um, and there 518 00:29:22,200 --> 00:29:25,000 Speaker 1: I'm really thinking about China and Taiwan. So last time 519 00:29:25,640 --> 00:29:31,160 Speaker 1: China flew their jets over Taiwanese airspace, definitely a 520 00:29:31,320 --> 00:29:35,240 Speaker 1: very strong signal. I do think Taiwan has perhaps the 521 00:29:35,400 --> 00:29:39,120 Speaker 1: best defense forces of any small nation, and that defense 522 00:29:39,200 --> 00:29:45,240 Speaker 1: force is called TSMC. Taiwan Semiconductor Manufacturing Company is probably 523 00:29:45,640 --> 00:29:49,880 Speaker 1: the best deterrent, because it is so vital to all 524 00:29:50,040 --> 00:29:53,720 Speaker 1: other nations around the world that no one can actually 525 00:29:53,760 --> 00:29:58,040 Speaker 1: afford to risk China taking over TSMC. And for 526 00:29:58,200 --> 00:30:02,560 Speaker 1: that reason, you know, even as their kind of saber 527 00:30:02,640 --> 00:30:05,600 Speaker 1: rattling gets louder, I still think it's a big tail risk, 528 00:30:05,640 --> 00:30:07,440 Speaker 1: but it is also a tail risk that I don't 529 00:30:07,480 --> 00:30:10,520 Speaker 1: think we can fully discount. So that to me is 530 00:30:10,800 --> 00:30:14,680 Speaker 1: kind of the biggest geopolitical risk. But to take your 531 00:30:14,800 --> 00:30:16,680 Speaker 1: question and look at it at a little higher level, what 532 00:30:16,760 --> 00:30:18,880 Speaker 1: I think is driving a lot of geopolitical risk in 533 00:30:18,920 --> 00:30:21,640 Speaker 1: the last, I'd say, like five years or so, is 534 00:30:21,720 --> 00:30:25,800 Speaker 1: deglobalization. And that's a trend that's been pretty secular.
535 00:30:26,200 --> 00:30:28,120 Speaker 1: We've seen a lot of businesses move from just in 536 00:30:28,240 --> 00:30:32,000 Speaker 1: case inventory, sorry, from just in time inventory to just 537 00:30:32,240 --> 00:30:34,640 Speaker 1: in case inventory. And what I mean by that is 538 00:30:34,840 --> 00:30:37,680 Speaker 1: it used to be that, um, you could realize much 539 00:30:37,720 --> 00:30:40,200 Speaker 1: better margins if you just ordered whatever you needed. You 540 00:30:40,280 --> 00:30:44,560 Speaker 1: had typically one plant, usually in China or, you know, 541 00:30:44,640 --> 00:30:47,840 Speaker 1: another kind of low labor cost country, that you'd get 542 00:30:47,880 --> 00:30:50,280 Speaker 1: all of your goods from, and that made for good 543 00:30:50,360 --> 00:30:53,920 Speaker 1: margins and it made for quick, um, inventory stocking, 544 00:30:53,960 --> 00:30:56,040 Speaker 1: so you didn't need to build up a lot. Well, now, 545 00:30:56,240 --> 00:30:59,840 Speaker 1: since trade wars really began and this deglobalization movement began, you've 546 00:31:00,160 --> 00:31:02,520 Speaker 1: moved to just in case inventory, where you have a factory 547 00:31:02,520 --> 00:31:05,040 Speaker 1: in China, for instance, and then maybe a second factory 548 00:31:05,080 --> 00:31:08,800 Speaker 1: in Vietnam that's able to take on additional capacity as necessary. 549 00:31:09,280 --> 00:31:12,320 Speaker 1: So that means companies are no longer as dependent on 550 00:31:12,480 --> 00:31:16,040 Speaker 1: a specific country. And while that can sound good, it 551 00:31:16,200 --> 00:31:22,280 Speaker 1: also lends politicians more leeway to 552 00:31:22,840 --> 00:31:26,080 Speaker 1: create more strained ties with those countries.
And that's where, 553 00:31:26,400 --> 00:31:29,800 Speaker 1: you know, trade wars as they progress can actually, um, 554 00:31:29,960 --> 00:31:36,680 Speaker 1: escalate, because companies have created hedges, if you will, against 555 00:31:36,880 --> 00:31:38,880 Speaker 1: those situations. And and that to me is a more 556 00:31:39,120 --> 00:31:43,560 Speaker 1: concerning secular trend. Fascinating stuff, Max. You know, it's 557 00:31:43,560 --> 00:31:47,360 Speaker 1: such a great point about Taiwan Semiconductor, especially given the 558 00:31:47,400 --> 00:31:50,040 Speaker 1: state of the chip supply chain these days. And also 559 00:31:50,080 --> 00:31:52,360 Speaker 1: I didn't realize Putin, you know, said the 560 00:31:52,400 --> 00:31:54,160 Speaker 1: quiet stuff out loud like that. I thought, you know, 561 00:31:54,200 --> 00:31:56,360 Speaker 1: I know, it's always assumed he could freeze everyone out 562 00:31:56,400 --> 00:31:58,520 Speaker 1: if he wanted to. I didn't realize he actually talked 563 00:31:58,560 --> 00:32:02,560 Speaker 1: about that. That's pretty... Tighten up your straitjackets, 564 00:32:02,720 --> 00:32:06,880 Speaker 1: it's time for the craziest things we saw in markets 565 00:32:07,200 --> 00:32:10,320 Speaker 1: this week. I think it's that time. It is that time. 566 00:32:10,600 --> 00:32:13,680 Speaker 1: As I said, I've tailor made my craziest thing for 567 00:32:14,080 --> 00:32:15,719 Speaker 1: the one and only Vildana Hajric. But I want 568 00:32:15,760 --> 00:32:18,400 Speaker 1: to hear yours first. What do you got? I appreciate that after 569 00:32:18,520 --> 00:32:22,040 Speaker 1: my tough Bills loss, but I want to first say, 570 00:32:22,760 --> 00:32:26,240 Speaker 1: you and I never explained the blanket thing to our listeners.
571 00:32:26,320 --> 00:32:27,600 Speaker 1: So I want to give a shout out to our 572 00:32:27,640 --> 00:32:30,720 Speaker 1: producer Laura, who makes me and you hide under blankets 573 00:32:30,760 --> 00:32:33,760 Speaker 1: sometimes for better sound quality. So that's that's what we're 574 00:32:33,800 --> 00:32:36,120 Speaker 1: talking about here. I have a bunch of blankets over 575 00:32:36,240 --> 00:32:40,360 Speaker 1: my head. Yeah, sometimes I just hide under them, you know. Yeah, 576 00:32:40,520 --> 00:32:42,480 Speaker 1: I know you did, because I'm feeling it, you know. Yeah, 577 00:32:42,840 --> 00:32:45,720 Speaker 1: that's that's fine, that's fine. I think that's okay. Well, okay, 578 00:32:45,760 --> 00:32:48,840 Speaker 1: so first, or I guess second, after my my little 579 00:32:48,840 --> 00:32:51,840 Speaker 1: blanket thing, I want to give a second shout out 580 00:32:51,880 --> 00:32:55,400 Speaker 1: to Ben Emons of Medley Global Advisors. He's a frequent 581 00:32:55,680 --> 00:32:58,400 Speaker 1: guest of the podcast, and he actually sent something in to 582 00:32:58,480 --> 00:33:00,800 Speaker 1: me that I wanted to read out loud. It's not 583 00:33:00,960 --> 00:33:04,920 Speaker 1: exactly markets related, but it's money related, so I'm gonna 584 00:33:05,000 --> 00:33:07,640 Speaker 1: allow it. He sent a USA Today story, and the 585 00:33:07,720 --> 00:33:11,080 Speaker 1: headline is, a woman finds out she won a three million 586 00:33:11,240 --> 00:33:15,520 Speaker 1: dollar lottery prize after checking her email spam folder. So 587 00:33:15,720 --> 00:33:17,840 Speaker 1: thank you, Ben, for sending that in. I love that, 588 00:33:17,960 --> 00:33:20,120 Speaker 1: and I think it is markets related. I think lottery 589 00:33:20,200 --> 00:33:23,680 Speaker 1: tickets are a fine investment choice, you know. Yeah, sure, 590 00:33:23,760 --> 00:33:25,760 Speaker 1: it's kind of a tail risk fund.
I'll allow it, 591 00:33:25,880 --> 00:33:28,720 Speaker 1: I'll allow it. Yeah. And so that's just a reminder that 592 00:33:28,880 --> 00:33:31,360 Speaker 1: if anybody else has seen anything weird and wants us 593 00:33:31,360 --> 00:33:33,080 Speaker 1: to know about it, you can give us a call 594 00:33:33,160 --> 00:33:36,120 Speaker 1: on the Crazy Things hotline. That's six four six three 595 00:33:36,200 --> 00:33:39,840 Speaker 1: two four three nine zero. Leave us a voicemail, hit 596 00:33:39,960 --> 00:33:42,360 Speaker 1: us up on Twitter, and maybe we'll play or talk 597 00:33:42,440 --> 00:33:45,720 Speaker 1: about your weird thing or crazy thing on the show. 598 00:33:46,760 --> 00:33:49,120 Speaker 1: And then for mine, I have a story courtesy of 599 00:33:49,160 --> 00:33:51,880 Speaker 1: another pod friend. It's Crystal Kim. She was on a 600 00:33:51,960 --> 00:33:54,840 Speaker 1: couple of weeks ago, and she wrote about the FOMO 601 00:33:55,000 --> 00:33:57,640 Speaker 1: ETF, the fear of missing out ETF. 602 00:33:58,360 --> 00:34:00,280 Speaker 1: I don't know if you saw this story, but this 603 00:34:00,480 --> 00:34:04,480 Speaker 1: fund buys meme names and other sort of popular high 604 00:34:04,520 --> 00:34:10,200 Speaker 1: flying stocks, except, except right now, it has so many 605 00:34:10,360 --> 00:34:13,120 Speaker 1: dull names in it that its own manager said the 606 00:34:13,200 --> 00:34:17,160 Speaker 1: strategy puts him to sleep. And the FOMO ETF 607 00:34:17,320 --> 00:34:21,080 Speaker 1: right now is almost in cash. Its biggest holding 608 00:34:21,400 --> 00:34:24,759 Speaker 1: is Chevron. It has Campbell Soup in there. And then 609 00:34:24,800 --> 00:34:27,759 Speaker 1: I was checking it out, and I looked for Game 610 00:34:27,840 --> 00:34:30,719 Speaker 1: Stop and AMC, and neither one is part of this 611 00:34:30,840 --> 00:34:34,239 Speaker 1: fund anymore. That is pretty fascinating. Well, they're they're not.
612 00:34:34,520 --> 00:34:37,080 Speaker 1: No fear of missing out on the drops in those stocks, 613 00:34:37,520 --> 00:34:43,680 Speaker 1: I guess. That's that's pretty good. Yeah, fear of missing 614 00:34:43,760 --> 00:34:48,080 Speaker 1: out on capital preservation, I guess, is the theme this week. 615 00:34:48,680 --> 00:34:50,719 Speaker 1: All right, Max, that's pretty good. What do you got? First, 616 00:34:50,760 --> 00:34:54,000 Speaker 1: have you seen anything crazy? I mean, so whenever I 617 00:34:54,080 --> 00:34:57,640 Speaker 1: look for something crazy on on short notice, I tend 618 00:34:57,719 --> 00:35:01,640 Speaker 1: to go to the metaverse now. And, uh, you know, 619 00:35:02,280 --> 00:35:05,360 Speaker 1: you'll you'll never come up short of kind of fun figures. 620 00:35:05,400 --> 00:35:07,000 Speaker 1: So I'll start with that and then I'll 621 00:35:07,080 --> 00:35:09,200 Speaker 1: drill down to the actual thing that I thought was crazy. 622 00:35:09,760 --> 00:35:11,920 Speaker 1: They just released the data for the fourth quarter of 623 00:35:12,080 --> 00:35:19,719 Speaker 1: virtual land sales in the metaverse, three million dollars. So that 624 00:35:19,920 --> 00:35:22,239 Speaker 1: was interesting. And, you know, we we we know about 625 00:35:22,280 --> 00:35:24,359 Speaker 1: how inflation has been going on in the real world. 626 00:35:24,520 --> 00:35:31,320 Speaker 1: But Sandbox virtual real estate, um, last month, two lands 627 00:35:31,760 --> 00:35:34,200 Speaker 1: in Decentraland went for over two point three 628 00:35:34,280 --> 00:35:37,719 Speaker 1: million dollars. So, um, I I don't know if that 629 00:35:37,800 --> 00:35:39,759 Speaker 1: means that's just for the land, or you you 630 00:35:39,880 --> 00:35:41,360 Speaker 1: build on them, and if you have to hire, like, 631 00:35:41,480 --> 00:35:44,960 Speaker 1: virtual developers to, you know, come in and build stuff.
632 00:35:45,480 --> 00:35:49,200 Speaker 1: But it seems expensive to me. But but but the 633 00:35:49,360 --> 00:35:54,320 Speaker 1: thing that really got me was someone bought a yacht 634 00:35:54,920 --> 00:35:58,440 Speaker 1: in Sandbox. So it's not land, it's just really a 635 00:35:58,920 --> 00:36:01,680 Speaker 1: virtual yacht that I guess you can sail around this virtual world. 636 00:36:02,200 --> 00:36:04,640 Speaker 1: You guys have any guesses for how much they paid 637 00:36:04,640 --> 00:36:07,480 Speaker 1: for this virtual yacht? Oh, well, I think I saw this. 638 00:36:08,000 --> 00:36:10,920 Speaker 1: I want to say half a billion. Well, if if 639 00:36:10,960 --> 00:36:13,040 Speaker 1: that's Mike's guess, I'll go with that too, because I 640 00:36:13,080 --> 00:36:15,759 Speaker 1: did not see this story, and I'm notoriously bad at 641 00:36:15,800 --> 00:36:19,759 Speaker 1: guessing. Did you say half a billion? Yeah. Well, 642 00:36:19,960 --> 00:36:22,040 Speaker 1: that that that would be a lot. Um, no, 643 00:36:22,200 --> 00:36:24,600 Speaker 1: it was half a half a million. It would be 644 00:36:24,680 --> 00:36:30,680 Speaker 1: closer to six or so. Yeah, you're off by a few zeros there. Yeah, 645 00:36:32,239 --> 00:36:35,960 Speaker 1: so pretty affordable then, uh, you know, I guess. And yeah, 646 00:36:36,120 --> 00:36:39,400 Speaker 1: and that's like a bargain now actually, cheaper than I 647 00:36:39,440 --> 00:36:42,239 Speaker 1: guess a real yacht. So relative to, you know, paying 648 00:36:42,400 --> 00:36:44,279 Speaker 1: two point four million dollars for a plot of land, 649 00:36:44,320 --> 00:36:46,480 Speaker 1: maybe that's what's that's what we should be doing now, 650 00:36:46,560 --> 00:36:50,560 Speaker 1: is buying, uh, NFT yachts in, uh, the metaverse. Max, 651 00:36:50,840 --> 00:36:54,080 Speaker 1: are you locking in low rates in the metaverse?
Buying, 652 00:36:54,760 --> 00:36:56,920 Speaker 1: purchasing land over there? I don't know. I mean, I 653 00:36:57,239 --> 00:37:00,560 Speaker 1: I kind of feel like someone does need to. 654 00:37:00,640 --> 00:37:03,839 Speaker 1: But I'm going to train the AI to figure out 655 00:37:04,000 --> 00:37:06,799 Speaker 1: how to arbitrage different land plots, and maybe we can 656 00:37:06,840 --> 00:37:10,160 Speaker 1: become just a virtual landlord, you know. Um, I don't know. 657 00:37:10,960 --> 00:37:13,320 Speaker 1: Didn't someone pay up, pay up big to be like 658 00:37:13,440 --> 00:37:17,000 Speaker 1: Snoop Dogg's neighbor or something like that? There was there 659 00:37:17,200 --> 00:37:20,160 Speaker 1: was a couple of stories about that last year. Um, 660 00:37:20,520 --> 00:37:23,239 Speaker 1: so so yeah. And, you know, it's funny to me. 661 00:37:23,360 --> 00:37:25,719 Speaker 1: So when I was just starting out, like, writing code, 662 00:37:25,760 --> 00:37:27,400 Speaker 1: there was this thing around, and this 663 00:37:27,440 --> 00:37:29,880 Speaker 1: will now date me, Mike, so I'll join you in the 664 00:37:29,960 --> 00:37:33,120 Speaker 1: old guy camp. There's a thing called VRML, or Virtual 665 00:37:33,200 --> 00:37:36,040 Speaker 1: Reality Markup Language. It was, it was kind of started 666 00:37:36,120 --> 00:37:39,600 Speaker 1: like in the mid nineties, and it looked a lot 667 00:37:39,719 --> 00:37:42,120 Speaker 1: like the metaverse. It never took off. No one cared 668 00:37:42,120 --> 00:37:43,759 Speaker 1: about it. It was kind of silly, but you could 669 00:37:43,800 --> 00:37:47,320 Speaker 1: create these three dimensional virtual worlds by just writing, like, 670 00:37:47,440 --> 00:37:51,000 Speaker 1: kind of very basic code. Well, I mean, maybe, you 671 00:37:51,080 --> 00:37:53,279 Speaker 1: know, that should have been something I pursued further.
I 672 00:37:53,320 --> 00:37:57,160 Speaker 1: don't know, but yeah, yeah, I'd better dust off 673 00:37:57,239 --> 00:38:00,360 Speaker 1: those old floppy disks, I guess. I see kids in 674 00:38:00,400 --> 00:38:03,200 Speaker 1: the coffee shops now and they're all wearing, 675 00:38:03,239 --> 00:38:06,320 Speaker 1: like, nineties clothes, and I feel like, hey, that's my generation. 676 00:38:06,560 --> 00:38:09,880 Speaker 1: You can't wear ripped jeans like that and band shirts 677 00:38:09,960 --> 00:38:12,560 Speaker 1: that, you know, weren't around, or haven't been around, 678 00:38:12,600 --> 00:38:14,960 Speaker 1: for like twenty years. That's pretty good. Well, the only 679 00:38:15,000 --> 00:38:17,320 Speaker 1: coding I ever did was as a kid. It was BASIC, 680 00:38:17,600 --> 00:38:20,840 Speaker 1: and it would be like line ten, print "Mike is cool," 681 00:38:21,600 --> 00:38:24,560 Speaker 1: and then line twenty, go to ten, and it would 682 00:38:24,560 --> 00:38:26,560 Speaker 1: just scroll out, and then I'd save it onto, 683 00:38:26,640 --> 00:38:29,839 Speaker 1: literally, a cassette tape on my, it was a Radio Shack 684 00:38:29,920 --> 00:38:33,839 Speaker 1: computer, I think, right? Anyway, I've really dated myself there, boy. 685 00:38:34,880 --> 00:38:38,399 Speaker 1: All right, as loyal listeners will know, you're what's 686 00:38:38,440 --> 00:38:40,920 Speaker 1: known as a Potterhead, not a pothead. I do 687 00:38:41,040 --> 00:38:44,240 Speaker 1: know some potheads, but a Potterhead. I love Harry Potter. 688 00:38:44,760 --> 00:38:48,320 Speaker 1: A big fan of Harry Potter. So J. K. Rowling's 689 00:38:48,400 --> 00:38:51,520 Speaker 1: first book, Harry Potter and the Philosopher's Stone, is heading 690 00:38:51,560 --> 00:38:55,200 Speaker 1: to auction.
It's a very special edition, one of just 691 00:38:55,400 --> 00:39:01,759 Speaker 1: five hardbacks printed, so it's a real rare gem. 692 00:39:01,920 --> 00:39:05,600 Speaker 1: In fact, her name is listed, I believe, as Joanne Rowling, 693 00:39:05,680 --> 00:39:08,319 Speaker 1: not even J.K. So that's going up for sale. 694 00:39:08,480 --> 00:39:10,600 Speaker 1: Now I want to put you two to the test. 695 00:39:11,000 --> 00:39:13,480 Speaker 1: There's something else going up for sale, or went up 696 00:39:13,520 --> 00:39:16,920 Speaker 1: for sale, actually, and that is the hat Melania Trump 697 00:39:17,200 --> 00:39:21,440 Speaker 1: wore to her first state dinner, when she and President 698 00:39:21,480 --> 00:39:25,120 Speaker 1: Trump had the Macrons from France over for dinner. That 699 00:39:25,280 --> 00:39:30,239 Speaker 1: sold at auction. Unfortunately, it sold in SOL, the Solana 700 00:39:30,360 --> 00:39:34,880 Speaker 1: blockchain currency that's not doing so well. So 701 00:39:35,280 --> 00:39:38,920 Speaker 1: here's The Price Is Right for you, Vildana: which had 702 00:39:38,960 --> 00:39:41,880 Speaker 1: a higher value, the expected value of the Harry Potter 703 00:39:43,200 --> 00:39:48,080 Speaker 1: hardcover, or Melania Trump's big floppy hat that 704 00:39:48,239 --> 00:39:51,520 Speaker 1: she wore to a state dinner with the 705 00:39:51,920 --> 00:39:55,000 Speaker 1: leader of France and his wife? Give me a 706 00:39:55,080 --> 00:39:58,359 Speaker 1: value on each. Okay, you know, I told you I've 707 00:39:58,400 --> 00:40:01,520 Speaker 1: been watching Antiques Roadshow recently, because I'm 708 00:40:01,600 --> 00:40:05,000 Speaker 1: so bad at guessing this, so it's like preparation for me.
709 00:40:05,200 --> 00:40:07,239 Speaker 1: And I see all the things that come up on 710 00:40:07,280 --> 00:40:09,120 Speaker 1: the show, and I try to guess, and I'm always 711 00:40:09,320 --> 00:40:12,160 Speaker 1: way off. Anyway, I'm going with Harry Potter. I have 712 00:40:12,360 --> 00:40:17,080 Speaker 1: to. I know what the hat was supposed to go for. 713 00:40:17,320 --> 00:40:21,040 Speaker 1: I believe it was something like five hundred 714 00:40:21,239 --> 00:40:25,400 Speaker 1: thousand dollars. I honestly could be misremembering that, but it 715 00:40:25,520 --> 00:40:27,040 Speaker 1: was part of an NFT collection, which you 716 00:40:27,120 --> 00:40:29,480 Speaker 1: forgot to mention. It was all a big package, 717 00:40:29,480 --> 00:40:32,920 Speaker 1: I believe. But I'm going with Harry Potter. Okay, I'm 718 00:40:32,960 --> 00:40:36,200 Speaker 1: gonna keep a poker face here. Max, as 719 00:40:36,440 --> 00:40:39,680 Speaker 1: a student of political culture, what do you think, 720 00:40:39,880 --> 00:40:45,080 Speaker 1: Philosopher's Stone or Melania's hat? Great question. So you're 721 00:40:45,080 --> 00:40:47,200 Speaker 1: gonna get some AI. I want an AI model on 722 00:40:47,280 --> 00:40:49,160 Speaker 1: this stuff. If you can get an AI model on 723 00:40:49,520 --> 00:40:52,480 Speaker 1: my alternative assets, then we're going to be billionaires. There 724 00:40:52,560 --> 00:40:54,880 Speaker 1: is a way to do that. Again, when you 725 00:40:54,920 --> 00:40:56,359 Speaker 1: guys have me on next time, I might be able 726 00:40:56,400 --> 00:40:59,000 Speaker 1: to share a few more insights. So I'll just leave 727 00:40:59,000 --> 00:41:01,560 Speaker 1: it as foreshadowing for now. I'll keep my, 728 00:41:02,120 --> 00:41:08,560 Speaker 1: you know, human brain working.
So I think that if 729 00:41:08,600 --> 00:41:11,360 Speaker 1: we're not hedging for the fact that it was sold 730 00:41:11,520 --> 00:41:17,120 Speaker 1: in SOL, then I would echo what Vildana said 731 00:41:17,239 --> 00:41:21,680 Speaker 1: and go with Harry Potter. I would think that you 732 00:41:21,760 --> 00:41:25,239 Speaker 1: can get seven figures. There are enough Potterheads out there 733 00:41:25,280 --> 00:41:29,839 Speaker 1: who've done well, and I think they could bid that up. Conversely, 734 00:41:31,080 --> 00:41:36,319 Speaker 1: despite how successful Trump's SPAC was initially, I don't think 735 00:41:36,400 --> 00:41:39,640 Speaker 1: there are as many folks who would want the hat, and 736 00:41:39,880 --> 00:41:42,840 Speaker 1: also not as an NFT. Yeah, it's tricky. 737 00:41:42,920 --> 00:41:46,120 Speaker 1: You have to sort of, you know, gauge the 738 00:41:47,120 --> 00:41:51,080 Speaker 1: enthusiasm of Harry Potter fans versus fans of the Trumps, 739 00:41:51,640 --> 00:41:55,960 Speaker 1: both very enthusiastic fan bases, I would say. So 740 00:41:56,280 --> 00:41:59,360 Speaker 1: I will say that Melania's hat was a disappointment. 741 00:41:59,480 --> 00:42:03,520 Speaker 1: She wanted, I think, more in SOL. Of course, SOL crashed. 742 00:42:04,239 --> 00:42:06,360 Speaker 1: She only got a hundred and seventy thousand dollars for 743 00:42:06,440 --> 00:42:10,000 Speaker 1: the hat, but the Harry Potter book is only expected 744 00:42:10,040 --> 00:42:13,480 Speaker 1: to sell for thirty thousand pounds, so about forty thousand dollars. 745 00:42:14,880 --> 00:42:18,920 Speaker 1: So, but I kind of agree with both of you. 746 00:42:19,000 --> 00:42:22,080 Speaker 1: I think the Potterheads will show up in force. 747 00:42:22,160 --> 00:42:25,239 Speaker 1: And that's just, you know, the hat already sold; the 748 00:42:26,120 --> 00:42:30,040 Speaker 1: book has not sold yet. So, very well.
Could 749 00:42:30,080 --> 00:42:35,280 Speaker 1: be a push, but as far as expected value, 750 00:42:35,640 --> 00:42:38,000 Speaker 1: I think you should bid on that one 751 00:42:39,160 --> 00:42:43,919 Speaker 1: and see how it goes. Yeah, it's time for me too. Yep. 752 00:42:44,400 --> 00:42:46,520 Speaker 1: All right, with that said, I think that is all 753 00:42:46,560 --> 00:42:49,319 Speaker 1: the time we have. Max, always a pleasure to catch 754 00:42:49,400 --> 00:42:51,040 Speaker 1: up with you, and good luck with the new venture. 755 00:42:51,160 --> 00:42:53,120 Speaker 1: We definitely are gonna have to catch up again and 756 00:42:53,160 --> 00:42:55,600 Speaker 1: have you back on and see how it's 757 00:42:55,640 --> 00:42:59,680 Speaker 1: all going. Thanks so much, guys. Always great to catch 758 00:42:59,760 --> 00:43:10,960 Speaker 1: up with you as well. Thank you, Max. What Goes 759 00:43:11,040 --> 00:43:13,120 Speaker 1: Up will be back next week. Until then, you can 760 00:43:13,120 --> 00:43:16,040 Speaker 1: find us on the Bloomberg Terminal, website, and app, or 761 00:43:16,280 --> 00:43:18,920 Speaker 1: wherever you get your podcasts. We'd love it if you 762 00:43:19,000 --> 00:43:20,759 Speaker 1: took the time to rate and review the show on 763 00:43:20,840 --> 00:43:24,120 Speaker 1: Apple Podcasts so more listeners can find us. And you 764 00:43:24,160 --> 00:43:27,480 Speaker 1: can find us on Twitter: follow me at Reaganonymous. Vildana 765 00:43:27,560 --> 00:43:32,000 Speaker 1: Hajric is at VildanaHajric. Also follow Bloomberg Podcasts at 766 00:43:32,080 --> 00:43:36,040 Speaker 1: Podcasts. And thank you to Charlie Pellett of Bloomberg Radio. What 767 00:43:36,200 --> 00:43:38,840 Speaker 1: Goes Up is produced by Laura Carlson. The head of 768 00:43:38,840 --> 00:43:42,360 Speaker 1: Bloomberg Podcasts is Francesco Levi.
Thanks for listening. See you 769 00:43:42,440 --> 00:44:00,919 Speaker 1: next time.