Speaker 1: Bloomberg Audio Studios, Podcasts, radio news. This is Masters in Business with Barry Ritholtz on Bloomberg Radio.

Speaker 2: On the latest Masters in Business podcast, what a fascinating conversation: I sit down with Ben Hunt. He writes Epsilon Theory, but he is also the president and co-founder of Percion. What a fascinating analytics story they've put together. They essentially take feeds of everything that's published around the world, whether it's in English or Chinese or Russian. They create these large language models and use artificial intelligence to identify rising narratives. In other words, they're looking for the things that will become storylines but haven't quite become that yet. I found this conversation to be absolutely fascinating, and I think you will also. With no further ado, my conversation with Percion's Ben Hunt.

Speaker 2: Ben Hunt, welcome to Bloomberg.

Speaker 3: Thanks for having me. Well, that was a lovely intro. I gotta have you at all my...

Speaker 2: That's right, I'm available for hire. I can introduce you at weddings, bar mitzvahs, wherever you're giving a toast. I'll be happy to tee you up. This is long overdue.
I've followed your work for so long. I'm fascinated by both what you put out in your blog Epsilon Theory, which is now a blog and a newsletter, and the work you do at Percion. We're gonna get to that stuff, but before we do, I gotta ask: PhD from Harvard, you were a tenured political science professor. Was academia the original career plan?

Speaker 3: You know, it's interesting. Academia was always a, I'll call it a way station for me. It ended up being a ten-year way station, plus grad school.

Speaker 2: That's a little more than a way station.

Speaker 3: It's a little more than a way station. But I bet this will be familiar for a lot of your listeners. I always had an entrepreneurial bug, you know. I started my first company when I was in grad school, started another one when I was a professor. And as I know you know, and a lot of your listeners and viewers know, it is a bug, it's not a feature. Yes, for sure, you can't help yourself.

Speaker 2: So let me ask you a question about that.

Speaker 3: Academia is not the place to be an entrepreneur, for sure.
Speaker 2: So I know why I've started a series of companies: I can't work for other people. Why did you have that bug? What motivated you to say, I gotta get this out into the world?

Speaker 3: I love playing games and solving problems. I have a similar issue about working for other people, which fortunately academia solves to a large degree. I mean, you're working on your own stuff. You follow your own intellectual bliss in a way that I've really never rediscovered. The problem with academia, of course, is, you know, it's very, very low stakes. It doesn't pay.

Speaker 2: That's why the academic fights are so vicious.

Speaker 3: Because there's nothing at stake, and that is actually true. That's actually true. And so you learn survival techniques in that kind of jungle where nothing is really at stake, at least monetarily. Because the goal of any sort of academic conference or presentation is to appear smart, right? It's not to actually be smart. You're not actually listening to a presentation to listen to it.
What you learned to do is, you're listening to the presentation, and the whole time you're trying to calculate: what's the most devastating question I can ask?

Speaker 2: So you're going to rock this guy back on his heels with a devastating...

Speaker 3: Devastating question. And boy, that gets old after a while, Barry, I gotta tell you. Yeah, it really does. I loved the teaching. I loved the research, because, like I say, nobody tells you what to work on. But the church of academia, the actual institution of academia? Hey, it's for the birds, even back when I was doing it, right? And I think it's gotten significantly worse. I can imagine what's happening with that.

Speaker 2: So the question that this leads me to is, you've had all these jobs in the world of finance. How did your background in academia shape how you view investing, risk management, allocation?
Speaker 3: Barry, starting from academia and then getting into our business of investing, I think it was the best thing that could have happened for me, for when I got into it, which is kind of later in life, right? Because, you know, after I left academia, I started a software company, and after we sold that software company... yeah, a buddy of mine, and I think this happens a lot: you have a buddy who says, you know, you seem to be pretty smart, how would you like to apply that? We're always talking about company X or technology Y. Why don't you come in, let's give this a try. So that was my path, if you want to call it a path. And what really sold me on it was that markets, it's the biggest game in the world, for sure. And like I say, I'm a game player. I'm a game theorist.

Speaker 2: Let's unpack that.

Speaker 3: I don't like to talk about that, because, you know, the real game theorists get angry.
Well, yes, and I am a real one, because that was my field for a long time, and it's a real field and it's a real thing. But it's been so trivialized. When some talking head will come on with "well, let's look at the game theory of this," you just want to, you know, shoot yourself when somebody does that.

Speaker 2: So the other part of your research, the other part of your academic focus, was on narrative theory. Let's talk about how that focus developed, and we'll talk a little later about what you do today at Percion with the narrative machine.

Speaker 3: Believe it or not, it all ties together.

Speaker 2: I don't doubt that for a second. What initially led you down that rabbit hole?

Speaker 3: When we think about kind of who's been an influence on you in your life: I had a very influential undergrad professor in political science, and then I had a very influential graduate advisor. Again, they don't call it political science up at Harvard, they call it government or something like that, but it's political science.
And the reason I say they were influential is that they really got me focused on the science side of political science. And that science side, yes, it's kind of some of the typical technical stuff you see in all the social sciences like economics, where you've got to learn how to deal with structured data, right? And there was a lot to learn. And it's worth talking about, because I see the same mistakes being made over and over again by people in our business who want to try to, you know, apply math to data. There are some real pitfalls, and some real intellectual capital, I think, that you can acquire within academia that you can then bring in and apply to the investment world.

Speaker 2: Well, we've certainly seen amongst the quants a very successful application of math theory to data. In fact, some of the best performing hedge funds are quantitatively driven.
Speaker 3: That's not where you're going. Well, it's part of where I'm going, right? So there's a transition in all of the sciences, honestly, but certainly the social sciences, where, yes, you start with numbers, structured numbers, right? Price over time, you know, the things you can calculate and measure as numbers. But what was clear immediately in politics, and I think has become increasingly clear in the world of investing, is that it's not just the numbers that you get on your Bloomberg terminal. It's also the words and the stories and the narratives that are told to us. Politicians have known this forever, right? The story of politics is the story of people suggesting laws or policies and then having to present it in a way that gets them elected or keeps them in power or whatever that is. So there's always been a focus, I'll say more of a focus in political science than in economics, on words. In economics it's almost seen as a sideline, right? Somehow lesser than the numbers.
So what I was kind of early on was applying the same techniques that we have for understanding matrices and structured data, but applying them to unstructured data, which, you know, full circle, is at the heart of all of the generative AI and the...

Speaker 2: You're getting way ahead of me now with generative AI. We'll circle back to that.

Speaker 3: But the math has not changed in the thirty-five years since I started working with network math around unstructured data. I mean, we didn't call it natural language processing back then, and we didn't call it, you know, large language modeling, but that's exactly what we were doing.

Speaker 2: So what was the moment or the catalyst for you to say: hey, I'm working in all these other areas, but narratives continue to pop up over and over again in all sorts of different data sets, and I think in the financial markets I can use a novel approach to identifying narratives and anticipate where the market's going? What led to that sort of insight?

Speaker 3: Not the Great Recession, right, but the aftermath.

Speaker 2: Meaning the twenty-tens, following the Great Financial Crisis.
Speaker 3: Starting in two thousand and nine, in the recovery that we had out of two thousand and nine, and in particular when Ben Bernanke and the Federal Reserve moved towards a very explicit effort to use their words to impact markets.

Speaker 2: So let's talk a little bit about that, because I have some really specific memories of the low, of the run-up afterwards, all the noise that was going on. Some of the phrases that came out of that era, like financial repression and other such things, are just the tip of the narrative iceberg. So walk us through your insight. It's two thousand and nine. The market bottoms, really kind of a V bottom, and took off from that. What was that, March ninth, March seventh, something like that, of '09, and there was no turning back. What were you seeing? How were you integrating that into a concept of, let's identify narratives in order to anticipate market moves?

Speaker 3: Well, so I co-founded a long short fund inside of a larger asset manager going back in '05, and in '05, '06, '07 we did well, like everyone else did well. And then in '08 we did great.
Speaker 2: Really? Wait, I want to say S&P down thirty-seven percent?

Speaker 3: Yeah, we were up twenty-something net.

Speaker 2: Anything in the green, not in the red, is amazing.

Speaker 3: Now, I'll tell you, and we can come back to this, the real question you should ask is, given what we believed, why weren't we up forty percent? That's actually a question you can ask.

Speaker 2: But I'm going to say a forty-seven percent relative price swing.

Speaker 3: I'll take it. Had a great year in '08.

Speaker 2: And did that continue in '09?

Speaker 3: Flatlined, right, from '09. So from March of '09... we did well in January, February, the first quarter, right? From March of '09, our returns flatlined. We never lost money for our clients in our fund.

Speaker 2: But you didn't catch that recovery, though.

Speaker 3: Did not catch the recovery. Absolutely did not.
And the recovery was interesting. You're right, there was a V, but there were starts and stops to it. So the big move up from the bottom in late March going into April, it's like, all right, we actually caught a little bit of that, and that made sense. There was a second leg to the rally, April, May. And then in June, in June there was a ferocious rally, but June in particular was a classic crap rally, right? It was low quality, low quality stuff. The end of March going into April rally, it was, right, this makes sense, we've bottomed. The June rally? Wow. No. We didn't. We didn't.

Speaker 2: Really? Oh, that's... oh yes, fascinating.

Speaker 3: Go back, go back and look at it, right. And let me tell you, I don't...

Speaker 2: Just so you know, I have a vivid recollection of chatting with Jim Bianco about this, and we were both bullish, but for completely different reasons.
To me, anytime US equity markets are cut in half, I'm a buyer, and people say nineteen twenty-nine, I'm like, great, you got to go back a century to find the exception that proves the rule. But Jim was early on in the TINA trade: hey, the Fed has made everything... cash is trash, bonds are... they're forcing you into equities. Which is what led to my post, from which everybody stole the line: this is the most hated bull market in history. I wrote that up, I sent that out, and I heard everybody borrow that. But I'm curious as to where the June rally took you.

Speaker 3: Well, this is where I'm going, about the role of forward guidance, and Jim Bianco's point. Because what the Fed did wasn't just its policies around interest rates. You know, they took them to zero, and that's where we stayed.

Speaker 2: Started buying mortgage-backs, and...

Speaker 3: The balance sheet operations.

Speaker 2: Right.

Speaker 3: Actually, and look, I think QE one, I think it saved the world.
Speaker 2: And what was that, a trillion dollars? Something like eight hundred billion? An unthinkable number.

Speaker 3: So those were specific actions they took. But people often say things when they're leaving office. So Bernanke's last speech is a valedictory address.

Speaker 2: More honest than intended.

Speaker 3: Much more so. And you see this all the time, right? George Washington leaves, Eisenhower... I mean, when freaking Eisenhower warns you against the defense industrial complex? Yeah, you might, you might... I'm just saying, that's right, General. Like, to me, what Bernanke said when he was leaving his term of office, he said: look, we had two toolkits. One was the traditional stuff, interest rates, down to zero. At the time, we didn't know we could have negative interest rates, so, you know, that's where we were. Second were the balance sheet operations, large scale asset purchases, QE, quantitative easing. He said, you know, QE one was great, did what we hoped it would do. QE two? Eh. Operation Twist, QE three...
This is Bernanke saying it. He said, I actually think that might have been a little counterproductive.

Speaker 2: Huh.

Speaker 3: He said, but we had another toolkit, and that was our communication policy. That was forward guidance. We started using our words not to communicate to the market what we actually felt. We started using our words, and coordinating our words, to change the market, to change market behavior. This is what I mean about making a conscious effort to tell a story. It's not that it was necessarily lying, but they were using their words and choosing their words for effect.

Speaker 2: To shape perception of their...

Speaker 3: To shape market behavior. And he said that worked better than we had any hope that it could. And that's where we are now.

Speaker 2: So for the young'uns listening, I have to point out, and you and I are old enough to remember, back in the days there were no minutes released. There wasn't an announcement, forget a press conference. You had to be watching the bond market to figure out what the Fed just did.
Like today, as we're having this conversation, there was the Fed October meeting, a quarter-point rate cut, a conversation about all sorts of stuff. I really didn't pay a lot of attention to it. Lack of clarity, no data, blah blah blah.

Speaker 3: Worse than that, Greenspan would be intentionally vague and obtuse.

Speaker 2: If you understood what I said, then you misunderstood it. You misinterpreted it. Exactly: you think you understand what I'm saying.

Speaker 3: You know who led the committee to make all that change? Janet Yellen, who was the vice chair, yep.

Speaker 2: Back in the financial crisis.

Speaker 3: Yep. So it was a concerted effort, Bernanke, Yellen too. This is when they also started putting all the Fed governors on a common calendar, right, and assigning them: you're going to speak this day, you're going to speak that day. That's when all that started. It was an intentional effort. And again, this is something that politicians have known forever, right? Politicians craft the message and use their words.
So I knew the tools to try to understand this. But what I wasn't prepared for, and neither was Bernanke, was how powerful this would become, to the point where today it's not just central bankers using their words as their main policy toolkit. It's every CEO now. I mean, you go on this network or one of the other networks, and what makes for a good CEO is: can you tell the story? Can you tell the narrative of your company to get a multiple? Because a multiple is a narrative. A multiple is a story.

Speaker 2: Well, look at some of the most successful CEOs throughout history. I would throw Jack Welch into that pile, because he was a fabulist. He was a fabulous fabulist, because the stories he told were great, right up until the point where we found out that he was running a hedge fund with GE Capital and they magically always beat by a penny.

Speaker 3: So I remember vividly when GE was coming to our shop.
What they wanted was a financial multiple, right? Even though they're an industrial, they wanted to tell a story that they should be seen as, and get the multiple of, a financial. That's what GE was all about in those years leading up to the GFC. My poster child for this is Marc Benioff at Salesforce, because he's probably... you often see this with people who come out of sales, like Marc did.

Speaker 2: But it's all about storytelling.

Speaker 3: It's all about storytelling. And isn't that true for...

Speaker 2: Go through the greats. Look: Steve Jobs, Reed Hastings, Larry Ellison at Oracle to some degree, Steve Ballmer at Microsoft, who wasn't a great CEO, but he was a great cheerleader and a great storyteller.

Speaker 3: Great storyteller. What all of those companies have in common is that they had great storytellers, and those are trillion-dollar companies today. What I would say to you is that you don't remember or hear about the companies that did not have CEOs who were great storytellers.
348 00:22:51,320 --> 00:22:54,680 Speaker 2: Well, Ken Lay was a great storyteller until you found 349 00:22:54,760 --> 00:22:57,440 Speaker 2: out that it was all nonsense. And you could say 350 00:22:57,480 --> 00:22:59,720 Speaker 2: the same thing about folks like Bernie Madoff. 351 00:23:00,240 --> 00:23:03,000 Speaker 3: There are a lot of stories that get told that 352 00:23:03,160 --> 00:23:08,800 Speaker 3: are not true, right? And I think even 353 00:23:08,840 --> 00:23:12,439 Speaker 3: today, when people think of this word narrative, it has a 354 00:23:12,480 --> 00:23:18,840 Speaker 3: pejorative sense to it. It's like, oh, for sure, 355 00:23:18,840 --> 00:23:21,679 Speaker 3: that's just, like, your narrative, man, you know, The Big Lebowski. 356 00:23:21,720 --> 00:23:25,560 Speaker 3: You know, that's the... And it's not that a 357 00:23:25,720 --> 00:23:31,080 Speaker 3: story is a lie. It's that the story is constructed 358 00:23:31,240 --> 00:23:36,640 Speaker 3: for effect. It's presented to you 359 00:23:36,760 --> 00:23:40,600 Speaker 3: as if these are my true and inner thoughts. But 360 00:23:42,359 --> 00:23:47,800 Speaker 3: the construction of, the intentionality behind, these stories is phenomenal. Benioff, 361 00:23:47,880 --> 00:23:51,480 Speaker 3: for example, you know, created the metrics by which he 362 00:23:51,680 --> 00:23:56,480 Speaker 3: wanted Salesforce to be judged, not metrics of profitability, but 363 00:23:56,680 --> 00:24:00,560 Speaker 3: metrics of what he called pro forma net revenue growth, 364 00:24:01,240 --> 00:24:04,480 Speaker 3: whatever the hell that means, right? Because if you can 365 00:24:04,520 --> 00:24:08,400 Speaker 3: construct the story, you can construct it in a way 366 00:24:08,440 --> 00:24:11,720 Speaker 3: that, yes, I can beat and raise pretty much every quarter.
367 00:24:12,359 --> 00:24:17,760 Speaker 3: So there were, I think, three big changes 368 00:24:17,840 --> 00:24:23,640 Speaker 3: that happened to make the role of narrative as overwhelming as 369 00:24:23,640 --> 00:24:26,359 Speaker 3: it is today. Whereas before, it's always been there, to 370 00:24:26,400 --> 00:24:30,639 Speaker 3: your point, it's always been there. Today it's overwhelming. And 371 00:24:30,920 --> 00:24:34,520 Speaker 3: I think it's not just the success that first 372 00:24:34,560 --> 00:24:38,639 Speaker 3: central bankers and then CEOs had. I mean, Wall Street's the 373 00:24:38,640 --> 00:24:42,800 Speaker 3: greatest copying machine. Wall Street copies what works. Sure, so 374 00:24:43,000 --> 00:24:46,000 Speaker 3: when you see that something's working, oh, they're getting a 375 00:24:46,080 --> 00:24:49,600 Speaker 3: multiple by telling the story and going on Cramer, you know, 376 00:24:49,760 --> 00:24:50,640 Speaker 3: four times a year. 377 00:24:50,880 --> 00:24:55,040 Speaker 2: It's endless iteration. You're just constantly tweaking it, and if 378 00:24:55,080 --> 00:24:57,479 Speaker 2: it works, do more of it, and if it doesn't, 379 00:24:57,560 --> 00:24:58,240 Speaker 2: toss it out. 380 00:24:58,359 --> 00:25:01,640 Speaker 3: So it was the fact that it works to tell 381 00:25:01,640 --> 00:25:06,680 Speaker 3: a story, and people got good at telling stories. It's 382 00:25:06,720 --> 00:25:09,040 Speaker 3: the growth of twenty-four-seven, and I want to 383 00:25:09,119 --> 00:25:11,120 Speaker 3: use air quotes here, I'm glad we're taping this. 384 00:25:11,160 --> 00:25:14,120 Speaker 2: Well, it's media, social media, news. 385 00:25:13,920 --> 00:25:17,919 Speaker 3: Right, news, quote unquote news. That wasn't the case.
386 00:25:18,560 --> 00:25:21,879 Speaker 2: So I want to annotate what you said slightly, okay, 387 00:25:22,080 --> 00:25:26,399 Speaker 2: because I think CEOs have always been storytellers. But they 388 00:25:26,440 --> 00:25:32,359 Speaker 2: were storytellers to their boards, to their employees, to their shareholders. 389 00:25:32,440 --> 00:25:36,320 Speaker 2: They always were. You're hitting now on the modern world of 390 00:25:36,920 --> 00:25:40,320 Speaker 2: twenty-four-seven media. Telling a story in a 391 00:25:40,359 --> 00:25:44,000 Speaker 2: boardroom is very different than sitting in a TV studio 392 00:25:44,040 --> 00:25:48,000 Speaker 2: and talking about, hey, here's why our new chip is 393 00:25:48,080 --> 00:25:51,160 Speaker 2: going to catch up to Nvidia and it's the greatest thing ever. Yep, 394 00:25:51,440 --> 00:25:52,600 Speaker 2: that's a different skill set. 395 00:25:52,800 --> 00:25:55,560 Speaker 3: It changes the time horizon. It is a very different 396 00:25:55,600 --> 00:25:58,200 Speaker 3: skill set, because you're not telling the story of, oh, 397 00:25:58,200 --> 00:26:02,400 Speaker 3: I'm getting another, you know, turn of leverage in our operations, 398 00:26:02,520 --> 00:26:07,240 Speaker 3: or, you know, our capacity utilization in this factory went 399 00:26:07,320 --> 00:26:09,600 Speaker 3: up by five percent, which are the kind of stories 400 00:26:09,640 --> 00:26:13,760 Speaker 3: you would tell even on an earnings call, or certainly to 401 00:26:13,840 --> 00:26:17,000 Speaker 3: a board. Now, this is the story where you've got 402 00:26:17,000 --> 00:26:19,440 Speaker 3: a segment. A little bit, yes, you've got a segment. 403 00:26:19,880 --> 00:26:23,159 Speaker 3: You're going on Cramer, he's saying buy, buy, buy. You've 404 00:26:23,240 --> 00:26:26,679 Speaker 3: got at most four minutes, right? How are you going 405 00:26:26,720 --> 00:26:31,439 Speaker 3: to tell that story so it sings to that audience?
Enormous 406 00:26:31,560 --> 00:26:38,760 Speaker 3: change, structural change in our media, both quote unquote 407 00:26:38,880 --> 00:26:43,000 Speaker 3: news media, but also financial news media. The Wall Street 408 00:26:43,080 --> 00:26:46,480 Speaker 3: Journal today is a twenty-four-seven financial 409 00:26:46,520 --> 00:26:47,480 Speaker 3: news organization. 410 00:26:48,240 --> 00:26:51,879 Speaker 2: Whether it's print, the Times, the Washington Post, they 411 00:26:51,920 --> 00:26:54,520 Speaker 2: all have websites that they can update around the clock. 412 00:26:54,640 --> 00:26:58,760 Speaker 3: And here's the thing, there's not enough hard news to 413 00:26:58,880 --> 00:27:02,200 Speaker 3: fill the time or fill the space. So what takes 414 00:27:02,240 --> 00:27:06,679 Speaker 3: the space? Opinion, story. That takes the place. 415 00:27:06,800 --> 00:27:07,240 Speaker 3: You were... 416 00:27:07,160 --> 00:27:11,119 Speaker 2: Channeling Michael Crichton from twenty-five years ago. Most of 417 00:27:11,160 --> 00:27:16,320 Speaker 2: what you see in the media is speculation, opinion, and theory, 418 00:27:16,880 --> 00:27:17,640 Speaker 2: not news. 419 00:27:17,840 --> 00:27:20,760 Speaker 3: I've written so much about Crichton and that speech. 420 00:27:20,920 --> 00:27:22,280 Speaker 2: I know, that's why I threw that back. 421 00:27:22,640 --> 00:27:25,600 Speaker 3: He was, he was so far ahead, a... 422 00:27:25,640 --> 00:27:27,920 Speaker 2: Quarter century ahead of what took place. 423 00:27:29,800 --> 00:27:31,800 Speaker 3: There's a third piece, though. 424 00:27:31,680 --> 00:27:33,919 Speaker 2: Give us the third piece before we go to our next... 425 00:27:34,040 --> 00:27:37,879 Speaker 3: The third piece that has changed everything is our smartphones.
426 00:27:38,000 --> 00:27:41,080 Speaker 2: You're walking around with not only a studio, 427 00:27:41,840 --> 00:27:47,159 Speaker 2: but a dopamine device that you're constantly... 428 00:27:47,760 --> 00:27:51,520 Speaker 3: It's a dopamine machine, and we do it to ourselves. It's not 429 00:27:51,560 --> 00:27:55,199 Speaker 3: that someone forces us to hear these stories over and 430 00:27:55,240 --> 00:27:57,960 Speaker 3: over again. We do it to ourselves. I mean, I 431 00:27:58,000 --> 00:28:00,760 Speaker 3: get a little nervous if I, you know, pat, where's 432 00:28:00,760 --> 00:28:05,000 Speaker 3: my phone? Where is it? I get a little nervous. 433 00:28:05,040 --> 00:28:12,400 Speaker 3: It is absolutely a neurotransmitter addiction. I think 434 00:28:12,400 --> 00:28:15,640 Speaker 3: it's so important to keep that from our kids. 435 00:28:16,760 --> 00:28:19,800 Speaker 2: That's a whole other topic. There's a whole depression situation with 436 00:28:19,840 --> 00:28:22,240 Speaker 2: teenagers today, and it all traces back to the phone 437 00:28:22,280 --> 00:28:22,960 Speaker 2: and social media. 438 00:28:23,080 --> 00:28:26,040 Speaker 3: These are the three, I think, real secular changes we've had: 439 00:28:27,800 --> 00:28:35,639 Speaker 3: markets becoming this political utility and the success of constructing a 440 00:28:35,720 --> 00:28:41,720 Speaker 3: story, structural changes in media and social media, and the devices that 441 00:28:41,760 --> 00:28:46,400 Speaker 3: we insist on carrying with ourselves all the time. 442 00:28:46,880 --> 00:28:50,479 Speaker 2: Huh, absolutely fascinating. Coming up, we continue our conversation with 443 00:28:50,600 --> 00:28:55,880 Speaker 2: Ben Hunt, president and co-founder of Percion, explaining how 444 00:28:55,920 --> 00:29:00,720 Speaker 2: he's using AI to identify narratives in real time. 445 00:29:01,320 --> 00:29:02,400 Speaker 2: I'm Barry Ritholtz.
446 00:29:02,480 --> 00:29:18,280 Speaker 2: You're listening to Masters in Business on Bloomberg Radio. I'm 447 00:29:18,320 --> 00:29:21,040 Speaker 2: Barry Ritholtz. You're listening to Masters in Business on 448 00:29:21,080 --> 00:29:24,880 Speaker 2: Bloomberg Radio. My special guest this week is Ben Hunt. 449 00:29:25,360 --> 00:29:32,440 Speaker 2: He is an academic, fund manager, risk manager, entrepreneur, tech 450 00:29:32,560 --> 00:29:37,840 Speaker 2: startup person. He is currently co-founder and president at Percion, 451 00:29:38,000 --> 00:29:43,600 Speaker 2: which applies AI tools to map and measure market narratives 452 00:29:44,000 --> 00:29:47,680 Speaker 2: in real time. It's really more than market narratives. It's politics, 453 00:29:47,680 --> 00:29:50,920 Speaker 2: it's economics, it's markets. You cover a whole lot 454 00:29:50,920 --> 00:29:51,280 Speaker 2: of stuff. 455 00:29:51,840 --> 00:29:53,720 Speaker 3: So what we're able to do today, and this is 456 00:29:53,760 --> 00:29:57,080 Speaker 3: the crazy change in the world from when I 457 00:29:57,200 --> 00:30:06,200 Speaker 3: was doing this on microfiche, back in... yeah, exactly. We 458 00:30:06,360 --> 00:30:11,680 Speaker 3: have access to everything that's published publicly in 459 00:30:11,760 --> 00:30:15,120 Speaker 3: the world, and there are a couple of big data 460 00:30:15,200 --> 00:30:22,760 Speaker 3: aggregators, Dow Jones is one, LexisNexis is another. Everything that 461 00:30:22,880 --> 00:30:26,440 Speaker 3: gets published in the world, all these languages, it's available 462 00:30:26,480 --> 00:30:30,440 Speaker 3: to you, and it's not cheap, but it's not 463 00:30:30,600 --> 00:30:33,560 Speaker 3: crazy expensive like it used to be. It's always getting cheaper.
464 00:30:34,160 --> 00:30:36,240 Speaker 3: So we're able to take everything in the world that 465 00:30:36,240 --> 00:30:39,760 Speaker 3: gets published, all the newspapers, all the websites, all the transcripts, 466 00:30:39,800 --> 00:30:45,120 Speaker 3: everything that's published publicly. We can pull it in, and then we 467 00:30:45,240 --> 00:30:51,040 Speaker 3: can process it, process it with, really, the same 468 00:30:51,200 --> 00:30:54,600 Speaker 3: math that I was using thirty years ago. Nobody's invented 469 00:30:54,640 --> 00:30:56,000 Speaker 3: cold fusion here. 470 00:30:55,880 --> 00:30:59,120 Speaker 2: But the software tools are faster, stronger, better. 471 00:30:59,320 --> 00:31:05,280 Speaker 3: And infinite. So the calculations here are not particularly complicated, 472 00:31:05,960 --> 00:31:07,880 Speaker 3: but you have to do them at enormous scale. 473 00:31:08,120 --> 00:31:09,680 Speaker 2: It's a ton of volume. 474 00:31:09,600 --> 00:31:15,440 Speaker 3: I mean, it's crazy, just the scale of... 475 00:31:15,360 --> 00:31:17,720 Speaker 2: Petabytes, terabytes, just crazy. 476 00:31:17,880 --> 00:31:20,120 Speaker 3: I mean, yeah, in the last couple of months we've 477 00:31:20,160 --> 00:31:26,320 Speaker 3: processed over two hundred billion tokens. Billion. A token is, 478 00:31:26,400 --> 00:31:28,840 Speaker 3: like, a word or a piece of a word. 479 00:31:29,680 --> 00:31:31,800 Speaker 3: And so that's kind of the unit that you 480 00:31:31,840 --> 00:31:35,320 Speaker 3: talk about when you're putting something 481 00:31:35,360 --> 00:31:39,480 Speaker 3: through a linguistic calculator, which is what all 482 00:31:39,560 --> 00:31:45,440 Speaker 3: of the LLMs are. They're linguistic calculators, and 483 00:31:45,520 --> 00:31:51,320 Speaker 3: so, you know, we've processed, you know, several hundred billion tokens.
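To put a rough number on the "token" unit Hunt describes, here is a minimal sketch. A token is a word or a piece of a word; the four-characters-per-token ratio below is a common rule of thumb, not Percion's actual tokenizer, and the sixty-day window is an assumption for the arithmetic.

```python
# Rough illustration of LLM "tokens" as a unit of work. The
# chars_per_token heuristic is an assumption, not a real tokenizer.

def estimate_tokens(text: str, chars_per_token: float = 4.0) -> int:
    """Estimate the number of LLM tokens in a text (rule-of-thumb)."""
    return max(1, round(len(text) / chars_per_token))

# Scale arithmetic: 200 billion tokens over roughly two months.
tokens_processed = 200_000_000_000
days = 60
per_day = tokens_processed // days
print(f"{per_day:,} tokens per day")  # on the order of 3.3 billion/day

sample = "The Fed cut rates by twenty-five basis points today."
print(estimate_tokens(sample))
```

Even at billions of tokens a day, the point stands that the per-document math is simple; only the scale is new.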
Again, 484 00:31:51,360 --> 00:31:56,520 Speaker 3: it's not complex, but it is at scale. And what 485 00:31:56,520 --> 00:32:00,360 Speaker 3: we're doing with that is we're reading the world's news 486 00:32:01,400 --> 00:32:06,440 Speaker 3: to understand the world's stories and narratives. And you're right, 487 00:32:06,520 --> 00:32:11,760 Speaker 3: it's much bigger than that, or rather, it's much more focused than that. 488 00:32:11,880 --> 00:32:16,280 Speaker 3: We have much higher resolution than just saying, oh, I'm 489 00:32:16,320 --> 00:32:21,000 Speaker 3: bullish on financials. I mean, that's a narrative, sure, but 490 00:32:21,640 --> 00:32:26,000 Speaker 3: there are twenty different variations of that. You're bullish on financials. 491 00:32:26,440 --> 00:32:29,160 Speaker 3: Why? So we track all 492 00:32:29,160 --> 00:32:31,440 Speaker 3: those stories, we track all those stories and how they wax 493 00:32:31,480 --> 00:32:32,440 Speaker 3: and wane over time. 494 00:32:32,600 --> 00:32:34,960 Speaker 2: So let me roll back. I want you to 495 00:32:35,080 --> 00:32:39,000 Speaker 2: explain what Percion is. Who are the clients? I don't 496 00:32:39,040 --> 00:32:42,160 Speaker 2: mean names, but what type of clients do you have, 497 00:32:42,720 --> 00:32:46,320 Speaker 2: and what do they do with Percion's output? 498 00:32:47,240 --> 00:32:52,240 Speaker 3: So we started Percion in twenty eighteen. My partner and I, we 499 00:32:52,240 --> 00:32:55,960 Speaker 3: were at an asset manager, spun out of there to take 500 00:32:56,040 --> 00:32:59,400 Speaker 3: the technology that I'd been working on for years 501 00:33:00,720 --> 00:33:04,240 Speaker 3: and really been writing about with Epsilon Theory.
So we 502 00:33:04,280 --> 00:33:08,440 Speaker 3: started that in twenty eighteen to do the basic research 503 00:33:09,240 --> 00:33:15,920 Speaker 3: into processing enormous amounts of financial news data and to 504 00:33:16,080 --> 00:33:21,600 Speaker 3: track the stories and how they rise and fall and 505 00:33:21,720 --> 00:33:24,960 Speaker 3: wax and wane over time. So that was 506 00:33:24,960 --> 00:33:25,560 Speaker 3: the story. 507 00:33:25,360 --> 00:33:29,840 Speaker 2: I love that description, because there are a 508 00:33:29,840 --> 00:33:34,719 Speaker 2: lot of trades going on where the storyline changes on 509 00:33:34,760 --> 00:33:39,920 Speaker 2: a regular basis. Probably the mac daddy of that is crypto. First, 510 00:33:39,960 --> 00:33:43,800 Speaker 2: it's, hey, you know, there's fiat currency, this is outside 511 00:33:43,840 --> 00:33:47,200 Speaker 2: of the system, it's DeFi. Then it's a hedge for deflation. 512 00:33:47,280 --> 00:33:51,040 Speaker 2: Then it's a hedge for inflation. Now it's scarcity. 513 00:33:51,200 --> 00:33:56,680 Speaker 3: And digital gold. Digital gold. There are no fundamentals, right, 514 00:33:56,720 --> 00:33:59,280 Speaker 3: with crypto, and people will say there are, but 515 00:33:59,280 --> 00:34:03,320 Speaker 3: there aren't. It's driven by the waxing and waning 516 00:34:03,360 --> 00:34:08,200 Speaker 3: of stories. Do they find purchase? 517 00:34:08,800 --> 00:34:11,600 Speaker 3: Or do people kind of get tired of them?
518 00:34:11,880 --> 00:34:14,880 Speaker 2: Well, they got tired of the DeFi story, and then once 519 00:34:15,800 --> 00:34:19,200 Speaker 2: JPMorgan and BlackRock started creating it, like, 520 00:34:19,520 --> 00:34:23,320 Speaker 2: IBIT is the fastest ETF to one hundred billion 521 00:34:23,360 --> 00:34:26,920 Speaker 2: dollars. And so the old story of DeFi is gone, 522 00:34:26,920 --> 00:34:29,000 Speaker 2: and the new story is, oh no, this is an 523 00:34:29,000 --> 00:34:31,160 Speaker 2: asset class that Wall Street's embracing, that's why you have 524 00:34:31,200 --> 00:34:33,600 Speaker 2: to own it. That story... 525 00:34:33,680 --> 00:34:36,640 Speaker 3: That story was, and that was a very similar story, 526 00:34:36,680 --> 00:34:39,800 Speaker 3: by the way, or a transition in story, from physical 527 00:34:39,840 --> 00:34:43,960 Speaker 3: gold to GLD when that ETF came out, right, which 528 00:34:45,400 --> 00:34:50,120 Speaker 3: was a very similar pattern. Because once it became a 529 00:34:50,160 --> 00:34:53,600 Speaker 3: table at the Wall Street casino, right, 530 00:34:54,320 --> 00:35:00,400 Speaker 3: then it takes on a different meaning, especially around 531 00:35:00,400 --> 00:35:05,000 Speaker 3: gold. Gold changed from being, okay, something that you bury in 532 00:35:05,000 --> 00:35:07,080 Speaker 3: your backyard or have in your vault, you know, 533 00:35:07,160 --> 00:35:11,520 Speaker 3: along with ammo and seeds for when the hard times come. 534 00:35:11,160 --> 00:35:14,880 Speaker 2: Bottled water, right, meals ready to eat, gold and lead. 535 00:35:14,840 --> 00:35:19,520 Speaker 3: It becomes a security, and its meaning changes from that, 536 00:35:20,040 --> 00:35:24,920 Speaker 3: you know, apocalyptic bottled water.
And the meaning of gold 537 00:35:24,920 --> 00:35:29,760 Speaker 3: today is as an insurance policy, a security against central 538 00:35:29,760 --> 00:35:34,040 Speaker 3: bank error or government error. That's the meaning of gold today. 539 00:35:35,080 --> 00:35:38,799 Speaker 2: And is that the dominant narrative that you're identifying as 540 00:35:38,920 --> 00:35:41,280 Speaker 2: gold rallied through four thousand? 541 00:35:42,320 --> 00:35:45,319 Speaker 3: Absolutely so. And we've really been able to track that 542 00:35:45,360 --> 00:35:45,640 Speaker 3: one. 543 00:35:46,040 --> 00:35:49,400 Speaker 2: So here's the really big, here's the million-dollar or 544 00:35:49,480 --> 00:35:55,120 Speaker 2: trillion-dollar question. How do you identify a narrative and say, oh, 545 00:35:55,160 --> 00:35:57,120 Speaker 2: gold is going to double from here based on this 546 00:35:57,239 --> 00:35:59,280 Speaker 2: narrative? Or are we not there yet? 547 00:36:00,440 --> 00:36:03,680 Speaker 3: No, you identify the narratives because the 548 00:36:03,719 --> 00:36:09,120 Speaker 3: stories don't ever change. So leave 549 00:36:09,120 --> 00:36:12,759 Speaker 3: gold aside for a second. Think about, we're talking about, you're 550 00:36:12,800 --> 00:36:18,600 Speaker 3: bullish on company XYZ. Because there are, 551 00:36:18,840 --> 00:36:21,400 Speaker 3: I don't know, depending on, again, how finely you 552 00:36:21,440 --> 00:36:25,800 Speaker 3: want to resolve that, there are only about a dozen stories 553 00:36:25,880 --> 00:36:28,840 Speaker 3: for why you're bullish on something, right? It can be 554 00:36:29,320 --> 00:36:36,640 Speaker 3: management change, top-line growth opportunity, consolidation in the industry, a catalyst. 555 00:36:36,920 --> 00:36:40,920 Speaker 3: So every catalyst story: the new product that's going to prove out, 556 00:36:40,960 --> 00:36:45,360 Speaker 3: the new drug, the new molecule.
So that story, you just 557 00:36:45,480 --> 00:36:49,120 Speaker 3: change the name. That's the same story that's repeated over 558 00:36:49,239 --> 00:36:55,839 Speaker 3: and over again about any pharma or biotech company. The stories, 559 00:36:56,880 --> 00:37:02,960 Speaker 3: we think that they are amorphous and variable. The fact 560 00:37:03,040 --> 00:37:05,080 Speaker 3: is that the core of the story, what we call 561 00:37:05,120 --> 00:37:10,880 Speaker 3: the semantic signature, the meaning of a story, they're amazingly 562 00:37:12,320 --> 00:37:17,000 Speaker 3: constant over time. So what we're looking for is, 563 00:37:18,040 --> 00:37:20,680 Speaker 3: a story could be dormant for a long time, but what 564 00:37:20,719 --> 00:37:24,000 Speaker 3: you want to know is when that story starts picking 565 00:37:24,080 --> 00:37:30,920 Speaker 3: up again, when someone starts playing that story, when it 566 00:37:31,080 --> 00:37:34,920 Speaker 3: appears on Cramer and starts happening in the financial press. 567 00:37:35,760 --> 00:37:38,480 Speaker 3: That's the stuff we can pick up with real precision. 568 00:37:38,800 --> 00:37:40,960 Speaker 3: So it's both the stories that are starting to fade, 569 00:37:41,920 --> 00:37:45,520 Speaker 3: but I think the really interesting stories are the 570 00:37:45,560 --> 00:37:47,520 Speaker 3: stories that have been dormant for a long time and 571 00:37:47,560 --> 00:37:48,640 Speaker 3: then start picking up again. 572 00:37:49,000 --> 00:37:53,239 Speaker 2: You are reminding me of Campbell's hero's journey, that there are 573 00:37:53,280 --> 00:37:54,440 Speaker 2: only so many. 574 00:37:54,719 --> 00:37:57,920 Speaker 3: My hero, right. There are only so many stories 575 00:37:58,000 --> 00:38:02,840 Speaker 3: in the world. And that's it. I like to 576 00:38:02,840 --> 00:38:09,680 Speaker 3: talk about, in Hollywood, famously, there are only, like, five scripts, right?
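The mechanics of a fixed "semantic signature" can be sketched in a few lines: represent a story as a bag of characteristic terms, score each day's coverage against it, and flag when a dormant story starts picking up. Percion's actual method is not public; the keyword signature, the cosine scoring, and the sample headlines below are all illustrative assumptions.

```python
# Illustrative sketch: a story's "semantic signature" as a fixed term
# vector, scored against daily coverage with cosine similarity.
# The SIGNATURE terms and weights are hypothetical, not Percion's.
import math
from collections import Counter

SIGNATURE = Counter({"catalyst": 2, "approval": 1, "drug": 2, "trial": 1})

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[k] * b[k] for k in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def story_intensity(doc: str) -> float:
    """Score one document against the fixed story signature."""
    return cosine(SIGNATURE, Counter(doc.lower().split()))

days = [
    "earnings flat guidance unchanged",                # story dormant
    "new drug enters pivotal trial a catalyst ahead",  # story waking up
]
scores = [story_intensity(d) for d in days]
assert scores[1] > scores[0]  # the dormant story is picking up again
```

The signature itself never changes, which is the point Hunt is making: only its intensity in the day's coverage waxes and wanes.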
577 00:38:10,400 --> 00:38:13,080 Speaker 3: Tolstoy, supposedly Tolstoy, he said there are only 578 00:38:13,120 --> 00:38:16,200 Speaker 3: two stories: that a man goes on a journey, or 579 00:38:16,200 --> 00:38:18,839 Speaker 3: a stranger comes to town. Those are the only two 580 00:38:18,840 --> 00:38:19,640 Speaker 3: stories in the world. 581 00:38:20,040 --> 00:38:21,959 Speaker 2: Not quite, but he's not quite... 582 00:38:22,000 --> 00:38:25,320 Speaker 3: But you're on the right track, right, right. The point 583 00:38:25,360 --> 00:38:29,400 Speaker 3: is there's a finite number of stories. You can drill down, 584 00:38:29,520 --> 00:38:31,600 Speaker 3: so you can get a couple of dozen about any 585 00:38:31,800 --> 00:38:34,440 Speaker 3: sector you want to talk about, or the like. But it's 586 00:38:34,560 --> 00:38:39,120 Speaker 3: a finite number. And so what we do, and 587 00:38:39,160 --> 00:38:42,320 Speaker 3: what I think is really interesting, is to track the 588 00:38:42,360 --> 00:38:45,880 Speaker 3: finite number of stories. And you're right, it's not just 589 00:38:45,920 --> 00:38:50,720 Speaker 3: around markets. We track several thousand of these stories today. 590 00:38:51,040 --> 00:38:55,480 Speaker 2: How? So let's delve into that. So how 591 00:38:55,480 --> 00:38:58,799 Speaker 2: do you... you have this massive database, you're sucking in 592 00:38:58,960 --> 00:39:06,200 Speaker 2: every news feed, every magazine, newspaper, everything that you could 593 00:39:06,280 --> 00:39:09,360 Speaker 2: quantify and run through a linguistics model. 594 00:39:10,040 --> 00:39:11,960 Speaker 3: What's the process? 595 00:39:11,320 --> 00:39:15,920 Speaker 2: For analyzing this. How do you use artificial intelligence to go 596 00:39:16,000 --> 00:39:18,480 Speaker 2: through this?
And how do you make sense out of 597 00:39:18,520 --> 00:39:22,000 Speaker 2: that heap? How do you find signal amidst all 598 00:39:22,040 --> 00:39:22,600 Speaker 2: that noise? 599 00:39:23,480 --> 00:39:26,600 Speaker 3: The crucial thing is you can't just ask AI an 600 00:39:26,600 --> 00:39:31,279 Speaker 3: open-ended question, say, what are the narratives 601 00:39:31,600 --> 00:39:36,120 Speaker 3: for this company or for this sector. Don't do that, right? 602 00:39:36,160 --> 00:39:38,040 Speaker 3: And this is a mistake that people make all the time. 603 00:39:38,080 --> 00:39:42,560 Speaker 3: They ask open-ended questions of ChatGPT or whoever. 604 00:39:43,400 --> 00:39:45,560 Speaker 3: The problem is, ChatGPT will give you an answer, 605 00:39:46,280 --> 00:39:48,560 Speaker 3: just not a good one. Not a good one. It'll 606 00:39:48,600 --> 00:39:52,520 Speaker 3: hallucinate a lot, right? It'll go out, it'll find its 607 00:39:52,560 --> 00:39:58,360 Speaker 3: own data. The secret to using AI successfully is to 608 00:39:58,400 --> 00:40:02,040 Speaker 3: take this magic genie, it's a magic genie, and you 609 00:40:02,120 --> 00:40:05,359 Speaker 3: stuff it into that bottle. You do not let it out. 610 00:40:05,880 --> 00:40:08,880 Speaker 3: You constrain it dramatically. You don't let it go out 611 00:40:08,920 --> 00:40:13,520 Speaker 3: and find data. You give it the data. Crucial thing: 612 00:40:14,239 --> 00:40:19,239 Speaker 3: you don't allow it to think, you tell it how 613 00:40:19,320 --> 00:40:22,600 Speaker 3: to think. So the most important step that we do 614 00:40:23,080 --> 00:40:27,280 Speaker 3: is we don't ask AI what the narratives are. Okay? 615 00:40:27,760 --> 00:40:30,719 Speaker 3: We tell it. This is human direction. You have to 616 00:40:30,760 --> 00:40:31,920 Speaker 3: have human control.
617 00:40:32,040 --> 00:40:36,560 Speaker 2: I'm hearing the data set is controlled by you, as well 618 00:40:36,600 --> 00:40:37,800 Speaker 2: as the thinking. 619 00:40:39,440 --> 00:40:42,920 Speaker 3: Yes, and it's more than a prompt, right? It 620 00:40:43,440 --> 00:40:48,360 Speaker 3: includes the prompt, but the phrase that's used in this world 621 00:40:48,440 --> 00:40:54,600 Speaker 3: is not prompt engineering but context engineering. So you 622 00:40:54,640 --> 00:40:59,880 Speaker 3: want to control everything around the AI, 623 00:41:00,920 --> 00:41:04,960 Speaker 3: because you want to limit it to being that linguistic calculator. 624 00:41:05,360 --> 00:41:08,640 Speaker 3: You want it to be your operating system. That's really 625 00:41:08,680 --> 00:41:11,480 Speaker 3: the way to think about it. And if you do that, then 626 00:41:12,239 --> 00:41:15,040 Speaker 3: it will give you the same answer twice for the 627 00:41:15,080 --> 00:41:18,279 Speaker 3: same inputs and the same question. That's the crucial thing 628 00:41:18,360 --> 00:41:20,719 Speaker 3: for this to be true, for it to be consistent, 629 00:41:20,840 --> 00:41:23,560 Speaker 3: for it to be real signal. So this is a 630 00:41:23,680 --> 00:41:27,360 Speaker 3: human-directed process. You can't ask AI an open-ended question. 631 00:41:27,840 --> 00:41:30,759 Speaker 3: You have to control all the inputs. You have to 632 00:41:30,800 --> 00:41:33,759 Speaker 3: control the output, meaning you judge it. You run it 633 00:41:33,800 --> 00:41:36,759 Speaker 3: back through a different AI system to ask, 634 00:41:36,800 --> 00:41:39,600 Speaker 3: did it go off the rails here? But the 635 00:41:39,640 --> 00:41:43,480 Speaker 3: most important thing is you have to give it the scaffolding. 636 00:41:43,560 --> 00:41:47,120 Speaker 3: You have to give it the skeleton.
You have 637 00:41:47,200 --> 00:41:50,160 Speaker 3: to tell it, these are the thoughts you are allowed 638 00:41:50,200 --> 00:41:53,839 Speaker 3: to think, and those are the signatures. 639 00:41:54,040 --> 00:41:59,120 Speaker 2: I've kind of learned I have to avoid asking questions 640 00:41:59,160 --> 00:42:06,560 Speaker 2: that have a human emotional subtext, like, tell me what 641 00:42:06,680 --> 00:42:09,360 Speaker 2: was most surprising about this. It doesn't know what a 642 00:42:09,440 --> 00:42:12,799 Speaker 2: surprise is. Tell me what was most interesting about this. 643 00:42:13,719 --> 00:42:16,440 Speaker 2: It's not able to do that. You really have to 644 00:42:16,600 --> 00:42:18,240 Speaker 2: treat it like it's a dumb machine. 645 00:42:18,520 --> 00:42:20,920 Speaker 3: Well, that's right. I mean, you treat it, 646 00:42:21,200 --> 00:42:25,080 Speaker 3: you need to treat it as an operating system. You 647 00:42:25,160 --> 00:42:28,319 Speaker 3: need to constrain every bit about it, particularly in how 648 00:42:28,400 --> 00:42:30,920 Speaker 3: you allow it to think, because it wants to please 649 00:42:30,960 --> 00:42:33,799 Speaker 3: you so badly. It does. So if you ask it 650 00:42:33,840 --> 00:42:36,920 Speaker 3: what's interesting, it will look back at its history of 651 00:42:36,960 --> 00:42:40,800 Speaker 3: communication with you, and it'll think, what will Barry find interesting? 652 00:42:41,520 --> 00:42:43,120 Speaker 3: And it will give that answer to you. And if 653 00:42:43,120 --> 00:42:45,520 Speaker 3: it can't find it easily, it'll make it up. It'll 654 00:42:45,520 --> 00:42:48,000 Speaker 3: make up an answer that you will find interesting.
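The genie-in-the-bottle discipline described above can be sketched as a constraint pattern: you supply the text, you enumerate the only labels the model may choose from, and you validate the reply before trusting it. The narrative labels and the prompt wording below are hypothetical; this shows the shape of the constraint, not Percion's pipeline.

```python
# Sketch of constrained narrative classification ("context engineering"):
# enumerate the allowed answers, hand the model the data, validate output.
# ALLOWED_NARRATIVES is a hypothetical label set for illustration.

ALLOWED_NARRATIVES = [
    "management change", "top-line growth", "industry consolidation",
    "catalyst: new product", "none of the above",
]

def build_prompt(article_text: str) -> str:
    """Assemble a prompt that boxes the model into a fixed label set."""
    labels = "\n".join(f"- {n}" for n in ALLOWED_NARRATIVES)
    return (
        "Classify the article using ONLY one label from this list. "
        "Do not search for outside information. Reply with the label only.\n"
        f"{labels}\n\nARTICLE:\n{article_text}"
    )

def validate(reply: str) -> str:
    """Judge the output: reject anything outside the allowed set."""
    label = reply.strip().lower()
    if label not in ALLOWED_NARRATIVES:
        raise ValueError(f"model went off the rails: {reply!r}")
    return label

prompt = build_prompt("New CEO announced; board cites turnaround plan.")
assert "ONLY one label" in prompt
assert validate("management change") == "management change"
```

Because both inputs and outputs are fixed in advance, the same article and question should yield the same label run after run, which is the consistency Hunt says makes the result real signal.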
655 00:42:48,440 --> 00:42:52,920 Speaker 2: I find, when I try and prompt, hey, tell me 656 00:42:52,960 --> 00:42:57,279 Speaker 2: about Ben Hunt's background and give me the timeline of 657 00:42:57,320 --> 00:43:00,879 Speaker 2: his career, that it's good at that. Hey, what was Ben 658 00:43:00,960 --> 00:43:04,280 Speaker 2: Hunt really good at? It has got no idea. 659 00:43:04,960 --> 00:43:07,920 Speaker 3: So what you're able to do, if you're able to, 660 00:43:08,280 --> 00:43:12,880 Speaker 3: again, put the genie in the bottle, tell it how 661 00:43:12,920 --> 00:43:16,120 Speaker 3: to think about a problem, is you're able to identify, 662 00:43:17,200 --> 00:43:19,720 Speaker 3: and this does go back to the work from thirty-five 663 00:43:19,760 --> 00:43:23,840 Speaker 3: years ago, the type of stories we tell, that 664 00:43:23,880 --> 00:43:28,879 Speaker 3: we humans tell, about stocks or politics. We tell two 665 00:43:28,920 --> 00:43:35,279 Speaker 3: types of stories. We tell descriptive stories. Oh, you 666 00:43:35,320 --> 00:43:40,160 Speaker 3: know, the Fed cut rates by twenty-five basis points today. 667 00:43:40,239 --> 00:43:44,080 Speaker 3: A descriptive story. And we can also tell the description 668 00:43:44,120 --> 00:43:45,160 Speaker 3: of the, you know... 669 00:43:45,840 --> 00:43:51,680 Speaker 2: It's the because clause: because we're seeing increasing layoffs 670 00:43:51,719 --> 00:43:52,840 Speaker 2: and a slowing consumer. 671 00:43:53,120 --> 00:44:01,040 Speaker 3: And that's high resolution and very descriptive, right? Powell 672 00:44:01,120 --> 00:44:07,600 Speaker 3: was surprisingly hawkish today, and he was. That's a description.
673 00:44:08,480 --> 00:44:12,080 Speaker 3: There's another type of story we tell, Barry, and that's prescriptive, 674 00:44:13,360 --> 00:44:19,240 Speaker 3: meaning the Fed should be hawkish, the Fed should 675 00:44:19,360 --> 00:44:23,840 Speaker 3: cut by twenty-five basis points. Those are the stories 676 00:44:24,400 --> 00:44:29,880 Speaker 3: that are indicative of an effort being made to move 677 00:44:30,040 --> 00:44:35,600 Speaker 3: public opinion in a certain direction. That's like the forward 678 00:44:35,680 --> 00:44:40,480 Speaker 3: guidance that the Fed still does, right? Using their words 679 00:44:40,520 --> 00:44:44,680 Speaker 3: for effect. They're using words to nudge you in how 680 00:44:44,719 --> 00:44:47,000 Speaker 3: you should think about the world. How to think about 681 00:44:47,000 --> 00:44:50,080 Speaker 3: the world. So the crucial thing, when we're doing these, 682 00:44:50,640 --> 00:44:54,640 Speaker 3: when we're asking the AI, here's all the text 683 00:44:54,760 --> 00:44:57,879 Speaker 3: in the world, here are the stories that we want 684 00:44:57,920 --> 00:45:01,160 Speaker 3: you to identify, we can also boil that down 685 00:45:01,239 --> 00:45:07,920 Speaker 3: into: identify the stories that are trying to tell the 686 00:45:07,960 --> 00:45:12,239 Speaker 3: reader how they should think or how policy should go. 687 00:45:12,960 --> 00:45:18,160 Speaker 3: That, we find, has a lot of predictive capability to it. 688 00:45:18,800 --> 00:45:24,919 Speaker 2: So you're in the business of analyzing the world's narratives 689 00:45:25,000 --> 00:45:25,719 Speaker 2: every day. 690 00:45:26,520 --> 00:45:27,840 Speaker 2: How is that even possible? 691 00:45:27,880 --> 00:45:31,440 Speaker 2: It seems like that is an impossibly crazy... 692 00:45:32,760 --> 00:45:36,719 Speaker 3: Crazy. And it used to be. It used to be crazy.
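The descriptive-versus-prescriptive split Hunt describes can be approximated crudely: prescriptive stories lean on modal verbs that tell the reader how things should go. A real system would use an LLM with a controlled rubric, as discussed above; this keyword check is only an illustrative stand-in.

```python
# Crude illustrative heuristic for the descriptive/prescriptive split:
# flag sentences whose modal verbs push the reader toward a policy view.
# The marker list is an assumption, not Percion's actual classifier.
import re

PRESCRIPTIVE_MARKERS = re.compile(
    r"\b(should|must|ought to|needs? to|has to|have to)\b", re.IGNORECASE
)

def story_type(sentence: str) -> str:
    """Label a sentence as 'prescriptive' or 'descriptive'."""
    return "prescriptive" if PRESCRIPTIVE_MARKERS.search(sentence) else "descriptive"

assert story_type("The Fed cut rates by 25 basis points today.") == "descriptive"
assert story_type("The Fed should cut by 25 basis points.") == "prescriptive"
```

The point of the split is the one made in the conversation: the prescriptive bucket, the stories trying to move opinion, is where the predictive signal tends to live.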
I mean, Barry, I, 693 00:45:36,719 --> 00:45:39,760 Speaker 3: I really do remember, in the academic days, I would 694 00:45:40,040 --> 00:45:43,279 Speaker 3: literally hire grad students and give them a cup of 695 00:45:43,360 --> 00:45:46,720 Speaker 3: dimes so they'd go down to the microfiche machine. Remember 696 00:45:46,760 --> 00:45:51,280 Speaker 3: those machines? Yes, remember those. I would code, 697 00:45:51,400 --> 00:45:56,160 Speaker 3: hand code, or hand record the coded data. I 698 00:45:56,200 --> 00:46:02,440 Speaker 3: would type it in through remote access to a Digital Equipment 699 00:46:02,560 --> 00:46:06,759 Speaker 3: minicomputer, and the next day, you know, something would 700 00:46:06,840 --> 00:46:11,480 Speaker 3: churn out for me. Today, we're processing 701 00:46:11,600 --> 00:46:15,440 Speaker 3: hundreds of billions of tokens. We get millions of documents 702 00:46:15,480 --> 00:46:19,680 Speaker 3: overnight, just like that. We have infinite computing resources 703 00:46:19,719 --> 00:46:20,440 Speaker 3: available to us. 704 00:46:20,600 --> 00:46:22,880 Speaker 2: Is there anything you can't access that you wish you 705 00:46:22,880 --> 00:46:23,640 Speaker 2: had access to? 706 00:46:26,520 --> 00:46:31,000 Speaker 3: Any data source? So one of the things that's happened 707 00:46:31,040 --> 00:46:34,759 Speaker 3: on Wall Street is that the banks and the sell 708 00:46:34,880 --> 00:46:39,480 Speaker 3: side have become very jealous of their publications, because they 709 00:46:39,520 --> 00:46:44,000 Speaker 3: tend to think, and they're wrong, that they're good at it, right, 710 00:46:44,320 --> 00:46:48,719 Speaker 3: that they're good at analysis. I personally don't think they are. 711 00:46:48,920 --> 00:46:51,960 Speaker 2: Well, let's just say some are better than others. 712 00:46:52,000 --> 00:46:55,320 Speaker 3: Some are better than others, but none of them are really.
713 00:46:57,680 --> 00:47:00,799 Speaker 3: If they were, if there was, call it, kind 714 00:47:00,800 --> 00:47:05,279 Speaker 3: of significant alpha there, they wouldn't be a bank, they'd 715 00:47:05,280 --> 00:47:06,920 Speaker 3: be a hedge fund. Yeah, they wouldn't be on 716 00:47:07,400 --> 00:47:11,279 Speaker 3: the sell side now. So I'm interested in reading the 717 00:47:11,320 --> 00:47:13,600 Speaker 3: sell side research not because I think there's some nugget 718 00:47:13,680 --> 00:47:16,560 Speaker 3: of truth in there, but because I want to see 719 00:47:16,600 --> 00:47:17,839 Speaker 3: what they're all talking about. 720 00:47:17,960 --> 00:47:22,759 Speaker 2: Right, it's reflective of, if not a consensus, certainly a 721 00:47:23,000 --> 00:47:25,239 Speaker 2: popular set of ideas. 722 00:47:24,800 --> 00:47:27,279 Speaker 3: And this is a crucial thing to talk about in what 723 00:47:28,000 --> 00:47:30,920 Speaker 3: we do. I don't know what the truth is, right, 724 00:47:31,080 --> 00:47:31,800 Speaker 3: I have no idea. 725 00:47:31,840 --> 00:47:32,480 Speaker 2: Does it matter? 726 00:47:32,960 --> 00:47:36,600 Speaker 3: And I don't think it matters. I want to provide 727 00:47:36,680 --> 00:47:39,160 Speaker 3: this information to people who do have a view on 728 00:47:39,200 --> 00:47:42,480 Speaker 3: the truth. Let's say you're a value investor, right, 729 00:47:42,480 --> 00:47:45,080 Speaker 3: you're running a fund. You've got your views, you've done 730 00:47:45,320 --> 00:47:48,000 Speaker 3: your homework, you've done your research. 731 00:47:47,840 --> 00:47:49,000 Speaker 2: You've got a good back test. 732 00:47:49,160 --> 00:47:51,600 Speaker 3: Yeah, yeah, yeah, you've got.
So I say, I've got, 733 00:47:52,080 --> 00:47:56,080 Speaker 3: here are the companies where I think I've identified something special, 734 00:47:56,400 --> 00:48:01,520 Speaker 3: something that's valuable, that the market does not recognize, and 735 00:48:01,560 --> 00:48:03,840 Speaker 3: so I want to buy it. And then I'm just 736 00:48:03,880 --> 00:48:09,080 Speaker 3: gonna wait. I gotta wait until one day the market realizes, 737 00:48:09,120 --> 00:48:11,440 Speaker 3: the market comes to its senses and says, oh wow, 738 00:48:12,280 --> 00:48:14,600 Speaker 3: that should trade at a higher multiple or a higher 739 00:48:14,640 --> 00:48:19,200 Speaker 3: price, because that special thing that you saw, that source 740 00:48:19,200 --> 00:48:25,319 Speaker 3: of value, the rest of the world comes to see that. Well, 741 00:48:25,520 --> 00:48:28,080 Speaker 3: what I think I can show you is when 742 00:48:28,120 --> 00:48:29,920 Speaker 3: the rest of the world starts to wake up to 743 00:48:29,960 --> 00:48:31,040 Speaker 3: whatever it is you're looking for. 744 00:48:31,160 --> 00:48:35,200 Speaker 2: So you're catching the early lift off the bottom. I am. 745 00:48:35,280 --> 00:48:40,240 Speaker 3: So a value investment only works when the market 746 00:48:40,320 --> 00:48:44,440 Speaker 3: recognizes it, and we are tracking when something like that 747 00:48:44,480 --> 00:48:46,000 Speaker 3: gets discovered by the market. 748 00:48:46,880 --> 00:48:50,280 Speaker 2: So before Nvidia is five trillion, when it starts 749 00:48:50,360 --> 00:48:54,520 Speaker 2: ramping up to five hundred billion, hey, something's going on here.
750 00:48:57,920 --> 00:49:02,239 Speaker 3: What I found in my experience as an investor is 751 00:49:02,239 --> 00:49:05,200 Speaker 3: that you make the most money on a trade during 752 00:49:05,239 --> 00:49:07,680 Speaker 3: what I call the discovery phase of a trade, 753 00:49:08,400 --> 00:49:11,800 Speaker 3: when the rest of the world wakes up to something 754 00:49:11,840 --> 00:49:15,960 Speaker 3: that you had noticed and identified before. That's when the 755 00:49:16,040 --> 00:49:17,840 Speaker 3: money is made. Once it gets. 756 00:49:17,640 --> 00:49:23,560 Speaker 2: Out there and it's reflected, then the market is eventually efficient. 757 00:49:23,680 --> 00:49:26,600 Speaker 3: It's a different risk and reward profile. 758 00:49:26,760 --> 00:49:28,719 Speaker 3: Let me put it that way. You 759 00:49:28,760 --> 00:49:34,280 Speaker 3: get a lot more ups and downs post 760 00:49:34,400 --> 00:49:38,200 Speaker 3: that discovery phase than you do when you're enjoying the 761 00:49:38,239 --> 00:49:41,279 Speaker 3: discovery phase. It's a lot harder once you're there. Now 762 00:49:42,080 --> 00:49:45,239 Speaker 3: we miss, oh, the next catalyst, and then say what. 763 00:49:45,400 --> 00:49:48,879 Speaker 2: But that discovery phase is, I'm going to quote Doug 764 00:49:49,000 --> 00:49:53,080 Speaker 2: Kass, there's a phrase he uses. 765 00:49:53,320 --> 00:49:53,880 Speaker 1: Uh. 766 00:49:54,360 --> 00:50:00,279 Speaker 2: It's like a contrarian perspective, a variant perception. If 767 00:50:00,400 --> 00:50:03,759 Speaker 2: you have some insight that is not widely held, right, 768 00:50:03,960 --> 00:50:09,000 Speaker 2: and the market being mostly, kind of, sort of, eventually efficient, 769 00:50:09,560 --> 00:50:12,760 Speaker 2: if there's a truth or at least a good story 770 00:50:13,160 --> 00:50:17,000 Speaker 2: in your variant perception.
771 00:50:15,520 --> 00:50:17,480 Speaker 3: All it takes is a story. 772 00:50:17,480 --> 00:50:18,560 Speaker 2: It's got to be a good story. 773 00:50:18,600 --> 00:50:20,200 Speaker 3: Though, yeah, it's got to be a, well, I'll call 774 00:50:20,239 --> 00:50:22,279 Speaker 3: it, it's got to be a compelling story, right, right. 775 00:50:22,320 --> 00:50:27,399 Speaker 3: And that's why you're looking for a CEO who 776 00:50:27,440 --> 00:50:30,920 Speaker 3: can tell that compelling story. You're looking for the ability 777 00:50:30,960 --> 00:50:34,600 Speaker 3: to tell a compelling story, because that is what gives 778 00:50:34,640 --> 00:50:37,080 Speaker 3: you a multiple. So it's story. 779 00:50:37,280 --> 00:50:41,960 Speaker 2: So let me ask you a few more questions on 780 00:50:42,920 --> 00:50:46,440 Speaker 2: Percion before we go into a few other areas. So, first, 781 00:50:47,000 --> 00:50:49,400 Speaker 2: you've been doing this for seven years. What are you 782 00:50:49,440 --> 00:50:54,439 Speaker 2: doing today that was unimaginable four, five, six, seven years ago? 783 00:50:55,000 --> 00:50:58,080 Speaker 3: I'll tell you something that was unimaginable really two years ago. 784 00:50:58,120 --> 00:51:02,839 Speaker 3: That's amazing, and that is the AI. So what we 785 00:51:02,840 --> 00:51:05,040 Speaker 3: were doing in our early days, we were basically doing 786 00:51:05,120 --> 00:51:08,360 Speaker 3: small language models. We were making the language models essentially 787 00:51:08,360 --> 00:51:12,719 Speaker 3: by hand. And you know, we did our first models 788 00:51:12,760 --> 00:51:14,960 Speaker 3: and our first versions of this, and we licensed it 789 00:51:15,120 --> 00:51:20,440 Speaker 3: to some big banks. And here's the problem, Barry. We had 790 00:51:20,480 --> 00:51:24,080 Speaker 3: constructed a net, and it was a pretty good net.
791 00:51:24,280 --> 00:51:26,960 Speaker 3: I mean, when we would dip it into a data stream, 792 00:51:27,320 --> 00:51:30,440 Speaker 3: we'd catch a fish, meaning a signal, and that signal 793 00:51:31,080 --> 00:51:34,759 Speaker 3: worked, you know, a good hit on the 794 00:51:35,080 --> 00:51:37,960 Speaker 3: signal that we put up. The problem was we were 795 00:51:37,960 --> 00:51:41,440 Speaker 3: missing too many damn fish. Our net was too small, 796 00:51:41,520 --> 00:51:43,759 Speaker 3: and we'd say, oh, we got this one, and then 797 00:51:44,520 --> 00:51:46,960 Speaker 3: we'd look back at however it was we were designing 798 00:51:47,000 --> 00:51:49,040 Speaker 3: the models and say, well, how did we miss all these 799 00:51:49,040 --> 00:51:53,320 Speaker 3: other fish? And the answer was the small language model 800 00:51:53,400 --> 00:52:00,279 Speaker 3: we were constructing. It's incredibly complex if you're building these 801 00:52:00,320 --> 00:52:07,839 Speaker 3: models without the probabilistic approach that modern LLMs allow you 802 00:52:07,920 --> 00:52:10,840 Speaker 3: to take. So this is the whole notion of embedding, 803 00:52:10,960 --> 00:52:14,799 Speaker 3: that there are a million ways to say I'm 804 00:52:14,840 --> 00:52:18,920 Speaker 3: bullish about the management change at company XYZ. There are a 805 00:52:19,040 --> 00:52:21,719 Speaker 3: million ways you can say that. And if you're in 806 00:52:21,760 --> 00:52:25,000 Speaker 3: the business, as we were, of kind of handcrafting, let's 807 00:52:25,040 --> 00:52:27,920 Speaker 3: write down all the ways you can say that, really 808 00:52:28,040 --> 00:52:31,120 Speaker 3: you miss a lot. You've made a small net, and 809 00:52:31,880 --> 00:52:35,640 Speaker 3: what you're training on is a small data set, and it's 810 00:52:35,640 --> 00:52:39,759 Speaker 3: a pretty small data set.
So we rebuilt all of 811 00:52:39,800 --> 00:52:46,279 Speaker 3: our software using AI as an operating system, context 812 00:52:46,320 --> 00:52:52,160 Speaker 3: engineering, controlling how you dole out the text data, how 813 00:52:52,200 --> 00:52:54,360 Speaker 3: you test it, how you allow it to think. And 814 00:52:54,360 --> 00:52:57,120 Speaker 3: we thought, all right, you know what, we're building a 815 00:52:57,120 --> 00:53:01,160 Speaker 3: bigger net. I bet we see a five, maybe even 816 00:53:01,280 --> 00:53:07,440 Speaker 3: six x improvement in our signal. It was over one hundred x. 817 00:53:07,520 --> 00:53:08,440 Speaker 2: That's unbelievable. 818 00:53:08,480 --> 00:53:10,600 Speaker 3: Over one hundred x, and now we've 819 00:53:10,600 --> 00:53:14,400 Speaker 3: had a multiple of that again. It is 820 00:53:14,440 --> 00:53:17,520 Speaker 3: hard to describe. The expansion 821 00:53:17,560 --> 00:53:21,520 Speaker 3: of the net is in so many different directions. Everything 822 00:53:21,520 --> 00:53:24,600 Speaker 3: we were doing before was just English language, right. 823 00:53:24,760 --> 00:53:27,160 Speaker 2: So now you've gone global, and how many languages 824 00:53:27,239 --> 00:53:28,560 Speaker 2: are you pulling in? 825 00:53:28,719 --> 00:53:33,120 Speaker 3: Anything, anything that an AI has been trained on, we 826 00:53:33,239 --> 00:53:35,040 Speaker 3: read it. 827 00:53:35,080 --> 00:53:37,319 Speaker 2: Can we say you've got forty languages? Is that like 828 00:53:37,360 --> 00:53:39,200 Speaker 3: how many? Well, there are only, you know, there are 829 00:53:39,239 --> 00:53:42,799 Speaker 3: about a dozen languages that are useful in 830 00:53:43,000 --> 00:53:47,280 Speaker 2: markets. Totally. Japanese, Chinese, a variety of European languages.
831 00:53:47,320 --> 00:53:50,360 Speaker 3: So we can tell it, here's the, here's 832 00:53:50,400 --> 00:53:55,319 Speaker 3: a story, here's the story about Chinese domestic markets. There's 833 00:53:55,360 --> 00:53:59,160 Speaker 3: a Western story, Western narratives, and there's a domestic Chinese story, 834 00:53:59,200 --> 00:54:02,680 Speaker 3: and the domestic Chinese story is very different. Often it's 835 00:54:02,800 --> 00:54:05,200 Speaker 3: very different. So we were able, for example, picking up, 836 00:54:05,480 --> 00:54:09,959 Speaker 3: we were able to pick up, way before it got 837 00:54:10,000 --> 00:54:17,200 Speaker 3: picked up in the Western press, that the demand for luxury goods 838 00:54:17,280 --> 00:54:23,279 Speaker 3: in China went off a cliff last November. 839 00:54:23,800 --> 00:54:26,680 Speaker 2: Listen, you can't go from double digit GDP to 840 00:54:26,960 --> 00:54:29,960 Speaker 2: like low to mid single digits and not have an effect, 841 00:54:30,400 --> 00:54:31,640 Speaker 2: especially in a country like that. 842 00:54:31,680 --> 00:54:35,799 Speaker 3: Well, but it's interesting, right, because they've had, you know, 843 00:54:36,000 --> 00:54:38,640 Speaker 3: ups and downs in the business cycle before, but this is 844 00:54:38,719 --> 00:54:43,240 Speaker 3: the first time where you saw consumer behavior that really 845 00:54:43,320 --> 00:54:45,040 Speaker 3: was kind of similar to what you might find in 846 00:54:45,080 --> 00:54:48,480 Speaker 3: Western consumer behavior. Point being, you didn't hear that 847 00:54:48,560 --> 00:54:53,960 Speaker 3: from LVMH or the Macau gaming guys until February, and 848 00:54:54,000 --> 00:54:57,400 Speaker 3: we were picking that up in November from the domestic 849 00:54:57,800 --> 00:54:58,560 Speaker 3: Chinese media. 850 00:54:58,760 --> 00:55:02,440 Speaker 2: So that raises a really interesting question.
You know, at 851 00:55:02,440 --> 00:55:04,560 Speaker 2: the end of the day, clients want to be able 852 00:55:04,600 --> 00:55:09,200 Speaker 2: to make money on your research. How are people putting 853 00:55:09,239 --> 00:55:13,359 Speaker 2: this to work? How does what you're building help your 854 00:55:13,360 --> 00:55:14,920 Speaker 2: clients generate alpha? 855 00:55:15,600 --> 00:55:21,360 Speaker 3: What we think we've discovered is an entirely new source 856 00:55:21,520 --> 00:55:25,160 Speaker 3: of data, and what we're confident of is that we've built 857 00:55:25,239 --> 00:55:28,600 Speaker 3: systems that are very different from the 858 00:55:28,600 --> 00:55:32,200 Speaker 3: sort of analytics that are out there from 859 00:55:32,200 --> 00:55:35,239 Speaker 3: anyone else. Right? So this is not sentiment, right? We're 860 00:55:35,239 --> 00:55:37,120 Speaker 3: not tracking or using mean words or nice words. 861 00:55:37,040 --> 00:55:38,440 Speaker 2: By the way, that was a big thing, I don't 862 00:55:38,440 --> 00:55:42,680 Speaker 2: know, was that ten years ago, scanning Twitter to 863 00:55:42,760 --> 00:55:44,160 Speaker 2: identify investor sentiment? 864 00:55:44,280 --> 00:55:46,200 Speaker 3: Oh my god. I mean, I was just looking 865 00:55:46,239 --> 00:55:48,880 Speaker 3: today, and not to pick on Bloomberg, but you know, 866 00:55:50,120 --> 00:55:55,960 Speaker 3: it was their live coverage of the Fed statement, right, 867 00:55:56,000 --> 00:56:00,720 Speaker 3: and they were analyzing the sentences in the statement 868 00:56:00,840 --> 00:56:02,560 Speaker 3: for hawkish and dovish sentiment. 869 00:56:02,680 --> 00:56:06,080 Speaker 2: How it changed from the last meeting. And so 870 00:56:06,120 --> 00:56:08,000 Speaker 2: you're saying there's not a whole lot of signal there.
871 00:56:09,239 --> 00:56:10,960 Speaker 3: I want to be careful with what I say, right, 872 00:56:11,080 --> 00:56:15,759 Speaker 3: which is that there are a number of, I'll call 873 00:56:15,760 --> 00:56:20,400 Speaker 3: them, high frequency stat arb guys, where if it's important 874 00:56:20,400 --> 00:56:25,279 Speaker 3: for you to note the difference in word choice on 875 00:56:25,480 --> 00:56:29,400 Speaker 3: a millisecond level, I think you can 876 00:56:29,400 --> 00:56:33,200 Speaker 3: get something out of that, I do, and so there 877 00:56:33,200 --> 00:56:35,680 Speaker 3: are firms that can do that very well. That's not 878 00:56:35,760 --> 00:56:39,560 Speaker 3: our game, right. What we're trying 879 00:56:39,600 --> 00:56:42,880 Speaker 3: to identify is not just sentiment, not just word choice. 880 00:56:43,360 --> 00:56:46,160 Speaker 3: This is not Google Trends and how many times did 881 00:56:46,960 --> 00:56:51,759 Speaker 3: they mention AI in the earnings report. Right? We're 882 00:56:51,800 --> 00:56:54,680 Speaker 3: able to track the actual stories that drive behavior. 883 00:56:55,040 --> 00:56:59,120 Speaker 2: So that's the next question. And I'm gonna give 884 00:56:59,200 --> 00:57:04,520 Speaker 2: Dave Nadig credit for asking this. Does every narrative 885 00:57:04,760 --> 00:57:08,080 Speaker 2: turn into a decision? How do you know when something 886 00:57:08,239 --> 00:57:13,840 Speaker 2: is merely noise versus when there's a significant tradable signal? 887 00:57:14,840 --> 00:57:20,560 Speaker 3: I don't, right. So my goal is not to, oh, 888 00:57:21,040 --> 00:57:25,120 Speaker 3: you know, do the trades. My goal is, mister hedge 889 00:57:25,120 --> 00:57:31,400 Speaker 3: fund guy, mister asset allocator, you know China, right, 890 00:57:31,520 --> 00:57:36,600 Speaker 3: you know your companies, you know your commodities.
Here is 891 00:57:37,560 --> 00:57:44,400 Speaker 3: data that I think you'll find useful. The efficacy of it, though, 892 00:57:44,480 --> 00:57:45,919 Speaker 3: is up to you. I don't know what you would 893 00:57:46,000 --> 00:57:47,920 Speaker 3: want to do with it. I want to 894 00:57:47,960 --> 00:57:52,800 Speaker 3: sell the picks and shovels, honestly, for this vast new 895 00:57:52,880 --> 00:57:55,240 Speaker 3: data set that we all know is important, but we 896 00:57:55,360 --> 00:57:58,320 Speaker 3: haven't been able to measure it in a very predictive 897 00:57:58,360 --> 00:57:58,960 Speaker 3: way before. 898 00:57:59,280 --> 00:58:02,800 Speaker 2: So let's talk about a few things related to, and 899 00:58:02,800 --> 00:58:03,360 Speaker 2: it's not just. 900 00:58:03,320 --> 00:58:07,240 Speaker 3: About using this for investment. So we have a product 901 00:58:07,280 --> 00:58:13,120 Speaker 3: for financial advisors. Uh huh, right, which is, your client's 902 00:58:13,160 --> 00:58:17,720 Speaker 3: coming in, you've got a portfolio, you need to be 903 00:58:17,760 --> 00:58:21,520 Speaker 3: able to say, what is my client, what 904 00:58:21,520 --> 00:58:23,640 Speaker 3: are they worried about, what are they nervous about, 905 00:58:23,640 --> 00:58:26,320 Speaker 3: what are they hopeful for, what are the stories they're reading? 906 00:58:26,360 --> 00:58:29,280 Speaker 2: And you're pulling this out of the flow of media, and. 907 00:58:29,240 --> 00:58:32,040 Speaker 3: We can tell you exactly, for the portfolio you've got 908 00:58:32,040 --> 00:58:34,480 Speaker 3: for that client, here's what they're going to be asking about, 909 00:58:35,280 --> 00:58:37,720 Speaker 3: worried about, and here are the answers you can give 910 00:58:37,760 --> 00:58:40,120 Speaker 3: them to show.
This is what has happened before, when 911 00:58:40,120 --> 00:58:42,840 Speaker 3: this story has come up. Stay the course, it's going 912 00:58:42,880 --> 00:58:45,160 Speaker 3: to be fine. This is the sort of stuff we 913 00:58:45,160 --> 00:58:46,520 Speaker 3: can do for financial advisors. 914 00:58:46,640 --> 00:58:48,880 Speaker 2: Huh, that's really interesting. I would love to see 915 00:58:48,880 --> 00:58:49,200 Speaker 2: some of that. 916 00:58:49,400 --> 00:58:52,240 Speaker 3: It's not just in markets, Barry, I got to tell you. 917 00:58:52,160 --> 00:58:56,840 Speaker 2: The policymakers, corporate executives, thinkers. 918 00:58:57,040 --> 00:59:01,160 Speaker 3: So we did, you know, we did this before, in the 919 00:59:01,280 --> 00:59:04,520 Speaker 3: run up to the Russian invasion of Ukraine, and we 920 00:59:04,560 --> 00:59:07,240 Speaker 3: published this on Epsilon Theory. This is my old 921 00:59:07,280 --> 00:59:11,400 Speaker 3: academic work, my book Getting to War. Before a country starts 922 00:59:11,400 --> 00:59:15,400 Speaker 3: a war, they mobilize public opinion. And so we were 923 00:59:15,400 --> 00:59:20,640 Speaker 3: looking at domestic Russian media and saying, friends, this 924 00:59:20,680 --> 00:59:21,560 Speaker 3: isn't going to be. 925 00:59:21,520 --> 00:59:23,200 Speaker 2: A limited thing. 926 00:59:23,320 --> 00:59:25,800 Speaker 3: This is happening. It's going to be a full scale invasion. 927 00:59:26,200 --> 00:59:29,720 Speaker 3: Because that was the messaging in domestic Russian media to 928 00:59:30,120 --> 00:59:36,439 Speaker 3: the Russian people. Wow. So brands, right. You 929 00:59:36,680 --> 00:59:40,000 Speaker 3: know, we're working with some, it sounds great, 930 00:59:40,080 --> 00:59:43,520 Speaker 3: but pro sports teams. You want to tell a story 931 00:59:44,120 --> 00:59:49,720 Speaker 3: to build a stadium, sell some tickets, build a stadium.
Give 932 00:59:49,720 --> 00:59:52,320 Speaker 3: me some examples of people who have told good stories, and 933 00:59:52,400 --> 00:59:54,840 Speaker 3: how did that work for them? How can we do 934 00:59:54,880 --> 00:59:58,280 Speaker 3: the same thing? So looking at storytelling. 935 00:59:58,640 --> 01:00:02,000 Speaker 2: If Jeff Bezos were a subscriber to this back 936 01:00:02,000 --> 01:00:06,440 Speaker 2: when he was trying to build a tax funded HQ 937 01:00:06,960 --> 01:00:10,440 Speaker 2: on the Hudson, had he had your data, and you 938 01:00:10,480 --> 01:00:14,120 Speaker 2: were crunching all the New York City news stories about this, 939 01:00:14,400 --> 01:00:17,440 Speaker 2: might you have been able to give him advice? 940 01:00:18,560 --> 01:00:22,400 Speaker 2: He suffered such a backlash. 941 01:00:21,760 --> 01:00:25,160 Speaker 3: Because, you know, today we have polls, right, which. 942 01:00:24,920 --> 01:00:27,160 Speaker 2: Are terrible, which are terrible, which. 943 01:00:27,000 --> 01:00:30,200 Speaker 3: Are mostly terrible now, you know, and I love the 944 01:00:30,240 --> 01:00:32,240 Speaker 3: Polymarket stuff and other things where you try to get 945 01:00:32,240 --> 01:00:34,640 Speaker 3: as many people as possible to put money on something, right. 946 01:00:34,720 --> 01:00:37,919 Speaker 2: So you know, when you ask a person a question, you're 947 01:00:37,960 --> 01:00:40,800 Speaker 2: asking them, hey, what do you think, and 948 01:00:40,840 --> 01:00:42,560 Speaker 2: what do you think you're going to do in the future, 949 01:00:42,880 --> 01:00:45,320 Speaker 2: when we know people are terrible at both of them. Terrible. 950 01:00:46,040 --> 01:00:47,680 Speaker 2: But if you put a little money on it, all right, 951 01:00:47,760 --> 01:00:50,680 Speaker 2: maybe they might be a little more circumspect.
952 01:00:50,800 --> 01:00:53,280 Speaker 3: That helps a ton. So in places 953 01:00:53,320 --> 01:00:55,840 Speaker 3: where you can make a bet, I think that 954 01:00:55,960 --> 01:00:59,080 Speaker 3: improves the kind of information you can get. It still 955 01:00:59,160 --> 01:01:02,000 Speaker 3: lends itself to a lot of manipulation and a lot of. 956 01:01:04,160 --> 01:01:08,240 Speaker 2: Polymarket and Kalshi and all those things. It's not 957 01:01:08,320 --> 01:01:11,640 Speaker 2: the bond market. It's not one hundred and something trillion dollars. 958 01:01:12,080 --> 01:01:13,880 Speaker 2: It's a couple of bucks on each of these, and 959 01:01:13,920 --> 01:01:16,960 Speaker 2: sometimes, what, a million dollars moves them. Half a million dollars 960 01:01:17,000 --> 01:01:17,600 Speaker 2: can move the market. 961 01:01:17,680 --> 01:01:19,160 Speaker 3: But for a lot of things, let's say you're polling 962 01:01:19,200 --> 01:01:22,040 Speaker 3: for a political candidate, right, you can't ask them to 963 01:01:22,040 --> 01:01:25,480 Speaker 3: put money down on something, but you really want to 964 01:01:25,520 --> 01:01:29,960 Speaker 3: know, these policies that my candidate is thinking about 965 01:01:30,200 --> 01:01:33,920 Speaker 3: taking on, is that popular, does it resonate, is it 966 01:01:33,920 --> 01:01:37,920 Speaker 3: a compelling story? We can absolutely see if that is 967 01:01:38,000 --> 01:01:42,000 Speaker 3: true by looking at local media, local social media, all 968 01:01:42,040 --> 01:01:45,840 Speaker 3: of that. So it's pretty wild, Barry.
I mean, 969 01:01:45,880 --> 01:01:50,720 Speaker 3: once you start looking at how important stories are, and 970 01:01:50,760 --> 01:01:53,000 Speaker 3: once you've got a tool where you can actually measure 971 01:01:53,040 --> 01:01:57,440 Speaker 3: them and visualize them, it's like, I feel 972 01:01:57,440 --> 01:02:00,520 Speaker 3: like it was like when, whoever it was, 973 01:02:00,560 --> 01:02:04,040 Speaker 3: van Leeuwenhoek, invented the microscope, when you're actually able to see something 974 01:02:04,080 --> 01:02:06,400 Speaker 3: that we all know is there. Well, now we do. 975 01:02:06,520 --> 01:02:09,880 Speaker 2: Back then, remember, germ theory took what, a century to 976 01:02:09,920 --> 01:02:10,280 Speaker 2: catch up? 977 01:02:10,480 --> 01:02:12,560 Speaker 3: It took a century. And it takes seeing 978 01:02:12,600 --> 01:02:15,240 Speaker 3: it, right. It takes the instrument to actually measure it 979 01:02:15,280 --> 01:02:19,320 Speaker 3: before you actually believe in it. And so I feel 980 01:02:19,320 --> 01:02:21,080 Speaker 3: like that's kind of where we are right now. It's 981 01:02:21,120 --> 01:02:24,320 Speaker 3: these early days. But to actually see and measure the 982 01:02:24,360 --> 01:02:28,960 Speaker 3: storytelling at this level of resolution, this magnification, it's pretty 983 01:02:28,960 --> 01:02:29,600 Speaker 3: freaking cool. 984 01:02:29,720 --> 01:02:32,400 Speaker 2: So we're having this conversation with markets at all 985 01:02:32,480 --> 01:02:37,240 Speaker 2: time highs, and you've written about the ravine. Yes. Tell 986 01:02:37,320 --> 01:02:40,040 Speaker 2: us a little bit about, what is the ravine? How 987 01:02:40,080 --> 01:02:46,120 Speaker 2: does your data identify that? Tell us what this means.
988 01:02:46,840 --> 01:02:50,120 Speaker 3: Well, there's clearly been a change in policy 989 01:02:50,320 --> 01:02:53,680 Speaker 3: regime out of Washington and. 990 01:02:54,120 --> 01:02:56,680 Speaker 2: A new administration that's radically different. 991 01:02:56,480 --> 01:02:57,720 Speaker 3: Radically different. 992 01:02:57,640 --> 01:03:00,160 Speaker 2: Even from the first Trump presidency. 993 01:03:00,160 --> 01:03:04,880 Speaker 3: From the first presidency. And so what we're able to measure, 994 01:03:05,240 --> 01:03:09,840 Speaker 3: really measure, is how does that, I'll say, play out. But 995 01:03:09,960 --> 01:03:17,760 Speaker 3: also what comes next. Narratives never happen in a vacuum. There's 996 01:03:17,800 --> 01:03:21,400 Speaker 3: always a counter story. For every bull story, there's a 997 01:03:21,400 --> 01:03:25,600 Speaker 3: bear story, and you can almost kind 998 01:03:25,600 --> 01:03:29,360 Speaker 3: of see the battlefield of ideas, the battlefield of stories, 999 01:03:29,440 --> 01:03:35,880 Speaker 3: and how it emerges. So my strong sense, Barry, is 1000 01:03:35,920 --> 01:03:41,680 Speaker 3: that we are going, politically in this country, towards 1001 01:03:43,080 --> 01:03:48,160 Speaker 3: trench warfare, greater and greater, I'll call it, narrative violence. 1002 01:03:50,160 --> 01:03:53,600 Speaker 3: And if you look historically at how that plays out 1003 01:03:53,640 --> 01:03:56,800 Speaker 3: in countries, it doesn't play out well, the. 1004 01:03:56,880 --> 01:04:00,000 Speaker 2: Civil war, domestic political violence, things like that. 1005 01:04:01,280 --> 01:04:06,520 Speaker 3: Sadly, yes, exactly like that, exactly like that.
So there's 1006 01:04:06,680 --> 01:04:09,640 Speaker 3: that element, and that's a sad one, 1007 01:04:10,000 --> 01:04:12,640 Speaker 3: or a very troubling one, and trying to, well, how 1008 01:04:12,640 --> 01:04:16,800 Speaker 3: do we navigate that? But even in markets, and 1009 01:04:16,840 --> 01:04:20,320 Speaker 3: I alluded to this earlier, how capital markets have become 1010 01:04:20,360 --> 01:04:27,160 Speaker 3: a political utility, and the role of markets in 1011 01:04:27,200 --> 01:04:30,600 Speaker 3: our society, you can really see how that changes in 1012 01:04:30,680 --> 01:04:34,480 Speaker 3: the stories we tell ourselves about the role of markets, 1013 01:04:34,520 --> 01:04:35,680 Speaker 3: what it's there for. 1014 01:04:37,680 --> 01:04:40,600 Speaker 2: End of day options and speculation, what else 1015 01:04:40,680 --> 01:04:41,720 Speaker 2: is it supposed to be there for? 1016 01:04:44,160 --> 01:04:47,240 Speaker 3: But that's kind of what I'm getting at, right. 1017 01:04:47,720 --> 01:04:52,520 Speaker 3: You can see an enormous change in the meaning of markets, 1018 01:04:53,240 --> 01:04:57,080 Speaker 3: and it connects with, yes, I'll call it, the 1019 01:04:57,120 --> 01:05:02,080 Speaker 3: speculation layer. But it also connects with financial nihilism, YOLO. 1020 01:05:04,240 --> 01:05:12,080 Speaker 3: It leads to a very, I think, less attractive 1021 01:05:12,200 --> 01:05:17,640 Speaker 3: future for how we think about money and the role 1022 01:05:17,720 --> 01:05:20,600 Speaker 3: of markets and the role of capitalism. 1023 01:05:20,640 --> 01:05:25,080 Speaker 2: Coming up, we continue our conversation with Ben Hunt, president 1024 01:05:25,280 --> 01:05:30,240 Speaker 2: and co founder of Percion, discussing how money managers 1025 01:05:30,720 --> 01:05:35,560 Speaker 2: use their output to generate alpha. I'm Barry Ritholtz.
You're 1026 01:05:35,600 --> 01:05:57,520 Speaker 2: listening to Masters in Business on Bloomberg Radio. I'm Barry Ritholtz. 1027 01:05:57,560 --> 01:06:00,640 Speaker 2: You're listening to Masters in Business on Bloomberg Radio. My 1028 01:06:00,840 --> 01:06:04,360 Speaker 2: extra special guest today is Ben Hunt. He is president 1029 01:06:04,440 --> 01:06:11,480 Speaker 2: and co founder of Percion, a data analytics, narrative storytelling, 1030 01:06:12,160 --> 01:06:17,240 Speaker 2: large language model firm using artificial intelligence to find some signal 1031 01:06:17,760 --> 01:06:23,160 Speaker 2: amongst the noise. Their clients range from large 1032 01:06:23,880 --> 01:06:28,720 Speaker 2: money managers to hedge funds, to academics and corporate America. 1033 01:06:29,040 --> 01:06:33,240 Speaker 2: So you write Epsilon Theory, and some of it is 1034 01:06:33,400 --> 01:06:37,120 Speaker 2: for subscribers, some of it is public. One of the 1035 01:06:37,120 --> 01:06:43,560 Speaker 2: things you wrote is absolutely my favorite item from the 1036 01:06:43,600 --> 01:06:48,200 Speaker 2: first few months of the Trump presidency, because I like 1037 01:06:48,240 --> 01:06:50,680 Speaker 2: to talk in probabilities, because I don't know what's going 1038 01:06:50,760 --> 01:06:53,920 Speaker 2: to happen. But here's the best case scenario, here's the 1039 01:06:53,960 --> 01:06:57,800 Speaker 2: worst case, here's all the middle scenarios. You wrote a 1040 01:06:57,840 --> 01:07:02,000 Speaker 2: piece, The End of Pax Americana, that I thought was the 1041 01:07:02,000 --> 01:07:08,120 Speaker 2: most cogent, imaginative, well thought out.
Hey, here's the worst 1042 01:07:08,120 --> 01:07:15,440 Speaker 2: case scenario, and we are dangerously flirting with this 1043 01:07:15,560 --> 01:07:18,720 Speaker 2: possible outcome. And I use that as, all right, so 1044 01:07:18,760 --> 01:07:21,720 Speaker 2: here's what I think is high probability, here's the best case. 1045 01:07:22,120 --> 01:07:24,480 Speaker 2: But if you really want to think about how this 1046 01:07:24,560 --> 01:07:27,280 Speaker 2: can go off the rails, check out what Ben wrote. 1047 01:07:27,680 --> 01:07:30,800 Speaker 2: And so tell us a little bit about how did 1048 01:07:30,960 --> 01:07:37,400 Speaker 2: Percient inform The End of Pax Americana? Because I get 1049 01:07:37,440 --> 01:07:42,720 Speaker 2: the sense that domestically that wasn't really how 1050 01:07:42,720 --> 01:07:44,920 Speaker 2: a lot of people were thinking, but I got the 1051 01:07:44,960 --> 01:07:48,520 Speaker 2: sense from overseas that was a much more common thought. 1052 01:07:48,760 --> 01:07:51,040 Speaker 2: Tell us a little bit about both the piece and 1053 01:07:51,440 --> 01:07:53,680 Speaker 2: how the data informed it. 1054 01:07:55,360 --> 01:07:58,680 Speaker 3: I'll start with, I'll call it, the international dimension. I 1055 01:07:58,680 --> 01:08:00,880 Speaker 3: want to come back to the domestic, and I think 1056 01:08:00,880 --> 01:08:06,120 Speaker 3: we've got some really interesting data recently to share. It 1057 01:08:06,160 --> 01:08:09,400 Speaker 3: goes back to this again. I cringe when I talk about it, 1058 01:08:09,440 --> 01:08:12,280 Speaker 3: but it is game theory, right, and 1059 01:08:12,400 --> 01:08:17,640 Speaker 3: all game theory is, is strategic interaction. It's that the 1060 01:08:18,160 --> 01:08:22,200 Speaker 3: United States, yes, is the most powerful player on the 1061 01:08:22,240 --> 01:08:28,240 Speaker 3: world stage. 
But every country has some degrees of freedom 1062 01:08:28,280 --> 01:08:31,320 Speaker 3: and some autonomy in the policies that they 1063 01:08:31,360 --> 01:08:35,439 Speaker 3: implement. There is not a dominant strategy, meaning 1064 01:08:35,920 --> 01:08:39,880 Speaker 3: an outcome, an equilibrium outcome. Again, I hate using these words, 1065 01:08:39,920 --> 01:08:42,880 Speaker 3: but I'll use them anyway. All an equilibrium means is 1066 01:08:43,240 --> 01:08:47,920 Speaker 3: it's a balancing point where both parties, let's call them 1067 01:08:48,360 --> 01:08:52,880 Speaker 3: the United States and a European country, where do they 1068 01:08:53,000 --> 01:08:54,760 Speaker 3: end up where they all say, okay, I can live 1069 01:08:54,800 --> 01:08:57,840 Speaker 3: with this, I can live with this. And the 1070 01:08:57,920 --> 01:09:00,720 Speaker 3: America First set of policies is one such set 1071 01:09:00,760 --> 01:09:07,440 Speaker 3: of policies. What you end up with is less potential: 1072 01:09:07,880 --> 01:09:13,200 Speaker 3: economic growth, trade, all of these things. 1073 01:09:12,800 --> 01:09:15,160 Speaker 2: All the things that we enjoyed post World War Two. 1074 01:09:15,760 --> 01:09:19,559 Speaker 2: That realignment that inured to our benefit. 1075 01:09:19,520 --> 01:09:25,240 Speaker 3: Enormously, enormously to our advantage. It goes by 1076 01:09:25,240 --> 01:09:28,879 Speaker 3: the name of soft power. And you know, the dollar system, 1077 01:09:29,080 --> 01:09:30,080 Speaker 3: the Bretton Woods system. 1078 01:09:30,160 --> 01:09:32,760 Speaker 2: All of this, reserve currency, on and on. 1079 01:09:32,720 --> 01:09:35,680 Speaker 3: The reserve currency to finance our deficits. All of this is 1080 01:09:35,880 --> 01:09:39,360 Speaker 3: enormously to our advantage. And yes, there are free riders 1081 01:09:39,400 --> 01:09:42,920 Speaker 3: on that system. 
Yes, there are costs to that, particularly 1082 01:09:42,920 --> 01:09:46,360 Speaker 3: on defense and some other areas. Tariffs are 1083 01:09:46,360 --> 01:09:49,439 Speaker 3: another area where there's absolutely free riders on that. So 1084 01:09:49,720 --> 01:09:52,759 Speaker 3: there's improvement that can be made in that system, for sure. 1085 01:09:53,720 --> 01:09:57,479 Speaker 3: But this is a different system. This 1086 01:09:57,600 --> 01:09:59,559 Speaker 3: is a different set of rules of the road, and 1087 01:09:59,600 --> 01:10:03,679 Speaker 3: it leads to a different strategic interaction between countries. 1088 01:10:04,280 --> 01:10:08,719 Speaker 3: So that's what the note was about. And it's 1089 01:10:09,600 --> 01:10:11,960 Speaker 3: I'm not trying to predict. I'm just trying 1090 01:10:11,960 --> 01:10:16,400 Speaker 3: to observe, which is a great line actually by George Soros, 1091 01:10:16,400 --> 01:10:19,880 Speaker 3: which is to say, I'm not predicting, I'm observing, which I 1092 01:10:19,960 --> 01:10:23,960 Speaker 3: love as a line. And our technology 1093 01:10:24,000 --> 01:10:26,559 Speaker 3: is not there to try to predict the future. It's 1094 01:10:26,560 --> 01:10:29,439 Speaker 3: trying to tell you what is true in the present, today. 1095 01:10:31,040 --> 01:10:33,640 Speaker 2: I'm amazed how many people think they can forecast the 1096 01:10:33,640 --> 01:10:35,839 Speaker 2: future when they have no idea what's happening. 1097 01:10:35,960 --> 01:10:38,080 Speaker 3: I want to nowcast, is what 1098 01:10:38,120 --> 01:10:40,519 Speaker 3: I want to do, not forecast. I want to 1099 01:10:40,560 --> 01:10:44,360 Speaker 3: nowcast. And I guess what I wanted to talk 1100 01:10:44,400 --> 01:10:51,360 Speaker 3: about is the domestic side. 
So we started tracking the different narratives around 1101 01:10:51,520 --> 01:10:57,559 Speaker 3: immigration policy early last year, and what you saw in 1102 01:10:57,600 --> 01:10:58,759 Speaker 3: some of the kind of simple... 1103 01:10:59,320 --> 01:11:01,480 Speaker 2: This is during the run-up to the election. 1104 01:11:01,360 --> 01:11:04,760 Speaker 3: Yes, exactly. And what you saw, really going back, 1105 01:11:04,840 --> 01:11:06,760 Speaker 3: because we can take this stuff back 1106 01:11:06,880 --> 01:11:12,000 Speaker 3: a decade or more, and what you absolutely saw up 1107 01:11:12,040 --> 01:11:15,400 Speaker 3: to last October, there was an event from 1108 01:11:15,520 --> 01:11:18,559 Speaker 3: last October, not the election, but an event from last October, 1109 01:11:19,120 --> 01:11:24,280 Speaker 3: you see a steady increase in, regardless of your political affiliation, 1110 01:11:25,040 --> 01:11:29,640 Speaker 3: you know what, immigration isn't working for us as Americans. 1111 01:11:29,920 --> 01:11:37,679 Speaker 3: So a steady increase, starting with the "they're eating the cats." 1112 01:11:37,680 --> 01:11:40,400 Speaker 2: They're eating the cats, they're eating the dogs, they're eating... 1113 01:11:40,160 --> 01:11:45,720 Speaker 3: The dogs. The Columbus moment, when you had that moment. 1114 01:11:45,680 --> 01:11:49,759 Speaker 2: One of the most surreal moments in debate history. 1115 01:11:49,840 --> 01:11:55,280 Speaker 3: And we see very clearly in our media data, and 1116 01:11:55,320 --> 01:11:58,360 Speaker 3: this is not mainstream media, this is everything 1117 01:11:58,720 --> 01:12:04,880 Speaker 3: that we're pulling up. We've seen a really significant decline 1118 01:12:05,160 --> 01:12:09,600 Speaker 3: in the volume and density of oh my god, immigration 1119 01:12:09,760 --> 01:12:13,160 Speaker 3: is a problem, we need mass deportations. 
On the contrary, 1120 01:12:13,240 --> 01:12:17,320 Speaker 3: we've seen an enormous increase, again regardless of political affiliation, 1121 01:12:17,439 --> 01:12:22,400 Speaker 3: including Republicans, in no, immigration is a good thing for this country. 1122 01:12:23,560 --> 01:12:27,879 Speaker 3: The stories of America that are pro-immigrant and pro-immigration. 1123 01:12:28,240 --> 01:12:31,120 Speaker 3: Now, you would not believe that if you were looking 1124 01:12:31,160 --> 01:12:35,040 Speaker 3: at the policies that the White House has implemented during 1125 01:12:35,080 --> 01:12:38,080 Speaker 3: these first, you know, ten months of the administration. 1126 01:12:38,400 --> 01:12:45,280 Speaker 2: So wait, when did you first notice that, regardless of policy, 1127 01:12:45,320 --> 01:12:48,120 Speaker 2: we think immigration is a net benefit to America? 1128 01:12:48,160 --> 01:12:50,560 Speaker 3: When did that first start showing up? It started changing 1129 01:12:51,040 --> 01:12:54,320 Speaker 3: right in October, because it was like a bridge too far. 1130 01:12:54,439 --> 01:12:57,240 Speaker 3: It was like, this is just stupid. This is silly 1131 01:12:57,280 --> 01:13:01,000 Speaker 3: and stupid. And it's been a steady increase, you know, 1132 01:13:01,080 --> 01:13:03,280 Speaker 3: and you wouldn't believe that if you were 1133 01:13:03,760 --> 01:13:08,960 Speaker 3: immersed in Twitter. But so this country is actually quite 1134 01:13:10,439 --> 01:13:11,400 Speaker 3: pro-immigrant. 1135 01:13:12,600 --> 01:13:15,560 Speaker 2: I know that sounds... but we are a nation of immigrants. 1136 01:13:16,000 --> 01:13:19,280 Speaker 2: Here's another data point that's kind of mind-blowing. 1137 01:13:19,320 --> 01:13:23,000 Speaker 2: We were talking about a different data point: in the 1138 01:13:23,200 --> 01:13:27,080 Speaker 2: entirety of US history. 
Twenty twenty-five looks like 1139 01:13:27,200 --> 01:13:31,400 Speaker 2: the first year where the US population will decrease. 1140 01:13:32,040 --> 01:13:33,840 Speaker 3: So it's a decrease... 1141 01:13:33,360 --> 01:13:38,000 Speaker 2: In legal immigrants, not only illegal immigrants, but legal immigrants. 1142 01:13:38,200 --> 01:13:41,320 Speaker 2: Add that to the deportations and just people staying away. 1143 01:13:41,479 --> 01:13:45,200 Speaker 3: Well, this gets back to the economic picture. 1144 01:13:45,240 --> 01:13:48,600 Speaker 3: So what you've had is a clear, and again we 1145 01:13:49,120 --> 01:13:52,479 Speaker 3: see this in our data from other countries, there's a 1146 01:13:52,560 --> 01:13:57,799 Speaker 3: clear effort to repatriate assets and funds away from the US. 1147 01:13:58,280 --> 01:14:01,800 Speaker 3: There's been a clear effort outside the US. You see 1148 01:14:01,920 --> 01:14:04,719 Speaker 3: this with central banks, but also, of course, hence gold. 1149 01:14:04,840 --> 01:14:10,360 Speaker 3: Gold. Treasuries used to be a safe haven asset. 1150 01:14:11,280 --> 01:14:13,439 Speaker 3: No more. 1151 01:14:13,439 --> 01:14:18,120 Speaker 2: Do you understand how significant that charge is that you're making? 1152 01:14:18,560 --> 01:14:23,560 Speaker 2: You're basically accusing the president of submarining one of 1153 01:14:23,600 --> 01:14:25,639 Speaker 2: the single greatest assets America holds. 1154 01:14:25,680 --> 01:14:29,280 Speaker 3: Well, it is built up to 1155 01:14:29,320 --> 01:14:31,880 Speaker 3: such an extent, right, that the way I think of 1156 01:14:31,960 --> 01:14:35,679 Speaker 3: this is an iceberg that is melting. But it is melting. 1157 01:14:36,200 --> 01:14:40,519 Speaker 3: What we do not see is capital flight. Right. 
1158 01:14:40,560 --> 01:14:43,240 Speaker 2: We saw it for like a week in April, and 1159 01:14:43,240 --> 01:14:44,559 Speaker 2: then it reversed. 1160 01:14:44,560 --> 01:14:46,639 Speaker 3: That was it. So there is no 1161 01:14:46,680 --> 01:14:50,599 Speaker 3: capital flight. There's no money leaving the country. There 1162 01:14:50,600 --> 01:14:52,679 Speaker 3: were fears, and we can track it. If that starts 1163 01:14:52,680 --> 01:14:57,600 Speaker 3: to happen, we'll see it immediately. That's not happening. Repatriation 1164 01:14:57,760 --> 01:15:03,840 Speaker 3: continues to happen, mm hmm, slowly, measured, balanced, a melting iceberg. Right, 1165 01:15:03,880 --> 01:15:07,960 Speaker 3: because there are limits. If you're a Bermuda reinsurer, 1166 01:15:08,040 --> 01:15:10,920 Speaker 3: I mean, you're kind of stuck with treasuries, right. 1167 01:15:10,920 --> 01:15:13,840 Speaker 2: I mean, you could buy some gold to offset it, 1168 01:15:13,920 --> 01:15:16,400 Speaker 2: but you can't sell one hundred billion dollars worth of treasuries. 1169 01:15:16,600 --> 01:15:18,920 Speaker 3: You can't. So it's a melting iceberg. But 1170 01:15:19,120 --> 01:15:22,599 Speaker 3: we can clearly see that the dog that is not barking 1171 01:15:22,920 --> 01:15:27,240 Speaker 3: yet, and maybe never will, is capital flight. What we 1172 01:15:27,360 --> 01:15:30,640 Speaker 3: see politically, domestically, is that actually, and this was 1173 01:15:30,760 --> 01:15:33,160 Speaker 3: validated by a Gallup poll they've been doing for twenty years, 1174 01:15:33,840 --> 01:15:37,240 Speaker 3: which also showed what we had started seeing a 1175 01:15:37,320 --> 01:15:44,719 Speaker 3: lot earlier, where immigration as a thing is actually pretty 1176 01:15:44,800 --> 01:15:46,759 Speaker 3: darn popular in the United States. 1177 01:15:47,160 --> 01:15:51,240 Speaker 2: Yeah, but so is restricting assault rifles. 
And we 1178 01:15:51,280 --> 01:15:54,839 Speaker 2: can't get any change on that. I think seventy percent... 1179 01:15:56,439 --> 01:15:59,040 Speaker 3: I gotta tell you, I gotta tell 1180 01:15:59,080 --> 01:16:02,240 Speaker 3: you, that depends very much on how that question is asked. Well, 1181 01:16:02,280 --> 01:16:04,960 Speaker 3: that's true for all polling. Well, this is the benefit 1182 01:16:05,040 --> 01:16:07,320 Speaker 3: of what we're doing. So when we're doing these 1183 01:16:07,400 --> 01:16:10,439 Speaker 3: semantic, we call them semantic signatures, it has nothing to 1184 01:16:10,520 --> 01:16:13,679 Speaker 3: do with how you're phrasing the question. We're not doing polling. 1185 01:16:14,120 --> 01:16:18,080 Speaker 3: We're seeing the meaning of what 1186 01:16:18,200 --> 01:16:20,000 Speaker 3: people are actually talking about in media. 1187 01:16:20,240 --> 01:16:25,080 Speaker 2: So how does that play out? If people are legitimately 1188 01:16:25,200 --> 01:16:29,000 Speaker 2: saying no, immigration is a good thing for America, is 1189 01:16:29,080 --> 01:16:32,320 Speaker 2: there an impact on population and the economy? Is there 1190 01:16:32,320 --> 01:16:35,120 Speaker 2: an impact on markets? Is there an impact on policy 1191 01:16:35,160 --> 01:16:35,719 Speaker 2: and politics? 1192 01:16:35,760 --> 01:16:38,520 Speaker 3: I think there's an impact on the election, the midterms 1193 01:16:38,600 --> 01:16:39,040 Speaker 3: next year. 1194 01:16:40,439 --> 01:16:45,000 Speaker 2: So we're talking literally twelve months from now. 1195 01:16:46,040 --> 01:16:50,759 Speaker 3: Yeah, because that's how this stuff gets cashed out on the political front. 1196 01:16:52,000 --> 01:16:56,240 Speaker 3: Narratives and opinions, they get cashed out in elections, right. 1197 01:16:56,439 --> 01:16:58,760 Speaker 3: Markets are different. You cash stuff out every day, 1198 01:16:59,120 --> 01:17:03,080 Speaker 3: right, in the market. 
That loop is so rapid with markets, exactly. 1199 01:17:04,240 --> 01:17:07,200 Speaker 3: Politics is a different story. So here it gets cashed out 1200 01:17:07,240 --> 01:17:07,679 Speaker 3: in the election. 1201 01:17:08,080 --> 01:17:11,439 Speaker 2: If you were to ask me before this conversation, what's 1202 01:17:11,479 --> 01:17:15,519 Speaker 2: the most significant impact on the midterm elections? There was 1203 01:17:15,680 --> 01:17:21,439 Speaker 2: just a Gallup poll yesterday: GOP on questions on the economy, 1204 01:17:21,520 --> 01:17:25,040 Speaker 2: they were plus fourteen percent two years ago. They're minus 1205 01:17:25,120 --> 01:17:28,320 Speaker 2: four percent this year. I saw that, and that's an 1206 01:17:28,520 --> 01:17:32,960 Speaker 2: amazing swing. And I'm saying to myself, you know, listen, 1207 01:17:33,080 --> 01:17:36,800 Speaker 2: the out-of-power party usually picks up, you know, 1208 01:17:36,960 --> 01:17:40,679 Speaker 2: ten to fifteen seats. You have the redistricting question, which 1209 01:17:40,760 --> 01:17:44,800 Speaker 2: may blunt that. But if the most important question during 1210 01:17:44,800 --> 01:17:47,719 Speaker 2: the election was on the economy, and that's an eighteen 1211 01:17:47,760 --> 01:17:51,559 Speaker 2: percent swing, this is looking like a pretty substantial shift. 1212 01:17:51,680 --> 01:17:54,439 Speaker 2: What I'm hearing from you is, hey, this isn't just 1213 01:17:54,520 --> 01:17:58,280 Speaker 2: about the economy. So it's the economy, 1214 01:17:58,320 --> 01:17:58,959 Speaker 2: it's immigration. 1215 01:17:59,160 --> 01:17:59,559 Speaker 3: What else? 1216 01:18:00,320 --> 01:18:01,599 Speaker 2: What else are you seeing? That's interesting. 1217 01:18:01,680 --> 01:18:04,040 Speaker 3: That's pretty much it for now. 
There are other aspects, though, 1218 01:18:04,240 --> 01:18:11,400 Speaker 3: where the stories... this is people talking 1219 01:18:11,439 --> 01:18:14,080 Speaker 3: about immigration as the thing. Now, whether that 1220 01:18:14,200 --> 01:18:17,559 Speaker 3: gets translated into a political party position, because I got 1221 01:18:17,640 --> 01:18:22,320 Speaker 3: to tell you, the Democratic Party is enormously... there 1222 01:18:22,320 --> 01:18:25,120 Speaker 3: are no narratives that are being put 1223 01:18:25,200 --> 01:18:29,840 Speaker 3: forward by the Democrats that are powerful 1224 01:18:30,000 --> 01:18:30,800 Speaker 3: or popular. 1225 01:18:30,960 --> 01:18:34,120 Speaker 2: I think, other than the mayoral contest in New York. 1226 01:18:34,439 --> 01:18:37,800 Speaker 3: Yeah, right, that is one. What I'm saying is 1227 01:18:37,840 --> 01:18:41,760 Speaker 3: that many of the policies that 1228 01:18:41,840 --> 01:18:47,680 Speaker 3: are being presented by this administration are unpopular, not just 1229 01:18:47,760 --> 01:18:53,800 Speaker 3: with Democrats but with Republicans, and increasingly so. Immigration being 1230 01:18:53,880 --> 01:18:58,320 Speaker 3: one of them, I think the economy being another one. 1231 01:18:58,640 --> 01:19:03,160 Speaker 2: Tariffs aren't really popular amongst them; people perceive that 1232 01:19:03,240 --> 01:19:06,840 Speaker 2: as a tax increase or a pressure on small business. 1233 01:19:07,400 --> 01:19:10,640 Speaker 3: The government shutdown, right. So both the polls and also in 1234 01:19:10,680 --> 01:19:13,800 Speaker 3: this, it's that because the question, oh, well, 1235 01:19:13,840 --> 01:19:16,080 Speaker 3: the Democrats will get blamed, well, that's not really true. 1236 01:19:16,120 --> 01:19:20,759 Speaker 3: That's not what's happening. Now, how that all paves the way... 
1237 01:19:22,160 --> 01:19:26,519 Speaker 2: You're not a... you're not a Democrat. God, like, 1238 01:19:26,960 --> 01:19:30,160 Speaker 2: I know you and your politics and the stuff you're saying. 1239 01:19:30,960 --> 01:19:33,000 Speaker 2: I could hear people shrugging and saying, well, that Ben 1240 01:19:33,120 --> 01:19:35,880 Speaker 2: Hunt is just a liberal Democrat. I'm like, no, no, no, 1241 01:19:36,040 --> 01:19:39,840 Speaker 2: that's not who Ben is. You're 1242 01:19:39,960 --> 01:19:41,960 Speaker 2: just talking about, here's what I see in the data. 1243 01:19:42,120 --> 01:19:45,800 Speaker 3: I'm observing. I'm observing. And I think that 1244 01:19:45,880 --> 01:19:48,920 Speaker 3: a lot of times, if we are, like me, very 1245 01:19:49,040 --> 01:19:54,680 Speaker 3: online and on Twitter too much, you don't see the 1246 01:19:54,800 --> 01:20:01,280 Speaker 3: broader picture of what is happening on blog posts and 1247 01:20:01,600 --> 01:20:05,760 Speaker 3: local newspapers and everything else. So I think 1248 01:20:06,000 --> 01:20:14,000 Speaker 3: having the ability to read everything and track these stories 1249 01:20:15,240 --> 01:20:19,640 Speaker 3: that, you know, like I say, they 1250 01:20:19,720 --> 01:20:22,120 Speaker 3: wax and wane, but they don't ever go away. 1251 01:20:23,560 --> 01:20:27,240 Speaker 3: It's something we're very excited about, to track 1252 01:20:27,320 --> 01:20:31,400 Speaker 3: the stories of America. I'll give you another one, and 1253 01:20:31,439 --> 01:20:35,760 Speaker 3: this one, actually, I don't know what to do with this. 
1254 01:20:37,160 --> 01:20:40,679 Speaker 3: One of the stories of America is that America has 1255 01:20:43,600 --> 01:20:49,760 Speaker 3: raised the world's wealth, right, the global standard of living, 1256 01:20:49,800 --> 01:20:52,439 Speaker 3: that America has been this 1257 01:20:52,600 --> 01:20:57,000 Speaker 3: powerful force, that capitalism and America have been this 1258 01:20:57,120 --> 01:21:00,720 Speaker 3: powerful force to raise people out of poverty. That's a 1259 01:21:00,800 --> 01:21:05,720 Speaker 3: story that in other periods of time has been out there, 1260 01:21:05,760 --> 01:21:10,120 Speaker 3: has been prominently talked about, very resonant. That story 1261 01:21:10,240 --> 01:21:11,559 Speaker 3: is nonexistent today. 1262 01:21:11,800 --> 01:21:15,599 Speaker 2: Well, you cut things like HIV funding, a few million dollars 1263 01:21:15,840 --> 01:21:20,200 Speaker 2: to inoculate all of Africa from HIV; that's going to 1264 01:21:20,280 --> 01:21:21,160 Speaker 2: have repercussions. 1265 01:21:21,280 --> 01:21:26,240 Speaker 3: Well, it's not just that that has the 1266 01:21:26,400 --> 01:21:30,600 Speaker 3: concrete repercussions that you're describing. What I'm saying is 1267 01:21:30,760 --> 01:21:37,080 Speaker 3: that the story that would support that, it's gone. 1268 01:21:37,960 --> 01:21:39,280 Speaker 2: Is it gone forever? No? 1269 01:21:39,880 --> 01:21:42,799 Speaker 3: No. These things are never gone. These stories, 1270 01:21:42,880 --> 01:21:46,840 Speaker 3: they never die, right. They're waiting for 1271 01:21:47,880 --> 01:21:50,560 Speaker 3: a new force to give them a new version of it. 
1272 01:21:50,880 --> 01:21:53,880 Speaker 2: If the US is not seen as a force for raising 1273 01:21:53,960 --> 01:21:58,000 Speaker 2: people out of poverty around the world, do people see 1274 01:21:58,120 --> 01:21:59,280 Speaker 2: China as that force? 1275 01:22:00,640 --> 01:22:02,240 Speaker 3: I haven't looked at that. We've been looking at the 1276 01:22:02,520 --> 01:22:05,120 Speaker 3: US story. But my personal sense 1277 01:22:05,400 --> 01:22:13,160 Speaker 3: is that we've essentially ceded the field, particularly 1278 01:22:13,280 --> 01:22:17,160 Speaker 3: in South Asia and Africa, to China. 1279 01:22:18,040 --> 01:22:20,640 Speaker 3: I mean, that seems pretty, you know, 1280 01:22:20,840 --> 01:22:21,320 Speaker 3: pretty clear. 1281 01:22:21,439 --> 01:22:23,560 Speaker 2: I'm glad I asked the question about The 1282 01:22:23,680 --> 01:22:27,599 Speaker 2: End of Pax Americana, because I've just found that piece 1283 01:22:27,760 --> 01:22:30,840 Speaker 2: so insightful and so useful, and I could keep you 1284 01:22:31,000 --> 01:22:35,040 Speaker 2: here talking about this stuff for hours. But out of 1285 01:22:35,240 --> 01:22:39,439 Speaker 2: deference and respect for our listeners' time, I'm going to 1286 01:22:39,560 --> 01:22:42,960 Speaker 2: jump to our favorite questions that I ask all 1287 01:22:43,000 --> 01:22:46,040 Speaker 2: of our guests. But before I do, 1288 01:22:47,000 --> 01:22:50,920 Speaker 2: there's a question that I have to pose to you, 1289 01:22:51,760 --> 01:22:56,479 Speaker 2: which is: you get to see things before they 1290 01:22:56,560 --> 01:22:59,800 Speaker 2: sort of bubble up into the mainstream, before they 1291 01:23:00,040 --> 01:23:04,240 Speaker 2: become a well-understood narrative. 
What's a narrative that you're 1292 01:23:04,520 --> 01:23:08,679 Speaker 2: just starting to sniff out that most of the world, 1293 01:23:08,840 --> 01:23:11,519 Speaker 2: or most of the country, or most of Wall Street 1294 01:23:11,600 --> 01:23:14,000 Speaker 2: and finance hasn't seen yet? 1295 01:23:15,920 --> 01:23:21,560 Speaker 3: It's a story that people are talking about since Tricolor 1296 01:23:21,840 --> 01:23:26,639 Speaker 3: and First Brands went under. So it's the story on credit, 1297 01:23:27,520 --> 01:23:28,920 Speaker 3: the story of credit. 1298 01:23:29,360 --> 01:23:33,919 Speaker 2: I mean, First Brands, it sounds like legitimate felonies 1299 01:23:33,960 --> 01:23:35,839 Speaker 2: were happening there, so we'll have to see. 1300 01:23:35,680 --> 01:23:41,320 Speaker 3: And Tricolor as well. But my point is that every 1301 01:23:41,439 --> 01:23:44,880 Speaker 3: day you have a new bank CEO come out and 1302 01:23:44,960 --> 01:23:48,479 Speaker 3: say no, no, everything's fine. So yesterday it was David 1303 01:23:48,560 --> 01:23:52,519 Speaker 3: Solomon at Goldman Sachs saying no, no, it's fine. Today 1304 01:23:52,640 --> 01:23:55,920 Speaker 3: I was hearing the Brookfield guys saying no, no, it's fine. 1305 01:23:57,120 --> 01:24:02,320 Speaker 3: For every story, every interview where the guy says it's fine, 1306 01:24:03,040 --> 01:24:06,840 Speaker 3: you're getting two articles in the FT or the Journal 1307 01:24:06,920 --> 01:24:08,160 Speaker 3: saying it ain't fine. 1308 01:24:09,120 --> 01:24:12,040 Speaker 2: Methinks thou dost protest too much. 1309 01:24:12,000 --> 01:24:15,240 Speaker 3: All of that... all I'm saying is, I am observing. 1310 01:24:15,840 --> 01:24:19,839 Speaker 3: I'm not predicting. I am observing the level of volume: 1311 01:24:20,479 --> 01:24:24,680 Speaker 3: for alternative asset managers, their exposure to private credit, this 1312 01:24:24,800 --> 01:24:28,160 Speaker 3: is problematic. 
Oh, I'm wondering what's going to happen when 1313 01:24:28,200 --> 01:24:29,040 Speaker 3: the music stops. 1314 01:24:29,520 --> 01:24:29,680 Speaker 1: Oh. 1315 01:24:29,760 --> 01:24:33,120 Speaker 3: I think this blows back into the commercial 1316 01:24:33,200 --> 01:24:39,720 Speaker 3: banking system. Those stories, those narratives are higher by an 1317 01:24:39,840 --> 01:24:42,439 Speaker 3: order of magnitude than they've been at any point in 1318 01:24:42,479 --> 01:24:46,439 Speaker 3: the last ten years, at any point including during COVID, 1319 01:24:46,640 --> 01:24:50,000 Speaker 3: when you had similar concerns over, oh my god, 1320 01:24:50,040 --> 01:24:51,040 Speaker 3: private credit. 1321 01:24:51,040 --> 01:24:54,040 Speaker 2: During COVID it was easy to rationalize everything. Hey, listen, we're all frozen. 1322 01:24:54,400 --> 01:24:59,439 Speaker 3: But this is a story that has legs. 1323 01:25:00,280 --> 01:25:03,439 Speaker 3: It is growing in a way that I rarely see. 1324 01:25:04,120 --> 01:25:07,280 Speaker 3: And when a story like this grows, it doesn't just 1325 01:25:07,400 --> 01:25:11,760 Speaker 3: go away. And it's not fixed by Bernanke saying, oh, 1326 01:25:11,920 --> 01:25:15,320 Speaker 3: subprime is contained. That's immediately what I thought of as soon 1327 01:25:15,400 --> 01:25:18,760 Speaker 3: as you... absolutely, it does not get fixed by people 1328 01:25:18,880 --> 01:25:21,080 Speaker 3: saying don't worry, there's no problem. 1329 01:25:21,640 --> 01:25:25,280 Speaker 2: Jim Grant said he was right: it was contained to Earth. 1330 01:25:26,040 --> 01:25:29,200 Speaker 2: The rest of the solar system was fine, which turned 1331 01:25:29,200 --> 01:25:31,640 Speaker 2: out to be a very witty and clever observation. 1332 01:25:31,880 --> 01:25:36,040 Speaker 3: This story, this worry, this concern absolutely has legs. Huh. 
1333 01:25:36,320 --> 01:25:41,800 Speaker 3: And we're seeing no signs of it easing off in 1334 01:25:42,160 --> 01:25:43,519 Speaker 3: financial media and press. 1335 01:25:43,680 --> 01:25:47,640 Speaker 2: So the big question is: is it systemic, or is 1336 01:25:47,720 --> 01:25:48,960 Speaker 2: it the specific names? 1337 01:25:49,320 --> 01:25:52,880 Speaker 3: I don't know the truth. I don't know reality. All 1338 01:25:52,920 --> 01:25:56,679 Speaker 3: I can tell you is we're seeing more of this. 1339 01:25:57,080 --> 01:26:01,200 Speaker 3: We're seeing this in the stories. Really, really interesting. 1340 01:26:01,560 --> 01:26:06,120 Speaker 2: Let's jump to our favorite questions, starting with: who 1341 01:26:06,200 --> 01:26:09,280 Speaker 2: are your mentors who helped shape your career? 1342 01:26:10,080 --> 01:26:14,439 Speaker 3: Well, I mentioned the people, both in 1343 01:26:14,560 --> 01:26:17,880 Speaker 3: undergrad and graduate school, who turned me on to the 1344 01:26:18,000 --> 01:26:22,599 Speaker 3: science part of political science, and in particular in graduate school, 1345 01:26:22,680 --> 01:26:26,840 Speaker 3: Gary King, who runs the whole social science research center 1346 01:26:26,920 --> 01:26:29,559 Speaker 3: up there at Harvard, and has for a long time now. 1347 01:26:30,040 --> 01:26:33,200 Speaker 3: He wrote the original, really the book, on inference. You 1348 01:26:33,280 --> 01:26:34,320 Speaker 3: hear about inference all 1349 01:26:34,240 --> 01:26:35,960 Speaker 2: the time. Cialdini, of course. 1350 01:26:36,560 --> 01:26:40,439 Speaker 3: So the notion of inference, taking large data sets and 1351 01:26:40,560 --> 01:26:41,360 Speaker 3: pulling out... 1352 01:26:41,400 --> 01:26:44,120 Speaker 2: Oh, inference. I thought you said influence. No, no, no, no, 1353 01:26:44,280 --> 01:26:45,440 Speaker 2: inference, inference. 1354 01:26:45,720 --> 01:26:51,360 Speaker 3: Huh. So the science of inference. 
I learned that from Gary, 1355 01:26:51,720 --> 01:26:54,680 Speaker 3: you know, thirty years ago. And it's so interesting to 1356 01:26:54,680 --> 01:26:56,920 Speaker 3: see that come full circle, because that's at the core 1357 01:26:57,160 --> 01:27:00,240 Speaker 3: of, you know, Jensen Huang is always talking about it, 1358 01:27:00,320 --> 01:27:03,439 Speaker 3: the whole notion of AI, the inference spend 1359 01:27:03,479 --> 01:27:06,000 Speaker 3: they talk about. I was there at the beginning 1360 01:27:06,880 --> 01:27:10,760 Speaker 3: for what inference is and why it is 1361 01:27:10,840 --> 01:27:14,640 Speaker 3: so powerful. That was a huge influence on me. 1362 01:27:15,160 --> 01:27:16,280 Speaker 2: Let's talk about books. 1363 01:27:16,680 --> 01:27:17,760 Speaker 3: What are you reading now? 1364 01:27:17,880 --> 01:27:19,120 Speaker 2: What are some of your favorites? 1365 01:27:19,800 --> 01:27:21,240 Speaker 3: I'm a science fiction guy. 1366 01:27:21,320 --> 01:27:23,960 Speaker 2: Okay, I'm really talking to another sci-fi guy. So 1367 01:27:24,120 --> 01:27:27,200 Speaker 2: give us something new and something classic. 1368 01:27:27,479 --> 01:27:31,519 Speaker 3: Well, something classic would be Liu Cixin and The Three-Body 1369 01:27:31,560 --> 01:27:32,320 Speaker 3: Problem. 1370 01:27:33,800 --> 01:27:36,680 Speaker 2: The Netflix show on that was surprisingly watchable. 1371 01:27:36,840 --> 01:27:41,000 Speaker 3: That was excellent. Before that, and you know, I 1372 01:27:41,080 --> 01:27:44,840 Speaker 3: named a company after it, was the Foundation trilogy. 1373 01:27:45,720 --> 01:27:48,320 Speaker 3: Those books hold up less well, honestly. And I thought 1374 01:27:48,400 --> 01:27:52,080 Speaker 3: the series... it had its moments. It was so 1375 01:27:52,200 --> 01:27:52,960 Speaker 3: pretty. So to me... 1376 01:27:53,720 --> 01:27:53,840 Speaker 2: Uh. 
1377 01:27:54,680 --> 01:28:00,439 Speaker 3: Current science fiction, Rebecca Kuang, she goes by R. F. 1378 01:28:01,120 --> 01:28:06,519 Speaker 3: Kuang, and she wrote a book called Babel, 1379 01:28:06,520 --> 01:28:09,920 Speaker 3: like Tower of Babel, 1380 01:28:10,320 --> 01:28:12,439 Speaker 3: B-A-B-E-L. And she's got a new 1381 01:28:12,479 --> 01:28:15,840 Speaker 3: one out, Katabasis, I think it is. But science fiction 1382 01:28:16,200 --> 01:28:19,240 Speaker 3: dealing with language and linguistics. I love that stuff. 1383 01:28:19,640 --> 01:28:22,439 Speaker 2: That sounds like that's right in your sweet spot. 1384 01:28:23,120 --> 01:28:26,479 Speaker 2: Let's talk about streaming. What are you watching or listening to today? 1385 01:28:26,840 --> 01:28:28,599 Speaker 2: Podcasts or Netflix or whatever. 1386 01:28:28,840 --> 01:28:31,120 Speaker 3: So honestly, I don't listen to any podcasts. Isn't that 1387 01:28:31,240 --> 01:28:31,880 Speaker 3: terrible to admit? 1388 01:28:31,960 --> 01:28:34,080 Speaker 2: Well, I will say this, if you 1389 01:28:34,200 --> 01:28:38,599 Speaker 2: host a podcast, you're either preparing for a podcast, doing 1390 01:28:38,640 --> 01:28:41,800 Speaker 2: a podcast, or editing a podcast, right? So it's like, 1391 01:28:42,000 --> 01:28:43,360 Speaker 2: all right, that's three of them. 1392 01:28:43,960 --> 01:28:44,720 Speaker 3: It is, right.
1393 01:28:44,840 --> 01:28:47,439 Speaker 2: Every now and then I'll catch something because I want 1394 01:28:47,479 --> 01:28:50,600 Speaker 2: to listen to a specific guest. Like, 1395 01:28:51,160 --> 01:28:53,800 Speaker 2: I will not listen to any podcasts that you are 1396 01:28:53,880 --> 01:28:56,320 Speaker 2: on before we do our podcast, because I don't want 1397 01:28:56,360 --> 01:28:59,960 Speaker 2: to steal anybody else's narrative or questions or line 1398 01:29:00,120 --> 01:29:04,680 Speaker 2: of thinking. So that's purposeful. But every now and then. 1399 01:29:04,840 --> 01:29:08,559 Speaker 2: So the things I listen to are 1400 01:29:08,720 --> 01:29:14,400 Speaker 2: like John Pizzarelli's Radio Deluxe, which is sort of like 1401 01:29:14,560 --> 01:29:17,560 Speaker 2: a podcast slash music series. 1402 01:29:17,320 --> 01:29:19,680 Speaker 3: Something a little different, exactly right. 1403 01:29:19,920 --> 01:29:21,160 Speaker 2: What about streaming? 1404 01:29:21,200 --> 01:29:23,400 Speaker 3: What do you... well, you and I are both big 1405 01:29:23,439 --> 01:29:26,560 Speaker 3: Godfather fans, so we love all the mobster stuff. Have you 1406 01:29:26,640 --> 01:29:27,720 Speaker 3: seen MobLand yet? 1407 01:29:27,880 --> 01:29:28,720 Speaker 2: No. 1408 01:29:28,760 --> 01:29:33,000 Speaker 3: All right, worth your time. Okay, Pierce Brosnan eats up every scene. 1409 01:29:33,320 --> 01:29:35,560 Speaker 3: Tom Hardy is awfully good. I'd watch him in 1410 01:29:35,600 --> 01:29:38,320 Speaker 3: anything. I don't know if you ever saw 1411 01:29:38,400 --> 01:29:39,920 Speaker 3: Peaky Blinders. 1412 01:29:39,600 --> 01:29:44,960 Speaker 2: Right, I kind of fell off it. So my issue is I 1413 01:29:45,120 --> 01:29:47,480 Speaker 2: have to watch something that my wife tolerates. Like, I have friends. 1414 01:29:47,600 --> 01:29:50,200 Speaker 3: That's a struggle.
1415 01:29:50,200 --> 01:29:53,040 Speaker 2: Who, like, he goes into this room, she goes into 1416 01:29:53,080 --> 01:29:56,200 Speaker 2: that room. They don't watch anything together. Like, I don't 1417 01:29:56,240 --> 01:29:59,599 Speaker 2: know how that works. Like, she will watch some stuff 1418 01:29:59,640 --> 01:30:03,200 Speaker 2: without me. I will watch some stuff without her, right, but eighty 1419 01:30:03,320 --> 01:30:06,920 Speaker 2: percent of what we watch is together. So I have fallen down 1420 01:30:07,280 --> 01:30:12,599 Speaker 2: the British Upstairs, Downstairs, Gilded Age thing. So we watched 1421 01:30:12,640 --> 01:30:15,360 Speaker 2: The Crown, we watched The Gilded Age. We finally, let 1422 01:30:15,400 --> 01:30:18,519 Speaker 2: me give you one, we went back to Downton Abbey, 1423 01:30:18,560 --> 01:30:20,400 Speaker 2: which I missed when it first rolled out. 1424 01:30:20,600 --> 01:30:23,360 Speaker 3: Check it out. So my guilty pleasure, and I did watch 1425 01:30:23,400 --> 01:30:24,920 Speaker 3: this with my wife, is The Diplomat. 1426 01:30:25,479 --> 01:30:28,599 Speaker 2: The new season just dropped, and that's 1427 01:30:28,680 --> 01:30:29,799 Speaker 2: teed up for this weekend. 1428 01:30:29,840 --> 01:30:30,920 Speaker 3: I'm looking forward to that. 1429 01:30:31,280 --> 01:30:34,720 Speaker 2: Excellent. Was that season two or season three? 1430 01:30:34,800 --> 01:30:35,960 Speaker 3: The first season was great. 1431 01:30:36,640 --> 01:30:39,040 Speaker 2: Yeah, and I'm going to give you one, if you like that. 1432 01:30:40,920 --> 01:30:43,880 Speaker 2: So my wife finds these really interesting. She got me 1433 01:30:43,960 --> 01:30:48,800 Speaker 2: into Killing Eve, which was a little more spy fare than The Diplomat. 1434 01:30:50,040 --> 01:30:52,560 Speaker 2: And if you get a chance, watch Slow Horses. 1435 01:30:52,400 --> 01:30:53,720 Speaker 3: Oh, that's the one I've got to see.
1436 01:30:53,840 --> 01:30:57,600 Speaker 2: Yeah, it's really... so the first season is great, and 1437 01:30:57,720 --> 01:31:00,840 Speaker 2: people have told me about the most recent season, some 1438 01:31:01,000 --> 01:31:04,800 Speaker 2: people will say they didn't like it, but I know people who 1439 01:31:04,920 --> 01:31:08,240 Speaker 2: loved it all the way through. But it's really, 1440 01:31:09,840 --> 01:31:12,000 Speaker 2: it's really an interesting, well-told story. 1441 01:31:12,080 --> 01:31:13,800 Speaker 3: Beautiful cast, fantastic. 1442 01:31:13,920 --> 01:31:19,000 Speaker 2: Yeah, you'll love that. Our final two questions. Yeah, 1443 01:31:19,200 --> 01:31:21,880 Speaker 2: what sort of advice would you give a recent college 1444 01:31:21,920 --> 01:31:28,400 Speaker 2: grad interested in either investing or working with large language models, 1445 01:31:29,000 --> 01:31:33,360 Speaker 2: working with narrative analysis and artificial intelligence? 1446 01:31:35,800 --> 01:31:39,880 Speaker 3: Don't. I say that tongue in cheek. 1447 01:31:40,000 --> 01:31:42,680 Speaker 2: You had a lot of fits and starts going 1448 01:31:42,760 --> 01:31:46,240 Speaker 2: back seven, eight years, for sure. 1449 01:31:50,200 --> 01:31:58,080 Speaker 3: I'll tell you what. I think something important you 1450 01:31:58,240 --> 01:32:00,360 Speaker 3: can 1451 01:32:00,360 --> 01:32:04,080 Speaker 3: accomplish in academia, either grad school or whatever it is, 1452 01:32:04,320 --> 01:32:08,599 Speaker 3: is you build your intellectual capital. And so I think 1453 01:32:08,600 --> 01:32:12,479 Speaker 3: a lot of times when you get out of college 1454 01:32:13,320 --> 01:32:15,439 Speaker 3: you say, okay, I'm just ready to kind of live 1455 01:32:15,520 --> 01:32:18,400 Speaker 3: my life and start something, and you haven't built that 1456 01:32:18,520 --> 01:32:23,880 Speaker 3: intellectual capital yet.
The issue, though, is that once you enter, 1457 01:32:24,800 --> 01:32:28,679 Speaker 3: particularly, the investment world, where you're responsible for managing 1458 01:32:28,720 --> 01:32:34,120 Speaker 3: other people's money. Brother, that's it, right? I mean, 1459 01:32:34,360 --> 01:32:38,680 Speaker 3: there's nothing but that. That's got to be your 1460 01:32:39,680 --> 01:32:45,160 Speaker 3: total focus. You are spending your intellectual capital. You're 1461 01:32:45,240 --> 01:32:49,599 Speaker 3: not gaining intellectual capital once you take on a role 1462 01:32:49,760 --> 01:32:54,320 Speaker 3: like that. So my first advice is find a 1463 01:32:54,439 --> 01:32:58,599 Speaker 3: path, and that's often in academia, where 1464 01:32:58,600 --> 01:33:02,000 Speaker 3: you're building intellectual capital, because once you leave that environment, 1465 01:33:02,840 --> 01:33:05,559 Speaker 3: then you're spending it down, you're spending it down, 1466 01:33:05,640 --> 01:33:06,439 Speaker 3: you're spending it down. 1467 01:33:06,840 --> 01:33:09,439 Speaker 2: Our final question: what do you know about the world 1468 01:33:09,720 --> 01:33:16,080 Speaker 2: of, fill in the blank, investing, data analytics, narrative storytelling, today 1469 01:33:16,520 --> 01:33:19,000 Speaker 2: that would have been helpful twenty-five to thirty years ago? 1470 01:33:22,200 --> 01:33:31,280 Speaker 3: In investing, I wish I had understood the role of... 1471 01:33:34,400 --> 01:33:36,920 Speaker 3: let me step back a second. I think that whether 1472 01:33:36,960 --> 01:33:39,919 Speaker 3: you're talking about data or whether you're talking about investing, 1473 01:33:42,600 --> 01:33:44,080 Speaker 3: and I'll speak for myself, but I think there are 1474 01:33:44,080 --> 01:33:46,240 Speaker 3: a lot of people like me.
We think there's an 1475 01:33:46,360 --> 01:33:50,160 Speaker 3: answer with a capital A, huh, right there in the 1476 01:33:50,640 --> 01:33:53,479 Speaker 3: numbers, and if you just look hard enough, 1477 01:33:53,960 --> 01:33:57,920 Speaker 3: and if you work hard enough, you'll find that answer. Right? 1478 01:33:58,000 --> 01:34:00,480 Speaker 3: You'll find the secret formula. 1479 01:34:00,960 --> 01:34:02,679 Speaker 2: And you have learned since then? 1480 01:34:03,120 --> 01:34:07,040 Speaker 3: Ain't no such thing, right? Now, there is magic and 1481 01:34:07,120 --> 01:34:10,280 Speaker 3: there are patterns, but it's not in the 1482 01:34:10,400 --> 01:34:14,880 Speaker 3: structured data. It's not in the numbers. It's actually in 1483 01:34:15,000 --> 01:34:19,320 Speaker 3: the error. It's in the probabilities. It's in what we're calling, 1484 01:34:19,600 --> 01:34:24,839 Speaker 3: you know, the stochastic element, right, the role of chance, 1485 01:34:25,080 --> 01:34:29,000 Speaker 3: and understanding that there are patterns and there's real magic 1486 01:34:29,160 --> 01:34:33,880 Speaker 3: in understanding that. I didn't get that when I was 1487 01:34:34,000 --> 01:34:38,519 Speaker 3: either starting with data analysis or with investing. I was 1488 01:34:38,600 --> 01:34:41,759 Speaker 3: looking for the answer with the capital A as opposed 1489 01:34:41,800 --> 01:34:43,360 Speaker 3: to a process with a capital P. 1490 01:34:43,920 --> 01:34:48,840 Speaker 2: I love that answer. So my exercise in confirmation 1491 01:34:49,000 --> 01:34:54,200 Speaker 2: bias is, I love the fact that, looking back, 1492 01:34:54,520 --> 01:34:57,680 Speaker 2: we all have the benefit of hindsight. Yeah, and unfortunately 1493 01:34:58,360 --> 01:35:00,760 Speaker 2: that can be a bias to certain people. But 1494 01:35:00,880 --> 01:35:04,040 Speaker 2: when you're looking back at it, you know how it happened.
1495 01:35:04,800 --> 01:35:07,360 Speaker 2: At the moment, when you don't know what the outcome is, 1496 01:35:08,200 --> 01:35:13,599 Speaker 2: thinking about it probabilistically is a much healthier approach than saying, 1497 01:35:14,160 --> 01:35:16,479 Speaker 2: here's a binary, up or down, yes or no, and 1498 01:35:16,560 --> 01:35:20,200 Speaker 2: I'm either right or wrong. And I see people struggle 1499 01:35:20,240 --> 01:35:22,000 Speaker 2: with that constantly, constantly. 1500 01:35:22,760 --> 01:35:25,840 Speaker 3: When I first started investing, again late, I got two 1501 01:35:25,880 --> 01:35:30,760 Speaker 3: pieces of good advice, right. One was never go all in, right, 1502 01:35:31,360 --> 01:35:35,240 Speaker 3: which is really interesting, because this 1503 01:35:35,479 --> 01:35:39,160 Speaker 3: business of investing, it's a wonderful life. We're solving puzzles, 1504 01:35:39,280 --> 01:35:41,800 Speaker 3: we meet interesting people, we get to have these sorts 1505 01:35:41,800 --> 01:35:45,400 Speaker 3: of conversations. It's a career for a lifetime. 1506 01:35:46,160 --> 01:35:51,200 Speaker 3: And the two pieces of advice early were never go 1507 01:35:51,360 --> 01:35:58,439 Speaker 3: all in, and also, reputation is absolutely the most important thing. 1508 01:35:59,240 --> 01:36:02,640 Speaker 3: And again, especially when you're young, you think, oh, well, 1509 01:36:02,720 --> 01:36:05,000 Speaker 3: that's not the most important thing. You know, being right's 1510 01:36:05,000 --> 01:36:09,760 Speaker 3: the most important thing. It ain't. It's playing the long 1511 01:36:09,880 --> 01:36:13,040 Speaker 3: game here. It's never go all 1512 01:36:13,120 --> 01:36:16,439 Speaker 3: in, and the reputation, never risk it. 1513 01:36:17,720 --> 01:36:21,160 Speaker 2: By never go all in, you mean never put everything at 1514 01:36:21,320 --> 01:36:23,360 Speaker 2: risk, so that if it doesn't work out.
1515 01:36:23,280 --> 01:36:26,160 Speaker 3: You're out of the game. You stay in the game. 1516 01:36:26,360 --> 01:36:29,120 Speaker 2: Gerald Loeb's book, The Battle for Investment Survival. 1517 01:36:29,360 --> 01:36:32,800 Speaker 3: Stay in the game. Avoid that risk of ruin. 1518 01:36:33,360 --> 01:36:37,639 Speaker 3: Don't ever risk ruin. That makes sense, but 1519 01:36:37,960 --> 01:36:39,280 Speaker 3: people don't think in those terms. 1520 01:36:39,360 --> 01:36:40,360 Speaker 2: But that's really it. 1521 01:36:40,520 --> 01:36:43,639 Speaker 3: All in means risk of ruin. It's the risk of ruin. 1522 01:36:46,760 --> 01:36:49,280 Speaker 3: And it also connects with the reputation, because once... 1523 01:36:49,120 --> 01:36:52,639 Speaker 2: Once you blow up, it's tough to go back. 1524 01:36:53,280 --> 01:36:54,400 Speaker 2: And you start. 1525 01:36:54,280 --> 01:36:57,519 Speaker 3: Saying, oh, I can fix it by taking this shortcut 1526 01:36:57,680 --> 01:37:00,680 Speaker 3: or doing this other thing. And once you do that, 1527 01:37:01,080 --> 01:37:04,840 Speaker 3: it never ends well. All right, and you tell yourself, oh, 1528 01:37:05,040 --> 01:37:06,960 Speaker 3: just this one time. It's never just this one time. 1529 01:37:07,360 --> 01:37:11,680 Speaker 2: Ben, this has been absolutely delightful. I'm so glad we've 1530 01:37:11,760 --> 01:37:17,439 Speaker 2: finally gotten around to doing this. I'm just entranced 1531 01:37:17,520 --> 01:37:21,720 Speaker 2: by your thought process. Like, I 1532 01:37:21,880 --> 01:37:25,040 Speaker 2: love reading stuff of yours that I totally disagree with, 1533 01:37:25,760 --> 01:37:28,720 Speaker 2: because it forces me to say, well, he's 1534 01:37:28,840 --> 01:37:32,280 Speaker 2: not just making this up.
I know how your brain works, 1535 01:37:32,720 --> 01:37:34,920 Speaker 2: and it's like, all right, if Ben is saying this, 1536 01:37:35,200 --> 01:37:38,759 Speaker 2: then I'm going to make this my worst case, because 1537 01:37:38,800 --> 01:37:42,839 Speaker 2: it's easy to dismiss it. Everybody is out seeking confirming 1538 01:37:42,960 --> 01:37:46,840 Speaker 2: information. And rather than being dismissive of it, all right, 1539 01:37:47,040 --> 01:37:50,880 Speaker 2: you claim to think probabilistically, where does this fit into 1540 01:37:50,960 --> 01:37:53,680 Speaker 2: the range of probabilities? And once you start thinking in 1541 01:37:53,720 --> 01:37:56,519 Speaker 2: those terms, it's like, oh, so the worst case scenario 1542 01:37:56,800 --> 01:37:59,880 Speaker 2: is worse than my worst case scenario. I've got to 1543 01:38:00,120 --> 01:38:03,559 Speaker 2: move the bottom of my range down further, because if 1544 01:38:03,600 --> 01:38:06,400 Speaker 2: this goes off the rails, this is really bad. That's 1545 01:38:06,439 --> 01:38:10,200 Speaker 2: what your past pieces did with me, and it 1546 01:38:10,360 --> 01:38:14,719 Speaker 2: really helped me figure out how to think, especially 1547 01:38:14,800 --> 01:38:18,759 Speaker 2: in that week between April 2nd and 9th when everyone 1548 01:38:18,880 --> 01:38:21,400 Speaker 2: was losing their minds. It's like, oh no, they're 1549 01:38:21,439 --> 01:38:24,519 Speaker 2: losing their minds because, hey, this could happen, but maybe 1550 01:38:24,960 --> 01:38:26,080 Speaker 2: something good comes out of it. 1551 01:38:26,400 --> 01:38:28,919 Speaker 3: Trying to avoid tunnel vision, I think, is so important 1552 01:38:29,000 --> 01:38:31,320 Speaker 3: in our business or any business, but especially. 1553 01:38:31,320 --> 01:38:35,519 Speaker 2: In our business in particular.
Well, thank you for being 1554 01:38:35,600 --> 01:38:38,280 Speaker 2: so generous with your time. We have been speaking with 1555 01:38:38,400 --> 01:38:42,840 Speaker 2: Ben Hunt. He is the co-founder and president of Prescient, 1556 01:38:42,920 --> 01:38:46,720 Speaker 2: and you can find his writing at Epsilon Theory. If 1557 01:38:46,800 --> 01:38:49,519 Speaker 2: you enjoyed this conversation, well, check out any of the 1558 01:38:49,640 --> 01:38:52,799 Speaker 2: five hundred and ninety-three we've done over the past 1559 01:38:53,320 --> 01:38:58,920 Speaker 2: eleven and a half years. You can find those at Bloomberg, iTunes, Spotify, 1560 01:38:59,560 --> 01:39:03,040 Speaker 2: YouTube, wherever you find your favorite podcasts. Be 1561 01:39:03,160 --> 01:39:06,120 Speaker 2: sure and check out my new book, How Not to 1562 01:39:06,280 --> 01:39:10,280 Speaker 2: Invest: The Ideas, Numbers, and Behaviors That Destroy Wealth, and 1563 01:39:10,400 --> 01:39:13,639 Speaker 2: How to Avoid Them, at your favorite bookseller. I would 1564 01:39:13,680 --> 01:39:16,320 Speaker 2: be remiss if I did not thank the crack staff that 1565 01:39:16,520 --> 01:39:21,040 Speaker 2: helps put these conversations together each week. Alexis Noriega is 1566 01:39:21,160 --> 01:39:26,080 Speaker 2: my video producer. Sean Russo is my researcher. Anna Luke 1567 01:39:26,200 --> 01:39:30,439 Speaker 2: is my podcast producer. Sage Bauman is the head of 1568 01:39:30,520 --> 01:39:35,160 Speaker 2: podcasts here at Bloomberg. I'm Barry Ritholtz. You've been listening 1569 01:39:35,240 --> 01:39:38,280 Speaker 2: to Masters in Business on Bloomberg Radio.