1 00:00:02,520 --> 00:00:07,040 Speaker 1: Bloomberg Audio Studios, podcasts, radio, news. 2 00:00:07,840 --> 00:00:11,400 Speaker 2: Let's go back to the Bloomberg Invest conference for a conversation 3 00:00:11,520 --> 00:00:15,200 Speaker 2: here with the CEO of Bridgewater Associates, Nir Bar Dea, speaking 4 00:00:15,280 --> 00:00:16,880 Speaker 2: with Bloomberg's Sonali Basak. 5 00:00:17,000 --> 00:00:20,040 Speaker 1: You have to embrace that change in the macro paradigm 6 00:00:20,079 --> 00:00:22,280 Speaker 1: and prep yourself for a different reality. I'll say two 7 00:00:22,560 --> 00:00:24,159 Speaker 1: other quick things and then kind of touch on how 8 00:00:24,200 --> 00:00:27,160 Speaker 1: Bridgewater has aligned itself for it. The second thing that has 9 00:00:27,200 --> 00:00:30,480 Speaker 1: happened during that time is that while wealth got distributed, 10 00:00:31,040 --> 00:00:34,920 Speaker 1: I talked about Asia and China, the Middle East, portfolios 11 00:00:35,040 --> 00:00:36,839 Speaker 1: actually got a lot more concentrated. If you look at 12 00:00:36,880 --> 00:00:39,680 Speaker 1: our institutional investors, they have more than doubled their allocations 13 00:00:39,680 --> 00:00:43,520 Speaker 1: to private assets, and even if you look at the liquid 14 00:00:43,560 --> 00:00:45,919 Speaker 1: part of the portfolio, it's unbelievably concentrated. 15 00:00:45,920 --> 00:00:46,360 Speaker 3: In the US. 16 00:00:46,360 --> 00:00:47,600 Speaker 1: I heard the question kind of at the end of 17 00:00:47,600 --> 00:00:50,480 Speaker 1: the last interview of: is the US still a great place 18 00:00:50,479 --> 00:00:52,280 Speaker 1: to invest? Well, a lot of the investment has gone 19 00:00:52,320 --> 00:00:55,400 Speaker 1: into US equities. Seventy cents of every dollar in equities has 20 00:00:55,400 --> 00:00:57,280 Speaker 1: gone to US equities, and 21 00:00:57,200 --> 00:00:59,800 Speaker 3: that means that they are incredibly 22 00:00:59,040 --> 00:01:01,960 Speaker 1: vulnerable in this new economic paradigm. And the third thing, 23 00:01:02,000 --> 00:01:06,360 Speaker 1: which I want to talk about more a little bit later, 24 00:01:06,440 --> 00:01:09,000 Speaker 1: is all this is happening when we are facing a 25 00:01:09,040 --> 00:01:12,319 Speaker 1: once-in-a-generation technological disruption. We've been talking about 26 00:01:12,360 --> 00:01:15,840 Speaker 1: AI for a long time, but here the intelligence is 27 00:01:15,959 --> 00:01:20,240 Speaker 1: here, and the macroeconomic impacts of that are just trickling in. 28 00:01:20,640 --> 00:01:22,920 Speaker 1: So capex this year will double last year's, and it 29 00:01:22,959 --> 00:01:25,320 Speaker 1: starts to move the needle. But the questions around productivity, 30 00:01:25,880 --> 00:01:30,280 Speaker 1: and therefore employment and social implications, are still outstanding. 31 00:01:30,959 --> 00:01:33,720 Speaker 1: Bridgewater's playbook has been looking at this for a long 32 00:01:33,760 --> 00:01:36,800 Speaker 1: time and saying, how do we position ourselves across these 33 00:01:36,840 --> 00:01:39,679 Speaker 1: three things in order to be there for our clients 34 00:01:39,680 --> 00:01:40,440 Speaker 1: when they need us most? 35 00:01:40,480 --> 00:01:41,720 Speaker 3: And we can talk about that more.
36 00:01:41,680 --> 00:01:44,560 Speaker 2: Yeah, let's start with the last piece of that, because 37 00:01:44,959 --> 00:01:47,960 Speaker 2: on a day like today, maybe everyone's thinking about the short term, 38 00:01:48,000 --> 00:01:50,560 Speaker 2: but the long-term trend has been thinking about what 39 00:01:50,640 --> 00:01:51,360 Speaker 2: AI would do. 40 00:01:51,840 --> 00:01:54,080 Speaker 4: What is the good, the bad, and the ugly of 41 00:01:54,160 --> 00:01:55,280 Speaker 4: it all? 42 00:01:55,440 --> 00:01:59,680 Speaker 1: So, man, there are so many words being said 43 00:01:59,680 --> 00:02:03,200 Speaker 1: about this topic, and I'm going to add some words on 44 00:02:03,200 --> 00:02:04,600 Speaker 1: top of that, but I'll kind of 45 00:02:04,560 --> 00:02:06,600 Speaker 3: give you our framework to think about it. 46 00:02:06,760 --> 00:02:10,600 Speaker 1: We've been forever thinking about the best combinations of humans 47 00:02:10,600 --> 00:02:12,480 Speaker 1: and technology, because if you think about our mission, our 48 00:02:12,520 --> 00:02:14,920 Speaker 1: mission is to understand how the world works through combining 49 00:02:15,000 --> 00:02:18,080 Speaker 1: humans and technology, because no one human mind can wrap 50 00:02:18,120 --> 00:02:22,560 Speaker 1: their head around everything in the world, and that means 51 00:02:22,560 --> 00:02:26,920 Speaker 1: that we've always searched for how technology can replace what 52 00:02:27,000 --> 00:02:29,040 Speaker 1: humans are doing, so humans can do what technology can't do. 53 00:02:29,800 --> 00:02:31,800 Speaker 1: And what I can tell you is that I think 54 00:02:31,840 --> 00:02:35,360 Speaker 1: it's a fool's errand to try to be precise about 55 00:02:35,400 --> 00:02:36,919 Speaker 1: how this is going to play out. I'll give you 56 00:02:36,960 --> 00:02:40,280 Speaker 1: two examples of that. As I said, the technology is here. 57 00:02:41,440 --> 00:02:43,519 Speaker 1: You know, one school is saying we are headed to 58 00:02:43,840 --> 00:02:48,360 Speaker 1: full agentic AI, which means we will sit back 59 00:02:48,440 --> 00:02:51,560 Speaker 1: and everything's going to be replaced by AI. That could happen. 60 00:02:52,360 --> 00:02:54,880 Speaker 1: Others are saying this is going to be a general 61 00:02:54,880 --> 00:02:57,800 Speaker 1: purpose technology, which is still very impactful, but it's wildly 62 00:02:57,840 --> 00:03:03,120 Speaker 1: different than fully agentic AI, and people can be very 63 00:03:03,120 --> 00:03:04,720 Speaker 1: specific about where they think that's going to land. 64 00:03:04,720 --> 00:03:05,800 Speaker 3: I don't think anybody knows. 65 00:03:06,320 --> 00:03:08,000 Speaker 1: And the most important thing is to prep yourself for a 66 00:03:08,000 --> 00:03:08,799 Speaker 1: wide range of outcomes. 67 00:03:08,800 --> 00:03:10,160 Speaker 3: I'll give you another example, which I think is 68 00:03:10,160 --> 00:03:12,280 Speaker 1: really important to kind of process with regards to 69 00:03:12,240 --> 00:03:13,720 Speaker 3: the macroeconomic paradigm shift 70 00:03:13,639 --> 00:03:17,120 Speaker 1: thing, and that is that, you know, the optimists say, yes, 71 00:03:17,200 --> 00:03:21,480 Speaker 1: we are in a modern mercantilist world.
It's the opposite 72 00:03:21,520 --> 00:03:24,760 Speaker 1: of globalization, a world where we have less efficiency because 73 00:03:24,800 --> 00:03:27,960 Speaker 1: governments are trying to avoid the vulnerabilities of globalization. 74 00:03:28,400 --> 00:03:31,960 Speaker 3: The optimists say, well, we are headed to a productivity miracle. 75 00:03:32,760 --> 00:03:35,760 Speaker 1: It's going to be great because these technologies are going 76 00:03:35,840 --> 00:03:39,600 Speaker 1: to more than offset whatever we lose from the 77 00:03:39,880 --> 00:03:40,960 Speaker 1: less collaborative world. 78 00:03:41,240 --> 00:03:43,200 Speaker 3: Okay, that could happen. It could totally happen. 79 00:03:44,000 --> 00:03:46,440 Speaker 1: But the pessimists say, hey, the reason we're here 80 00:03:46,440 --> 00:03:48,600 Speaker 1: in the first place is because there were massive gains in 81 00:03:48,640 --> 00:03:52,040 Speaker 1: this macroeconomic paradigm that were distributed very unequally. 82 00:03:52,040 --> 00:03:53,320 Speaker 3: We had big winners and losers. 83 00:03:53,720 --> 00:03:56,440 Speaker 1: Well, the pessimists say AI is going to exacerbate 84 00:03:56,520 --> 00:03:59,520 Speaker 1: that tremendously. We'll have massive winners and losers, and it's 85 00:03:59,520 --> 00:04:01,880 Speaker 1: going to show up in society. How those 86 00:04:01,880 --> 00:04:04,920 Speaker 1: things net out has such a wide cone, and I 87 00:04:04,920 --> 00:04:07,080 Speaker 1: think, again, it's a fool's errand to kind of try to 88 00:04:07,080 --> 00:04:09,880 Speaker 1: predict exactly what's going to happen. So my advice, and 89 00:04:09,920 --> 00:04:13,000 Speaker 1: what we do at Bridgewater: we try to be practitioners. 90 00:04:13,760 --> 00:04:18,240 Speaker 1: So get your hands dirty. Understand the technology deeply, and 91 00:04:18,279 --> 00:04:20,680 Speaker 1: then prep yourself for a wide range of outcomes of how 92 00:04:20,760 --> 00:04:21,479 Speaker 1: this could play out. 93 00:04:22,400 --> 00:04:26,200 Speaker 2: Speaking of getting your hands dirty, Bridgewater has been using 94 00:04:26,240 --> 00:04:29,520 Speaker 2: AI for its own processes. Late last year, you launched 95 00:04:29,520 --> 00:04:32,279 Speaker 2: a fund over at Bridgewater, starting with two billion dollars, 96 00:04:32,520 --> 00:04:36,640 Speaker 2: that uses machine learning as the primary basis for decision making 97 00:04:37,000 --> 00:04:38,800 Speaker 2: and is built on proprietary technology. 98 00:04:39,200 --> 00:04:42,240 Speaker 4: What have you learned from this in terms 99 00:04:41,960 --> 00:04:46,080 Speaker 2: of how machine-learning-based investing is different from human- 100 00:04:46,200 --> 00:04:47,120 Speaker 2: led investing? 101 00:04:48,080 --> 00:04:50,720 Speaker 3: So I'd say I feel very lucky, 102 00:04:53,640 --> 00:04:56,000 Speaker 1: because of Bridgewater's DNA as we're coming to this 103 00:04:56,040 --> 00:04:59,040 Speaker 1: inflection point. As I said earlier, from the eighties, the idea 104 00:04:59,080 --> 00:05:01,600 Speaker 1: that no human brain can possibly get their head 105 00:05:01,640 --> 00:05:02,520 Speaker 1: around everything 106 00:05:02,279 --> 00:05:03,000 Speaker 3: in the world 107 00:05:03,400 --> 00:05:07,240 Speaker 1: meant that we always tried to replace humans with machines. 108 00:05:07,800 --> 00:05:11,000 Speaker 1: So we are not moving from a human-led
109 00:05:10,880 --> 00:05:13,680 Speaker 3: investment process to a machine-led investment process. We are moving: 110 00:05:14,240 --> 00:05:17,320 Speaker 1: we were using scientific calculators, and then we were using 111 00:05:17,279 --> 00:05:19,320 Speaker 1: spreadsheets, and then we were using the most 112 00:05:19,320 --> 00:05:20,520 Speaker 1: complicated expert systems — 113 00:05:20,560 --> 00:05:22,040 Speaker 3: this is AI before AI — 114 00:05:23,680 --> 00:05:27,160 Speaker 1: to make investment decisions in a systematic, fundamental way. So 115 00:05:27,240 --> 00:05:30,880 Speaker 1: for us, this is another step along that arc, and 116 00:05:30,920 --> 00:05:32,880 Speaker 1: what we have learned through this process with what we 117 00:05:32,960 --> 00:05:35,480 Speaker 1: call AIA — this is the Artificial 118 00:05:35,080 --> 00:05:36,840 Speaker 3: Investment Associate, 119 00:05:37,160 --> 00:05:38,880 Speaker 1: that's the strategy that we rolled out and began allocating 120 00:05:38,880 --> 00:05:40,280 Speaker 1: to last year — is that it does the same thing 121 00:05:40,360 --> 00:05:43,440 Speaker 1: our people do in a machine-learning-first way. We 122 00:05:43,640 --> 00:05:48,200 Speaker 1: learned through scientific breakthroughs within Bridgewater that you can reason, 123 00:05:49,040 --> 00:05:52,240 Speaker 1: you can generate alpha, in a machine-learning-first way. 124 00:05:52,560 --> 00:05:55,640 Speaker 1: That strategy competes with our expert system and humans. It 125 00:05:55,800 --> 00:05:59,599 Speaker 1: generates unique alpha that is uncorrelated with what our humans do. 126 00:06:00,200 --> 00:06:01,800 Speaker 1: That's been running now for a year and a half, 127 00:06:01,880 --> 00:06:05,840 Speaker 1: and it's comparable. And that's mind-blowing in the space 128 00:06:05,880 --> 00:06:08,480 Speaker 1: of possibility of what can happen. And so one last thing: 129 00:06:08,920 --> 00:06:11,640 Speaker 1: beyond the numbers that are happening this year or in 130 00:06:11,680 --> 00:06:14,000 Speaker 1: any given short period of time, the most important thing 131 00:06:14,120 --> 00:06:18,080 Speaker 1: is it keeps us up to date on where things 132 00:06:18,120 --> 00:06:21,160 Speaker 1: are going, not only within the asset management industry, but 133 00:06:21,200 --> 00:06:22,960 Speaker 1: also as you think about how this is going to 134 00:06:22,960 --> 00:06:24,000 Speaker 1: impact the macroeconomy. 135 00:06:24,600 --> 00:06:26,920 Speaker 2: So it's interesting. At the time of the launch, your 136 00:06:26,960 --> 00:06:30,040 Speaker 2: co-CIO at Bridgewater, Greg Jensen, said the fund had the 137 00:06:30,080 --> 00:06:34,320 Speaker 2: potential to change the hiring and composition of staff. What 138 00:06:34,560 --> 00:06:37,440 Speaker 2: skills do you think the future employees of Bridgewater are 139 00:06:37,440 --> 00:06:39,200 Speaker 2: going to need? And you know, we were kind of 140 00:06:39,279 --> 00:06:40,200 Speaker 2: joking a little earlier: 141 00:06:40,279 --> 00:06:41,320 Speaker 4: maybe the junior 142 00:06:41,000 --> 00:06:43,600 Speaker 2: analysts don't go away, but boy do they have to 143 00:06:43,640 --> 00:06:44,599 Speaker 2: be doing something
144 00:06:44,279 --> 00:06:48,320 Speaker 1: different. Which has always been the case, truly, because when 145 00:06:48,320 --> 00:06:51,359 Speaker 1: you have an expert system that does a lot of 146 00:06:51,360 --> 00:06:54,080 Speaker 1: what humans do in other funds, as an example, you 147 00:06:54,160 --> 00:06:57,159 Speaker 1: really start to move people to do what machines couldn't do. 148 00:06:57,240 --> 00:07:00,479 Speaker 1: So we have been shifting the type of humans we 149 00:07:00,520 --> 00:07:03,520 Speaker 1: bring into Bridgewater for fifty years, and that means from 150 00:07:03,560 --> 00:07:07,159 Speaker 1: really looking for analytical skills and financial backgrounds to people 151 00:07:07,279 --> 00:07:10,239 Speaker 1: that are conceptual and can ask philosophical questions and query, 152 00:07:10,280 --> 00:07:12,720 Speaker 1: as an example, because they're going to be very levered 153 00:07:12,800 --> 00:07:16,120 Speaker 1: with technology in how they explore their questions. That's been 154 00:07:16,680 --> 00:07:18,559 Speaker 1: constantly moving, and I think this is a big shift, 155 00:07:18,600 --> 00:07:20,840 Speaker 1: but it's a big shift in the same direction of 156 00:07:20,880 --> 00:07:24,240 Speaker 1: what can humans do: the future analysts, what can they 157 00:07:24,280 --> 00:07:25,920 Speaker 1: do that machines can't do? I'll say the one thing 158 00:07:25,960 --> 00:07:28,120 Speaker 1: that is on my mind that I'm worried about. When 159 00:07:28,200 --> 00:07:31,640 Speaker 1: you think about that arc, you have to think about 160 00:07:31,640 --> 00:07:33,160 Speaker 1: what trains people over time. 161 00:07:34,360 --> 00:07:36,360 Speaker 3: So there are basic 162 00:07:36,120 --> 00:07:39,480 Speaker 1: jobs that people have in many places where you sleep 163 00:07:39,520 --> 00:07:42,720 Speaker 1: under your desk and you do the tough work that 164 00:07:42,800 --> 00:07:46,320 Speaker 1: then leads you to be able to do the higher-level, 165 00:07:46,600 --> 00:07:50,800 Speaker 1: more conceptual things. One of the things we're thinking about is, hey, 166 00:07:51,400 --> 00:07:53,960 Speaker 1: we are well positioned already to replace a lot of 167 00:07:53,960 --> 00:07:56,680 Speaker 1: the basic tasks with machines. But what does that mean 168 00:07:56,760 --> 00:07:59,280 Speaker 1: for talent development over a long period of time? You 169 00:07:59,280 --> 00:08:01,200 Speaker 1: have to be conscious not only of today, but what 170 00:08:01,200 --> 00:08:04,120 Speaker 1: does that mean for your talent strategy over the long term? 171 00:08:04,760 --> 00:08:07,960 Speaker 2: You know, how does this fit into the broader trajectory 172 00:08:07,960 --> 00:08:10,400 Speaker 2: of Bridgewater? When I think about where you stand today, 173 00:08:10,480 --> 00:08:13,880 Speaker 2: you've inherited this firm that's now roughly a half century old. 174 00:08:14,400 --> 00:08:17,560 Speaker 4: How do you position the firm for the future? 175 00:08:17,640 --> 00:08:19,800 Speaker 2: This is all in the scope of positioning for the 176 00:08:19,840 --> 00:08:21,040 Speaker 2: next fifty years. 177 00:08:21,240 --> 00:08:23,240 Speaker 4: What is Bridgewater two point zero?
178 00:08:24,280 --> 00:08:26,880 Speaker 1: So I think the best way to hit this is 179 00:08:26,920 --> 00:08:28,440 Speaker 1: to go back to the three things that I've said, 180 00:08:29,360 --> 00:08:31,440 Speaker 1: because it's kind of where our heads are at. 181 00:08:31,880 --> 00:08:32,320 Speaker 1: There are 182 00:08:32,160 --> 00:08:34,040 Speaker 3: things that have been the bedrock of Bridgewater, 183 00:08:34,120 --> 00:08:36,719 Speaker 1: truly, and I think I said this on the 184 00:08:36,800 --> 00:08:38,479 Speaker 1: stage years ago, 185 00:08:39,600 --> 00:08:43,120 Speaker 3: just after you were — yeah, yeah, with Erik Schatzker. 186 00:08:43,559 --> 00:08:45,800 Speaker 1: Bridgewater was about getting an incredible group of people together 187 00:08:46,679 --> 00:08:49,360 Speaker 1: and having this very, very unique culture that puts at 188 00:08:49,400 --> 00:08:50,240 Speaker 1: the center of everything 189 00:08:51,360 --> 00:08:52,920 Speaker 3: the desire to be excellent 190 00:08:52,559 --> 00:08:55,320 Speaker 1: and constantly learn, because that's the only way to accomplish 191 00:08:55,400 --> 00:08:58,160 Speaker 1: this incredible mission of mapping the deepest understanding of how 192 00:08:58,160 --> 00:09:00,679 Speaker 1: the world works in a systematic, fundamental way through a combination 193 00:09:00,679 --> 00:09:01,480 Speaker 1: of humans and technology. 194 00:09:01,559 --> 00:09:02,320 Speaker 3: That stays the same. 195 00:09:02,679 --> 00:09:04,439 Speaker 1: If you look at the three things that I've said: 196 00:09:04,720 --> 00:09:08,160 Speaker 1: I said the macroeconomic paradigm is shifting, and because of that, 197 00:09:08,160 --> 00:09:12,080 Speaker 1: that means that beta, the tailwind, is done. Alpha generation — 198 00:09:12,320 --> 00:09:17,480 Speaker 1: the positive uncorrelated alpha that Pure Alpha, as an example, 199 00:09:17,480 --> 00:09:20,079 Speaker 1: has returned for thirty-three years — is more important than ever. 200 00:09:20,120 --> 00:09:22,280 Speaker 1: A deep understanding of the cause-effect linkages of how the 201 00:09:22,280 --> 00:09:25,160 Speaker 1: world works is more important than ever. We have to hold 202 00:09:25,160 --> 00:09:27,080 Speaker 1: the hands of our clients in days like 203 00:09:27,120 --> 00:09:29,679 Speaker 1: today and months like the last month, where they're asking, 204 00:09:29,760 --> 00:09:33,000 Speaker 1: how are these new policies going to flow through the economy? 205 00:09:33,040 --> 00:09:33,679 Speaker 3: What's going on? 206 00:09:34,240 --> 00:09:38,240 Speaker 1: So: a tremendous focus on improving the quality of understanding 207 00:09:38,240 --> 00:09:39,040 Speaker 1: in our alpha. 208 00:09:39,360 --> 00:09:39,680 Speaker 3: I said 209 00:09:39,679 --> 00:09:43,360 Speaker 1: the second thing is client portfolios. For fifty years — you've 210 00:09:43,360 --> 00:09:46,160 Speaker 1: said this, we're celebrating our fiftieth year this year — it's 211 00:09:46,200 --> 00:09:49,439 Speaker 1: always been about partnering with the CIOs and CEOs of the 212 00:09:49,520 --> 00:09:50,959 Speaker 1: largest pools of capital in the world 213 00:09:51,320 --> 00:09:52,720 Speaker 3: and helping them achieve their goals. 214 00:09:52,720 --> 00:09:57,079 Speaker 1: These are investment goals: resiliency, diversification.
We've been very, very 215 00:09:57,080 --> 00:09:58,719 Speaker 1: focused over the last couple of years to make sure 216 00:09:58,720 --> 00:10:01,760 Speaker 1: that we can increase the impact across their portfolio, 217 00:10:01,760 --> 00:10:04,160 Speaker 1: because their allocations to Pure Alpha are going to be small. 218 00:10:04,360 --> 00:10:06,720 Speaker 3: But in a trillion-dollar portfolio, 219 00:10:06,480 --> 00:10:07,920 Speaker 1: they're going to want to know that you have the 220 00:10:07,960 --> 00:10:10,080 Speaker 1: capability to invest not in the US, as an example, 221 00:10:10,120 --> 00:10:11,400 Speaker 1: if everyone's investing in the US. 222 00:10:11,679 --> 00:10:13,560 Speaker 3: We've been really focused on building the capability to invest 223 00:10:13,600 --> 00:10:14,760 Speaker 3: in China, as an example. 224 00:10:14,800 --> 00:10:18,080 Speaker 1: Our onshore strategy last year returned thirty percent. And 225 00:10:18,120 --> 00:10:20,120 Speaker 1: I say that not because of the number; I say 226 00:10:20,160 --> 00:10:21,920 Speaker 1: it because it shows where our focus has been. And 227 00:10:21,920 --> 00:10:24,800 Speaker 1: the last thing is what we've talked about, which is, in 228 00:10:24,880 --> 00:10:28,720 Speaker 1: a world that technology is going to disrupt us tremendously, 229 00:10:29,280 --> 00:10:32,959 Speaker 1: making sure that you have a partner in Bridgewater 230 00:10:33,559 --> 00:10:36,280 Speaker 3: that is ahead of the curve, that is where 231 00:10:36,040 --> 00:10:38,240 Speaker 1: the scientific breakthroughs are happening, that you can lean on 232 00:10:38,320 --> 00:10:40,360 Speaker 1: to show you where things are headed. 233 00:10:41,760 --> 00:10:43,880 Speaker 3: Those are the big three areas that we've been focused on 234 00:10:43,880 --> 00:10:44,280 Speaker 3: for now. 235 00:10:44,280 --> 00:10:45,840 Speaker 2: We'll get back to where things are headed, but before 236 00:10:45,880 --> 00:10:48,280 Speaker 2: we get there, I just want to give the audience 237 00:10:48,320 --> 00:10:49,840 Speaker 2: a chance to get to know you a little bit, 238 00:10:49,880 --> 00:10:52,199 Speaker 2: because the more I've learned about you, the more your 239 00:10:52,280 --> 00:10:55,720 Speaker 2: story is kind of like nothing I've ever — it is 240 00:10:55,880 --> 00:11:00,240 Speaker 2: like nothing I've ever seen in finance. So you didn't 241 00:11:00,280 --> 00:11:04,480 Speaker 2: graduate from high school initially; you eventually went to Wharton, 242 00:11:04,679 --> 00:11:07,400 Speaker 2: and then you started at Bridgewater as a thirty- 243 00:11:07,120 --> 00:11:08,360 Speaker 4: three-year-old intern. 244 00:11:09,440 --> 00:11:13,240 Speaker 1: Wow. I mean, my Jewish mother — for you saying 245 00:11:13,640 --> 00:11:16,000 Speaker 1: that I didn't graduate high school on this stage, 246 00:11:16,040 --> 00:11:20,760 Speaker 1: her heart is probably broken right now. But it's true. 247 00:11:22,880 --> 00:11:25,680 Speaker 1: And I've never shared this, actually. So I've said 248 00:11:25,720 --> 00:11:27,600 Speaker 1: many times that I look different, I know I'm different — 249 00:11:27,960 --> 00:11:29,480 Speaker 1: the amount of people that commented on the fact that 250 00:11:29,480 --> 00:11:32,320 Speaker 1: I'm not wearing a tie today — I am different. I 251 00:11:32,320 --> 00:11:35,000 Speaker 1: come from the other side of the world. I come from Israel.
252 00:11:35,160 --> 00:11:38,400 Speaker 1: I've talked historically about being a third generation to grandparents 253 00:11:38,400 --> 00:11:41,880 Speaker 1: that were Jewish refugees, family that was murdered in 254 00:11:41,880 --> 00:11:43,559 Speaker 3: the Holocaust, and I talked about how the 255 00:11:43,520 --> 00:11:48,840 Speaker 1: story of my life is connected to seeing them establish 256 00:11:48,840 --> 00:11:50,640 Speaker 1: a country. And I talked about how an incredible group 257 00:11:50,640 --> 00:11:52,880 Speaker 1: of people with values can do incredible things. And I 258 00:11:52,880 --> 00:11:54,640 Speaker 1: always jump ahead in the story and say, and then I ended 259 00:11:54,679 --> 00:11:57,599 Speaker 1: up at Bridgewater. To your point, if you double-clicked on 260 00:11:57,679 --> 00:12:00,839 Speaker 1: that story — and I've been reflecting on this a lot lately — 261 00:12:01,160 --> 00:12:02,680 Speaker 3: I was a bad student in high school. 262 00:12:03,320 --> 00:12:06,120 Speaker 1: I didn't get a diploma at the end of twelve 263 00:12:06,200 --> 00:12:08,720 Speaker 1: years, and I was very, very lucky that in Israel 264 00:12:09,320 --> 00:12:12,360 Speaker 1: there is a mandatory service that everybody 265 00:12:12,040 --> 00:12:13,040 Speaker 3: has to do at the age of eighteen. 266 00:12:13,720 --> 00:12:16,080 Speaker 1: And because people like my grandparents, that came from Libya 267 00:12:16,160 --> 00:12:19,960 Speaker 1: and Hungary and Poland — looking different, different backgrounds, different languages — 268 00:12:20,720 --> 00:12:24,319 Speaker 1: needed to come together to accomplish this really incredible task 269 00:12:24,360 --> 00:12:27,000 Speaker 1: of putting a country together, they knew they had to 270 00:12:27,000 --> 00:12:30,320 Speaker 1: make the most out of the total human capability, and 271 00:12:30,440 --> 00:12:31,000 Speaker 1: do that 272 00:12:30,960 --> 00:12:31,760 Speaker 3: through the service. 273 00:12:31,960 --> 00:12:35,319 Speaker 1: So the service ignores everything that happens up until the 274 00:12:35,320 --> 00:12:38,800 Speaker 1: age of eighteen and, based on your merit, reassesses you. 275 00:12:39,000 --> 00:12:39,679 Speaker 3: I tested at 276 00:12:39,640 --> 00:12:43,680 Speaker 1: eighteen thinking that I was incompetent, and tested extremely highly, 277 00:12:44,720 --> 00:12:45,720 Speaker 1: and that changed my life. 278 00:12:45,760 --> 00:12:47,920 Speaker 3: It led to a very successful service. 279 00:12:48,120 --> 00:12:50,439 Speaker 1: And the reason that's important is because fifteen years later, 280 00:12:50,880 --> 00:12:53,760 Speaker 1: you're right, I am a thirty-three-year-old, already 281 00:12:53,760 --> 00:12:54,400 Speaker 1: with gray hair, 282 00:12:55,720 --> 00:12:57,160 Speaker 3: winter intern at Bridgewater. 283 00:12:57,160 --> 00:13:03,120 Speaker 1: It's me and six other hungover Dartmouth twenty-year-olds, 284 00:13:03,800 --> 00:13:07,000 Speaker 1: and my English is not great, and I'm out of 285 00:13:07,040 --> 00:13:09,360 Speaker 1: context, and it's pretty confusing, and to say that I 286 00:13:09,360 --> 00:13:12,120 Speaker 1: feel lost is not even the beginning of it.
But again, 287 00:13:12,160 --> 00:13:13,960 Speaker 1: I'm at an organization that is trying to make the 288 00:13:14,000 --> 00:13:17,800 Speaker 1: most, in a meritocratic way, out of the human capital 289 00:13:17,840 --> 00:13:20,720 Speaker 1: that it has, because we're trying to accomplish this incredible goal. 290 00:13:21,200 --> 00:13:23,840 Speaker 3: And it leads me to kind of three big lessons. One, 291 00:13:24,559 --> 00:13:26,880 Speaker 3: luck matters. Luck matters. 292 00:13:27,200 --> 00:13:28,880 Speaker 1: I was lucky to be in a society where I 293 00:13:28,920 --> 00:13:30,400 Speaker 1: got a second shot in the service, and I was 294 00:13:30,480 --> 00:13:31,720 Speaker 1: lucky to come across Bridgewater. 295 00:13:31,720 --> 00:13:33,240 Speaker 3: I could have come across a different organization. 296 00:13:33,760 --> 00:13:35,240 Speaker 1: The second thing, and I connect it back to the 297 00:13:35,240 --> 00:13:40,040 Speaker 1: AI point that you talked about: differences in thinking and 298 00:13:40,320 --> 00:13:44,400 Speaker 1: unique contributions are very valuable, and I think that's especially 299 00:13:44,440 --> 00:13:47,240 Speaker 1: true in a world where a lot of the generic 300 00:13:47,320 --> 00:13:48,959 Speaker 1: intelligence is going to be commoditized. 301 00:13:50,160 --> 00:13:51,560 Speaker 3: And the third thing is the way to 302 00:13:51,480 --> 00:13:54,160 Speaker 1: capture unique ways of thinking is to aspire to 303 00:13:54,200 --> 00:13:55,280 Speaker 1: have a real meritocracy. 304 00:13:55,360 --> 00:13:57,880 Speaker 3: That's true in Israel, it's true at Bridgewater. 305 00:13:57,880 --> 00:14:00,640 Speaker 1: I think that is the key to us unlocking and 306 00:14:00,679 --> 00:14:03,120 Speaker 1: facing some of the biggest challenges we have as corporations 307 00:14:03,200 --> 00:14:03,960 Speaker 1: and as a society. 308 00:14:05,600 --> 00:14:07,960 Speaker 2: So another major part of your background, as you've been 309 00:14:07,960 --> 00:14:11,559 Speaker 2: talking about, is your Israeli heritage, and you were talking 310 00:14:11,559 --> 00:14:15,679 Speaker 2: about complicated geopolitics. In the past year alone, you've been 311 00:14:15,760 --> 00:14:18,160 Speaker 2: back to Israel, you've spent a lot of time in 312 00:14:18,200 --> 00:14:22,720 Speaker 2: the Middle East, including Saudi Arabia, Qatar, Bahrain, all in 313 00:14:22,760 --> 00:14:26,840 Speaker 2: your capacity as the CEO of Bridgewater. What can you 314 00:14:26,880 --> 00:14:29,720 Speaker 2: share about the trajectory of the Middle East, and do 315 00:14:29,760 --> 00:14:32,520 Speaker 2: you see a path to peace when it comes to 316 00:14:32,560 --> 00:14:33,600 Speaker 2: the conflict in Israel? 317 00:14:34,400 --> 00:14:39,800 Speaker 1: Wow. Let me start where I feel like I have to. 318 00:14:40,000 --> 00:14:41,359 Speaker 3: October seventh 319 00:14:43,320 --> 00:14:47,000 Speaker 1: is incredibly personal for me, and not just October seventh, 320 00:14:47,040 --> 00:14:49,800 Speaker 1: but also what happened subsequently. There are still fifty-nine hostages 321 00:14:49,800 --> 00:14:53,600 Speaker 1: in Gaza. There's suffering in Gaza, there's suffering in Israel, 322 00:14:54,000 --> 00:14:58,160 Speaker 1: there's suffering in the entire region. These are family members, 323 00:14:58,320 --> 00:15:01,360 Speaker 1: my community, my community at Bridgewater.
Some of them 324 00:15:01,400 --> 00:15:04,760 Speaker 1: are in neighboring countries; these are partners of Bridgewater in neighboring countries. 325 00:15:05,200 --> 00:15:08,200 Speaker 1: It's been a year and a half of trauma, truly trauma, 326 00:15:08,760 --> 00:15:11,760 Speaker 1: and my heart goes out to everyone in this. 327 00:15:11,960 --> 00:15:12,800 Speaker 3: It is important for 328 00:15:12,720 --> 00:15:16,360 Speaker 1: me to say that I'm proud to be Jewish. I'm 329 00:15:16,400 --> 00:15:19,560 Speaker 1: proud to be Israeli. I'm proud of my countrymen 330 00:15:19,560 --> 00:15:22,280 Speaker 1: and women. And I feel very lucky that I've had 331 00:15:22,320 --> 00:15:25,720 Speaker 1: the chance in this job, to your point, to go 332 00:15:25,840 --> 00:15:28,200 Speaker 1: around the world, including in the Middle East and 333 00:15:28,240 --> 00:15:32,280 Speaker 1: places that matter, and represent hopefully the best of Israel, 334 00:15:32,480 --> 00:15:35,160 Speaker 1: and learn about the best in many of the countries 335 00:15:35,240 --> 00:15:39,800 Speaker 1: that you've mentioned, forge real friendships with people and incredible, 336 00:15:39,840 --> 00:15:43,440 Speaker 1: incredible leaders. And your question about where things are headed: 337 00:15:44,040 --> 00:15:47,960 Speaker 1: I've said this before. I have a deep belief that, 338 00:15:48,520 --> 00:15:54,920 Speaker 1: look, tragedy and trauma and suffering lead to resiliency, and 339 00:15:54,960 --> 00:15:58,440 Speaker 1: resiliency leads to evolution. If you look at the Second 340 00:15:58,440 --> 00:16:00,720 Speaker 1: World War and the new world order that came out of it, 341 00:16:00,720 --> 00:16:02,480 Speaker 1: if you look at the Great Depression and the new 342 00:16:02,800 --> 00:16:04,520 Speaker 1: economic paradigm that came out of it, if you look at 343 00:16:04,520 --> 00:16:08,720 Speaker 1: the Holocaust and my grandparents rising from the ashes and 344 00:16:08,760 --> 00:16:12,000 Speaker 1: building a country — if I said that a great group 345 00:16:12,040 --> 00:16:14,880 Speaker 1: of people can come together and share values and forge 346 00:16:14,920 --> 00:16:16,960 Speaker 1: a different future, I go back to all the people 347 00:16:17,000 --> 00:16:18,840 Speaker 1: that are in the Middle East, not just in Israel, 348 00:16:18,880 --> 00:16:22,360 Speaker 1: but in Saudi Arabia and Bahrain, in the UAE. There's 349 00:16:22,400 --> 00:16:26,520 Speaker 1: an incredible, resilient generation that has suffered a lot, and 350 00:16:26,600 --> 00:16:30,120 Speaker 1: I am actually more hopeful than ever that that generation 351 00:16:30,680 --> 00:16:34,080 Speaker 1: can create a completely different future going forward for the 352 00:16:34,120 --> 00:16:36,200 Speaker 1: Middle East and a hundred-year peace. 353 00:16:36,560 --> 00:16:39,280 Speaker 4: What role does Saudi Arabia, does Qatar, 354 00:16:39,400 --> 00:16:43,960 Speaker 2: does the US play in negotiating peace here? 355 00:16:44,160 --> 00:16:47,000 Speaker 4: And is there a two point zero version of the 356 00:16:47,000 --> 00:16:48,000 Speaker 4: Abraham Accords? 357 00:16:48,600 --> 00:16:51,600 Speaker 3: I mean, that's really not for me to answer. 358 00:16:51,720 --> 00:16:53,680 Speaker 1: The thing I would say is yes. I think that 359 00:16:54,440 --> 00:16:58,680 Speaker 1: Saudi Arabia, the US, the UAE, Bahrain —
When I say 360 00:16:58,720 --> 00:17:01,880 Speaker 1: there's a generation, there's a generation of leaders. And when 361 00:17:01,920 --> 00:17:05,800 Speaker 1: I talk to these people, they're 362 00:17:06,119 --> 00:17:09,080 Speaker 3: just like me. They have the same exact aspirations as 363 00:17:09,080 --> 00:17:09,440 Speaker 3: I do. 364 00:17:10,080 --> 00:17:12,399 Speaker 1: And I think a coalition of that group of people 365 00:17:12,640 --> 00:17:15,960 Speaker 1: in Israel and in the region is the way to 366 00:17:16,000 --> 00:17:19,720 Speaker 1: create what you call the second Abraham Accords, and it's coming. 367 00:17:19,880 --> 00:17:23,880 Speaker 1: I'm more hopeful today, Sonali, honestly, than I have been in five 368 00:17:23,960 --> 00:17:27,719 Speaker 1: years, because out of this suffering there's a crystallization 369 00:17:27,760 --> 00:17:31,439 Speaker 1: of the realization that creates the necessity to evolve and 370 00:17:31,480 --> 00:17:32,680 Speaker 1: take the region in a different direction. 371 00:17:33,000 --> 00:17:35,920 Speaker 2: And I want to get your thoughts clearly on this, 372 00:17:36,040 --> 00:17:39,840 Speaker 2: because the human toll has been devastating, both in Israel 373 00:17:39,920 --> 00:17:45,440 Speaker 2: and in Gaza and beyond, really, at this point. How 374 00:17:45,480 --> 00:17:47,359 Speaker 2: do you see the conflict playing out, given that the 375 00:17:47,400 --> 00:17:48,760 Speaker 2: ceasefire talks have stalled? 376 00:17:49,600 --> 00:17:50,160 Speaker 3: Well, if I 377 00:17:50,080 --> 00:17:51,720 Speaker 1: said that it's a fool's errand to guess where 378 00:17:51,720 --> 00:17:57,359 Speaker 1: AI is going, this is truly a fool's errand to guess. 379 00:17:57,440 --> 00:18:03,400 Speaker 1: But I would say this: until all the hostages are released, 380 00:18:04,720 --> 00:18:08,760 Speaker 1: there is no solution. Also, until there is a future 381 00:18:08,840 --> 00:18:10,359 Speaker 1: for the Palestinian 382 00:18:09,760 --> 00:18:15,960 Speaker 3: people, there is no solution. And there are solutions that 383 00:18:16,040 --> 00:18:18,000 Speaker 1: can be put together, but they will be put together 384 00:18:18,160 --> 00:18:23,000 Speaker 1: by a new generation taking leadership and coming together and 385 00:18:23,080 --> 00:18:25,080 Speaker 1: making sacrifices the previous generations weren't 386 00:18:24,960 --> 00:18:25,520 Speaker 3: willing to do. 387 00:18:25,640 --> 00:18:27,800 Speaker 2: You know, a lot of people look around and they say, 388 00:18:28,320 --> 00:18:31,160 Speaker 2: in the wake of World War Two — when you look now, 389 00:18:31,680 --> 00:18:33,680 Speaker 2: a lot of the protections that were put 390 00:18:33,680 --> 00:18:36,440 Speaker 2: in place to make sure that the world didn't fracture again, 391 00:18:37,160 --> 00:18:41,240 Speaker 2: some of those protections are now under threat, particularly with 392 00:18:41,320 --> 00:18:43,560 Speaker 2: the changing of the guard across countries. 393 00:18:44,040 --> 00:18:44,919 Speaker 4: Do you worry about that? 394 00:18:47,240 --> 00:18:49,920 Speaker 1: I mean, personally, yes. I think that's a part of 395 00:18:49,920 --> 00:18:53,960 Speaker 1: the macroeconomic paradigm, that is also a geopolitical paradigm, that 396 00:18:54,080 --> 00:18:57,040 Speaker 1: has shifted the world post World War Two. 397 00:18:57,080 --> 00:19:01,240 Speaker 3: Out of the trauma of World War Two started a
398 00:19:01,400 --> 00:19:04,639 Speaker 1: series of events that led us to not only form 399 00:19:04,800 --> 00:19:09,919 Speaker 1: protections and institutions, but increased collaboration decade after decade after decade, 400 00:19:10,000 --> 00:19:12,439 Speaker 1: leading us to a long period of extreme collaboration over 401 00:19:12,480 --> 00:19:16,400 Speaker 1: the last thirty years. There is no question that that paradigm, 402 00:19:16,440 --> 00:19:21,040 Speaker 1: the geopolitical and macroeconomic paradigm, is being undone. Now, there are 403 00:19:21,080 --> 00:19:24,160 Speaker 1: reasons why that's getting undone. There are problems that were 404 00:19:24,160 --> 00:19:26,640 Speaker 1: created, that I mentioned earlier in this conversation, that need 405 00:19:26,680 --> 00:19:30,439 Speaker 1: to get fixed. But if that's not done well, you 406 00:19:30,480 --> 00:19:33,440 Speaker 1: can, instead of correcting the problems that the system 407 00:19:33,119 --> 00:19:36,120 Speaker 3: had, create a disaster. 408 00:19:36,520 --> 00:19:40,480 Speaker 1: And I worry that between correcting the problems and causing 409 00:19:40,760 --> 00:19:45,200 Speaker 1: complete chaos, we can easily end up being in chaos. 410 00:19:46,560 --> 00:19:48,639 Speaker 2: Before I let you go, this is kind of another 411 00:19:48,640 --> 00:19:50,679 Speaker 2: bummer of a question, so I apologize. But do you 412 00:19:50,680 --> 00:19:53,240 Speaker 2: think the world is more dangerous today than it's been 413 00:19:53,240 --> 00:19:54,000 Speaker 2: in a long time? 414 00:19:56,960 --> 00:20:01,000 Speaker 1: The world is more dangerous today than it's been 415 00:20:01,080 --> 00:20:04,200 Speaker 1: in recent history. But I think this is one 416 00:20:04,200 --> 00:20:05,879 Speaker 1: of the problems that we have as humans. If you 417 00:20:05,920 --> 00:20:08,720 Speaker 1: go back not that long, go back one hundred and 418 00:20:08,720 --> 00:20:12,240 Speaker 1: fifty years: if you're in the US in the eighteen hundreds, 419 00:20:12,760 --> 00:20:15,359 Speaker 1: it is tough. You have militias, 420 00:20:15,200 --> 00:20:17,440 Speaker 3: women have no rights, Black people have no rights. 421 00:20:17,480 --> 00:20:20,200 Speaker 1: Then you have the Spanish flu, then you have the 422 00:20:20,240 --> 00:20:23,359 Speaker 1: Great Depression, then you have World War One, World War Two, 423 00:20:24,000 --> 00:20:26,480 Speaker 1: you have the assassinations of Kennedy and Martin Luther King. 424 00:20:28,119 --> 00:20:30,080 Speaker 1: We have a bias to remember the last thirty years, 425 00:20:30,080 --> 00:20:34,639 Speaker 1: which were particularly calm. So the world is more dangerous today, 426 00:20:34,760 --> 00:20:38,240 Speaker 1: but this is not a bigger problem than we have 427 00:20:38,320 --> 00:20:39,399 Speaker 1: seen in the history 428 00:20:39,160 --> 00:20:40,600 Speaker 3: of our parents and grandparents. 429 00:20:40,640 --> 00:20:43,320 Speaker 1: We can deal with this as long as we embrace reality, 430 00:20:43,440 --> 00:20:46,000 Speaker 1: understand it well, and come together 431 00:20:46,720 --> 00:20:47,560 Speaker 3: to find solutions. 432 00:20:47,880 --> 00:20:50,520 Speaker 2: Thank you for such a clear conversation. That is Nir 433 00:20:50,600 --> 00:20:53,159 Speaker 2: Bar Dea, of course, the CEO of Bridgewater. Thank you. 434 00:20:53,400 --> 00:20:54,560 Speaker 3: Thank you, always a treat to be here.