1 00:00:02,759 --> 00:00:07,240 Speaker 1: Bloomberg Audio Studios, Podcasts, radio news. 2 00:00:08,080 --> 00:00:10,600 Speaker 2: Steve Pagliuca sat down with us, and Lisa leans in 3 00:00:10,640 --> 00:00:11,600 Speaker 2: and, guys, apart from sports. 4 00:00:11,600 --> 00:00:14,280 Speaker 3: Okay, hold on a second, but you've been talking about 5 00:00:14,320 --> 00:00:18,680 Speaker 3: Atalanta for like about three years now, and you're so excited, 6 00:00:18,800 --> 00:00:20,119 Speaker 3: and I know that the whole thing is going to 7 00:00:20,160 --> 00:00:21,040 Speaker 3: be about Atalanta, and. 8 00:00:22,520 --> 00:00:23,600 Speaker 1: I think that that's wonderful. 9 00:00:23,680 --> 00:00:26,520 Speaker 3: But a lot of turmoil in a lot of places. 10 00:00:26,600 --> 00:00:27,400 Speaker 1: Really, you're going to do this? 11 00:00:27,320 --> 00:00:29,480 Speaker 2: The Bain Capital private equity senior adviser joins 12 00:00:29,520 --> 00:00:31,400 Speaker 2: us now. Steve, hello, good to see you. 13 00:00:31,640 --> 00:00:32,239 Speaker 1: Great to be here. 14 00:00:32,280 --> 00:00:33,360 Speaker 2: Let's start with sports. 15 00:00:34,720 --> 00:00:35,680 Speaker 1: I thought I was on ESPN. 16 00:00:36,600 --> 00:00:38,519 Speaker 2: We had to get it in tonight, because that's Atalanta, 17 00:00:38,560 --> 00:00:41,080 Speaker 2: five-nil. What a story for that team this year. 18 00:00:41,760 --> 00:00:43,760 Speaker 1: You know, I'm still pinching myself. It's just incredible, and I'm 19 00:00:43,800 --> 00:00:45,320 Speaker 1: very happy for the team and the coach and the 20 00:00:45,320 --> 00:00:48,160 Speaker 1: Percassi family. It's been great for Bergamo and Milan, 21 00:00:48,200 --> 00:00:49,800 Speaker 1: and it's just really exciting. 22 00:00:49,960 --> 00:00:51,960 Speaker 2: I don't want to get too excited, but can you 23 00:00:51,960 --> 00:00:52,800 Speaker 2: win the league this year?
24 00:00:54,080 --> 00:00:56,200 Speaker 1: You know the old adage, take it one game at 25 00:00:56,240 --> 00:00:57,760 Speaker 1: a time. So we've just got to, we've 26 00:00:57,760 --> 00:00:59,120 Speaker 1: got to get healthy. We've got a couple of guys 27 00:00:59,120 --> 00:01:01,200 Speaker 1: that are hurt, but they're playing fantastic. 28 00:01:01,360 --> 00:01:05,440 Speaker 2: It's amazing to see. Barcelona up next, and maybe straight qualification, 29 00:01:05,480 --> 00:01:07,479 Speaker 2: so not to the next round, but the one after that, right? 30 00:01:07,959 --> 00:01:09,959 Speaker 1: That would be a big, big one for Europe, and we'd 31 00:01:10,040 --> 00:01:11,440 Speaker 1: go to Barcelona. That's the football. 32 00:01:11,480 --> 00:01:13,280 Speaker 2: Okay, now to the basketball. You should do both. 33 00:01:13,400 --> 00:01:14,200 Speaker 1: Actually, there's a lot of. 34 00:01:15,520 --> 00:01:18,600 Speaker 2: So, reports that you're after majority ownership of the Boston Celtics. 35 00:01:18,720 --> 00:01:20,320 Speaker 2: How much can you tell us this morning? 36 00:01:20,520 --> 00:01:23,120 Speaker 1: I can't comment, as you know, on any deals, including. 37 00:01:22,800 --> 00:01:25,200 Speaker 2: That deal, okay, that one specifically. Can we talk more 38 00:01:25,240 --> 00:01:28,200 Speaker 2: broadly about NBA franchises and what's happening there? Because 39 00:01:28,280 --> 00:01:31,039 Speaker 2: I have noticed that audience levels, TV audience levels, have 40 00:01:31,160 --> 00:01:32,760 Speaker 2: fallen off in some parts. What do you think is 41 00:01:32,800 --> 00:01:33,240 Speaker 2: behind that?
42 00:01:33,840 --> 00:01:35,960 Speaker 1: You know, I think it's fragmentation of viewing, 43 00:01:36,000 --> 00:01:38,040 Speaker 1: and I think the viewership numbers are really good when 44 00:01:38,080 --> 00:01:40,759 Speaker 1: they can count it correctly and really find out who's 45 00:01:40,800 --> 00:01:42,880 Speaker 1: watching where. So I don't think there's a concern there, 46 00:01:42,920 --> 00:01:45,080 Speaker 1: and the league is really strong and has more stars 47 00:01:45,120 --> 00:01:47,880 Speaker 1: than ever, and, you know, it's been fantastic. So I 48 00:01:47,920 --> 00:01:49,760 Speaker 1: think that's kind of a bump in the road. 49 00:01:49,920 --> 00:01:52,600 Speaker 3: So we've covered football, we've covered basketball. Any other sports 50 00:01:52,600 --> 00:01:53,720 Speaker 3: that you're going to buy a team in? 51 00:01:54,080 --> 00:01:58,520 Speaker 1: No, I'm not into cricket, so I'm good. 52 00:02:00,080 --> 00:02:01,440 Speaker 3: But there's actually been a lot of talk about 53 00:02:01,440 --> 00:02:02,840 Speaker 3: whether you're going to buy another team. Are you going 54 00:02:02,840 --> 00:02:03,680 Speaker 3: to buy another team? 55 00:02:04,360 --> 00:02:04,800 Speaker 1: Who knows? 56 00:02:05,040 --> 00:02:07,600 Speaker 3: Okay, all right, well, let's move on, because right now 57 00:02:07,600 --> 00:02:10,320 Speaker 3: the talk at Davos in the hall has been dominated by 58 00:02:10,440 --> 00:02:14,280 Speaker 3: incredible optimism: America is going to boom, and Donald Trump, 59 00:02:14,320 --> 00:02:17,160 Speaker 3: what's he going to do? Of course, there's also the overlay 60 00:02:17,200 --> 00:02:21,360 Speaker 3: of artificial intelligence. What's been your biggest takeaway about what 61 00:02:21,440 --> 00:02:23,519 Speaker 3: people are concerned about with the Trump administration?
62 00:02:24,120 --> 00:02:27,000 Speaker 1: Well, I think, first of all, this Davos is 63 00:02:27,040 --> 00:02:30,600 Speaker 1: all Trump, all AI. That's all people are talking about. 64 00:02:31,160 --> 00:02:34,679 Speaker 1: And the Trump issue is, you know, what's going to 65 00:02:34,720 --> 00:02:37,560 Speaker 1: happen with tariffs? Everyone's worried about tariffs. What's 66 00:02:37,560 --> 00:02:40,480 Speaker 1: going to happen with Ukraine, 67 00:02:41,400 --> 00:02:44,480 Speaker 1: what's going to happen with the American economy and protectionism? 68 00:02:44,680 --> 00:02:50,040 Speaker 1: That's their worry. And on AI, you know, AI 69 00:02:50,360 --> 00:02:52,799 Speaker 1: is for real. It feels a little bit like the 70 00:02:52,840 --> 00:02:56,640 Speaker 1: nineteen ninety-nine Internet boom. So there's a lot 71 00:02:56,639 --> 00:02:58,120 Speaker 1: of hype, but there's a lot of reality in AI, 72 00:02:58,160 --> 00:03:00,720 Speaker 1: and AI is going to change everything. I view AI 73 00:03:01,280 --> 00:03:06,760 Speaker 1: as almost like building the railroads and the interstate highway system 74 00:03:06,840 --> 00:03:08,480 Speaker 1: in the US. We're going to need to have that 75 00:03:08,560 --> 00:03:10,600 Speaker 1: over the next twenty years, or we're going to fall behind. 76 00:03:10,919 --> 00:03:13,160 Speaker 1: So I think it's a legitimate discussion on AI. And 77 00:03:13,200 --> 00:03:15,160 Speaker 1: they came out with the Stargate program, which I think 78 00:03:15,240 --> 00:03:19,200 Speaker 1: is fantastic, because other countries are getting ahead. China's investing billions, 79 00:03:19,840 --> 00:03:22,960 Speaker 1: the UAE is investing billions, Saudi is investing billions. I think the 80 00:03:23,080 --> 00:03:24,519 Speaker 1: US has to get out ahead of that trend, and 81 00:03:24,520 --> 00:03:26,200 Speaker 1: it's going to affect everything over the next twenty years.
I'm 82 00:03:26,200 --> 00:03:27,080 Speaker 1: really excited about it. 83 00:03:27,040 --> 00:03:28,800 Speaker 2: As an investor, what's your approach to all of this? 84 00:03:28,880 --> 00:03:30,720 Speaker 2: How do you think about the opportunities? You mentioned the 85 00:03:30,800 --> 00:03:32,760 Speaker 2: late nineties; it took a while to find out who 86 00:03:32,760 --> 00:03:36,040 Speaker 2: the ultimate winners would be. Are you expecting something similar 87 00:03:36,080 --> 00:03:36,680 Speaker 2: this time around? 88 00:03:36,800 --> 00:03:38,800 Speaker 1: It's going to be a little different, because if I look 89 00:03:38,840 --> 00:03:42,280 Speaker 1: at that era, you had winners like Google that were kind 90 00:03:42,280 --> 00:03:45,640 Speaker 1: of natural monopolies. There can be many variants of large 91 00:03:45,680 --> 00:03:49,120 Speaker 1: language models, so there can be a global large language model; 92 00:03:49,640 --> 00:03:52,280 Speaker 1: there can probably be four or five of those. But 93 00:03:52,400 --> 00:03:54,560 Speaker 1: now the real use is going to be vertical ones, 94 00:03:54,680 --> 00:03:58,760 Speaker 1: so tailoring models with specific data for, say, 95 00:03:58,760 --> 00:04:04,040 Speaker 1: healthcare, specific models for transportation with transportation data in them. 96 00:04:04,240 --> 00:04:06,520 Speaker 1: So I think vertical AI models will be the ones 97 00:04:06,520 --> 00:04:08,720 Speaker 1: that start to impact the economy, and then the other 98 00:04:08,720 --> 00:04:10,920 Speaker 1: ones are going to grow, and we need that infrastructure, 99 00:04:11,040 --> 00:04:12,920 Speaker 1: and then what's going to happen is, from that, you 100 00:04:12,960 --> 00:04:17,000 Speaker 1: need power. Fusion is being talked about and is around 101 00:04:17,000 --> 00:04:18,680 Speaker 1: the corner.
All the fusion companies I look at are 102 00:04:18,680 --> 00:04:21,600 Speaker 1: getting close to actually generating energy. The models are showing 103 00:04:21,640 --> 00:04:23,600 Speaker 1: that soon they're going to generate energy and build these large 104 00:04:23,600 --> 00:04:26,599 Speaker 1: tokamaks that are going to have to power all the data centers 105 00:04:26,600 --> 00:04:28,760 Speaker 1: with all the GPUs, because if you 106 00:04:28,760 --> 00:04:30,680 Speaker 1: look at the power usage of AI, it could be 107 00:04:30,720 --> 00:04:33,120 Speaker 1: the entire power supply right now in the United States. So 108 00:04:33,160 --> 00:04:35,560 Speaker 1: I think this Stargate program, or a program like it, is 109 00:04:35,600 --> 00:04:38,000 Speaker 1: needed, and we've got to match those other countries, and 110 00:04:38,040 --> 00:04:41,479 Speaker 1: that is going to create a virtuous loop of spending, 111 00:04:41,480 --> 00:04:43,720 Speaker 1: building the infrastructure, and be great for America and 112 00:04:43,800 --> 00:04:44,719 Speaker 1: revolutionize healthcare. 113 00:04:44,800 --> 00:04:47,359 Speaker 3: Healthcare as well. I know that you've been very active 114 00:04:47,480 --> 00:04:49,960 Speaker 3: in that space, particularly in the speed that 115 00:04:50,040 --> 00:04:52,360 Speaker 3: it will take to put new prescription drugs on the 116 00:04:52,400 --> 00:04:56,600 Speaker 3: market using different machine learning techniques. I'm just wondering, is 117 00:04:56,640 --> 00:04:59,159 Speaker 3: that your main focus right now of investment when it 118 00:04:59,200 --> 00:05:01,600 Speaker 3: comes to AI? Are there other areas that you think are 119 00:05:01,640 --> 00:05:04,159 Speaker 3: promising in application in the near term, not just 120 00:05:04,160 --> 00:05:04,919 Speaker 3: these 121 00:05:04,760 --> 00:05:07,520 Speaker 1: long-term goals? Well, I'm focused on healthcare.
122 00:05:07,560 --> 00:05:09,880 Speaker 1: I think that's the number one. But it'll be helpful 123 00:05:09,920 --> 00:05:13,200 Speaker 1: in energy, in finding energy and processing energy. It'll be 124 00:05:13,200 --> 00:05:15,440 Speaker 1: helpful in transportation. It's going to be 125 00:05:15,480 --> 00:05:17,680 Speaker 1: embedded in every business. And take, like, Liquid 126 00:05:17,760 --> 00:05:21,680 Speaker 1: AI, which we're invested in. 127 00:05:21,720 --> 00:05:24,039 Speaker 1: As we talked about last year, they studied a worm's 128 00:05:24,040 --> 00:05:26,080 Speaker 1: brain at MIT. Like, who would do that? There's three 129 00:05:26,200 --> 00:05:29,400 Speaker 1: hundred neurons in a worm's brain. 130 00:05:29,480 --> 00:05:31,080 Speaker 1: How many neurons do you think are in your brain? 131 00:05:31,920 --> 00:05:35,039 Speaker 1: Three hundred and one? I'm just kidding. The average person 132 00:05:35,080 --> 00:05:37,560 Speaker 1: has eighty-six billion. You guys probably have eighty-seven each. 133 00:05:37,600 --> 00:05:41,280 Speaker 1: But they've come out 134 00:05:41,320 --> 00:05:45,440 Speaker 1: with an AI system that requires eighty percent less power 135 00:05:45,440 --> 00:05:47,440 Speaker 1: to load it, and that means you can put it 136 00:05:47,520 --> 00:05:49,920 Speaker 1: on an edge device, so it can be on your phone. So 137 00:05:49,960 --> 00:05:51,800 Speaker 1: you're going to have, you know, your assistant on your phone. 138 00:05:51,920 --> 00:05:53,360 Speaker 1: This is going to start to be in all the phones. 139 00:05:53,520 --> 00:05:55,920 Speaker 1: It's going to kind of revolutionize the way our lives are.
140 00:05:56,160 --> 00:05:58,839 Speaker 3: One aspect of this Davos that is so interesting is that 141 00:05:58,839 --> 00:06:01,520 Speaker 3: there are some real tensions to be resolved, especially as everyone's 142 00:06:01,520 --> 00:06:04,400 Speaker 3: talking about artificial intelligence. On the flip side, Donald Trump is 143 00:06:04,440 --> 00:06:08,200 Speaker 3: talking about national security. He's talking about the chips in 144 00:06:08,240 --> 00:06:11,200 Speaker 3: the US that could potentially transmit data back to China 145 00:06:11,320 --> 00:06:13,839 Speaker 3: or one of our other adversaries or competitors. And I'm 146 00:06:13,839 --> 00:06:17,200 Speaker 3: just wondering, as an investor, how you understand what's national 147 00:06:17,279 --> 00:06:18,359 Speaker 3: security and what isn't. 148 00:06:18,800 --> 00:06:20,840 Speaker 1: Well, it's difficult. You've got to follow the government, and 149 00:06:20,920 --> 00:06:22,680 Speaker 1: it changes all the time. I think, in general, we 150 00:06:22,680 --> 00:06:25,600 Speaker 1: do need national security, but it has to be focused, 151 00:06:25,600 --> 00:06:28,400 Speaker 1: it has to have a purpose, and hopefully they'll have 152 00:06:28,520 --> 00:06:30,520 Speaker 1: rational people that decide, you know, what that is. 153 00:06:31,240 --> 00:06:33,400 Speaker 2: Do you worry that some voices will get a bigger 154 00:06:33,400 --> 00:06:35,960 Speaker 2: say in the future in the private sector? 155 00:06:37,640 --> 00:06:40,520 Speaker 1: No, I don't worry, because we have a system that 156 00:06:40,600 --> 00:06:42,200 Speaker 1: I think will work in the long term, and there's 157 00:06:42,200 --> 00:06:43,800 Speaker 1: going to be a lot of, there always has been, 158 00:06:44,040 --> 00:06:46,880 Speaker 1: fighting among different companies for different things, and so we'll 159 00:06:46,880 --> 00:06:48,800 Speaker 1: get through it and hopefully we'll do the right thing.
160 00:06:48,839 --> 00:06:50,960 Speaker 1: We've got Congress, we've got the court system, 161 00:06:51,080 --> 00:06:53,279 Speaker 1: so the United States is very stable. So I 162 00:06:53,279 --> 00:06:56,400 Speaker 1: do think we need cybersecurity, and we need security in general. 163 00:06:56,800 --> 00:07:00,599 Speaker 1: Hopefully we can have a détente with China so we 164 00:07:00,920 --> 00:07:03,839 Speaker 1: don't have that going on. And I think Trump, you know, 165 00:07:03,920 --> 00:07:07,560 Speaker 1: wants to cut deals, you know, to move the economy forward, 166 00:07:07,560 --> 00:07:08,560 Speaker 1: and I think that'll be a good thing. 167 00:07:08,720 --> 00:07:10,320 Speaker 2: I'm sure a lot of Europeans have asked you 168 00:07:10,400 --> 00:07:12,480 Speaker 2: your opinion on what the next four years are going 169 00:07:12,520 --> 00:07:14,600 Speaker 2: to look like. What do you tell the Europeans when 170 00:07:14,600 --> 00:07:15,400 Speaker 2: they ask that question? 171 00:07:16,200 --> 00:07:18,520 Speaker 1: You know, I think, the next four years, the Europeans, 172 00:07:19,480 --> 00:07:22,200 Speaker 1: I've been in many meetings here with health ministers and 173 00:07:22,240 --> 00:07:26,200 Speaker 1: all sorts of ministers, and what they say is the 174 00:07:26,640 --> 00:07:32,000 Speaker 1: EU really isn't a full union. They use the same money, 175 00:07:32,000 --> 00:07:35,280 Speaker 1: but there's not an integrated capital market. It's not integrated 176 00:07:35,280 --> 00:07:37,120 Speaker 1: in anything. In terms of starting a business, you need 177 00:07:37,160 --> 00:07:38,880 Speaker 1: to fill out, you know, one hundred pages of forms 178 00:07:38,920 --> 00:07:42,080 Speaker 1: in Germany and different ones in Portugal.
So many of 179 00:07:42,120 --> 00:07:43,600 Speaker 1: them talk to me and say they need to create 180 00:07:43,840 --> 00:07:46,720 Speaker 1: a twenty-eighth state with a set of rules, a 181 00:07:46,800 --> 00:07:49,480 Speaker 1: virtual state with one set of rules to start a 182 00:07:49,520 --> 00:07:53,000 Speaker 1: business and one set of bankruptcy rules. Bankruptcy is 183 00:07:53,000 --> 00:07:55,600 Speaker 1: different in every country. You can't have a great capital 184 00:07:55,600 --> 00:07:58,160 Speaker 1: market if you're dealing with twenty-seven different bankruptcy regimes 185 00:07:58,240 --> 00:08:01,120 Speaker 1: or twenty different ways of doing things. So I think 186 00:08:01,160 --> 00:08:04,440 Speaker 1: it's a great idea to start a twenty-eighth virtual 187 00:08:04,680 --> 00:08:07,600 Speaker 1: European state, and businesses can opt to be in that 188 00:08:07,720 --> 00:08:09,720 Speaker 1: or be in one of the twenty-seven. But it's hard 189 00:08:09,800 --> 00:08:11,560 Speaker 1: to be in both of those at the same time. 190 00:08:11,800 --> 00:08:13,720 Speaker 1: So the big advantage the US has is it is 191 00:08:13,760 --> 00:08:17,160 Speaker 1: one market, it has one set of regulations, so 192 00:08:17,200 --> 00:08:19,000 Speaker 1: we have a leg up, and hopefully we will make 193 00:08:19,000 --> 00:08:20,880 Speaker 1: that even better in the next four years. 194 00:08:20,960 --> 00:08:23,240 Speaker 3: I want to return to where we began, which is sports, 195 00:08:23,560 --> 00:08:24,680 Speaker 3: and my sport. 196 00:08:24,400 --> 00:08:25,400 Speaker 1: Which is auctions. 197 00:08:25,720 --> 00:08:28,200 Speaker 3: I'm looking at bond auctions, which I think is like 198 00:08:28,240 --> 00:08:30,400 Speaker 3: a sport in its own right.
Well, I know, hold on 199 00:08:30,440 --> 00:08:32,680 Speaker 3: a second. No, I'm serious, because I wonder how much 200 00:08:32,880 --> 00:08:34,959 Speaker 3: you're watching what so many people here are worried about, 201 00:08:35,000 --> 00:08:37,880 Speaker 3: which is that all of this optimism, all of this, 202 00:08:38,400 --> 00:08:41,400 Speaker 3: you know, happy talk about getting together and really focusing 203 00:08:41,400 --> 00:08:43,720 Speaker 3: on growth could get really stymied if you start 204 00:08:43,720 --> 00:08:45,920 Speaker 3: to see bond yields climb and you start to see 205 00:08:46,040 --> 00:08:49,600 Speaker 3: the fiscal pressures really come into play. How concerned are 206 00:08:49,640 --> 00:08:50,120 Speaker 3: you about that? 207 00:08:50,160 --> 00:08:52,640 Speaker 1: I'm very concerned. You know, all the data would show 208 00:08:53,160 --> 00:08:55,480 Speaker 1: when you have an increase in the money supply as 209 00:08:55,480 --> 00:08:58,040 Speaker 1: big as it's been in the last five years and 210 00:08:58,080 --> 00:09:01,240 Speaker 1: a large deficit, that's going to cause higher interest rates 211 00:09:01,280 --> 00:09:03,960 Speaker 1: and inflation. Those kind of go hand in hand. So 212 00:09:04,040 --> 00:09:07,120 Speaker 1: my biggest worry is inflation getting out of 213 00:09:07,160 --> 00:09:09,160 Speaker 1: control and keeping those interest rates up there. House 214 00:09:09,160 --> 00:09:12,199 Speaker 1: mortgages are costing seven percent. We definitely have to increase 215 00:09:12,280 --> 00:09:14,320 Speaker 1: the supply, and they're looking at programs to do that. 216 00:09:14,640 --> 00:09:17,040 Speaker 1: But with seven percent mortgages, that's not great for growth.
217 00:09:17,120 --> 00:09:19,079 Speaker 3: You mentioned nineteen ninety-nine, and we all know what 218 00:09:19,160 --> 00:09:21,280 Speaker 3: happened after nineteen ninety-nine, and there is this feeling 219 00:09:21,280 --> 00:09:22,719 Speaker 3: that there needs to be a washout of some of 220 00:09:22,760 --> 00:09:26,960 Speaker 3: the startup companies that, say, deal with technology that continues 221 00:09:27,000 --> 00:09:30,600 Speaker 3: to evolve. Is that the big concern, that this kind 222 00:09:30,679 --> 00:09:34,080 Speaker 3: of increase in rates could really catalyze that kind of washout? 223 00:09:34,120 --> 00:09:35,960 Speaker 3: Is that what you're keeping your eye on? 224 00:09:36,440 --> 00:09:39,800 Speaker 1: An increase in rates definitely catalyzes washouts, because when money is free, 225 00:09:39,840 --> 00:09:41,800 Speaker 1: there are no washouts. Right? We had free money for 226 00:09:41,800 --> 00:09:43,920 Speaker 1: a long period of time. Now, the good news is, for 227 00:09:44,000 --> 00:09:47,280 Speaker 1: most of my life, a too-long life, most of my life, 228 00:09:48,400 --> 00:09:50,720 Speaker 1: interest rates, T-bills, have been five percent, 229 00:09:50,800 --> 00:09:55,200 Speaker 1: five and a half percent. So it's not anomalous. But yeah, 230 00:09:55,640 --> 00:09:58,280 Speaker 1: that is the fear. We've already seen a lot of 231 00:09:58,280 --> 00:10:00,679 Speaker 1: the companies' valuations go down a good ways, so 232 00:10:00,679 --> 00:10:03,200 Speaker 1: I think we're kind of through that. I think there 233 00:10:03,520 --> 00:10:06,600 Speaker 1: is a lot of hype on AI and probably some 234 00:10:06,600 --> 00:10:08,600 Speaker 1: fundings at very high levels. But I don't, 235 00:10:08,600 --> 00:10:10,640 Speaker 1: I don't see it like it was in ninety-nine, 236 00:10:10,840 --> 00:10:13,360 Speaker 1: where everything across the board was crazy.
I used to 237 00:10:13,360 --> 00:10:15,360 Speaker 1: walk into a room in San Francisco. They'd give me 238 00:10:15,400 --> 00:10:17,480 Speaker 1: a term sheet, and they'd say, we have an internet 239 00:10:17,679 --> 00:10:19,839 Speaker 1: company that's going to reduce the 240 00:10:19,880 --> 00:10:23,240 Speaker 1: supply chain cost of medical equipment; we're valuing the 241 00:10:23,240 --> 00:10:25,360 Speaker 1: company at one hundred million, and it has no revenues. This 242 00:10:25,400 --> 00:10:27,120 Speaker 1: is the idea, and you have two hours to decide. 243 00:10:27,120 --> 00:10:28,960 Speaker 1: Do you want to invest ten million at a one-hundred-million valuation? 244 00:10:29,679 --> 00:10:31,720 Speaker 1: I'll pass. I was about to fall out of my chair. 245 00:10:32,000 --> 00:10:34,720 Speaker 1: This is a real story. I said, thank you very much. 246 00:10:34,840 --> 00:10:36,800 Speaker 1: And nine hundred ninety-nine of those went bankrupt, and 247 00:10:37,080 --> 00:10:38,959 Speaker 1: then we had Google and a few other ones that 248 00:10:39,040 --> 00:10:41,559 Speaker 1: did well. That's not the situation we're in 249 00:10:41,640 --> 00:10:42,000 Speaker 1: right now. 250 00:10:42,120 --> 00:10:43,839 Speaker 2: Steve, it's just great to 251 00:10:43,840 --> 00:10:46,439 Speaker 2: see you. Good luck to all your sports teams, as 252 00:10:46,480 --> 00:10:49,559 Speaker 2: we say, forza, without a doubt. Steve Pagliuca of Bain 253 00:10:49,600 --> 00:10:50,800 Speaker 2: Capital and, of course, of Milan.