Speaker 1: Moving the GDP of Italy, and maybe the Eurozone, is Steve Pagliuca, the founder and CEO of Pax Greve, senior advisor at Bain Capital, co-owner of the Boston Celtics and of an Italian club called Atalanta. Thank you for the text just last week. Thank you very much for defeating my beloved AC Milan. How much fun are you having? It just seems like you're having a blast right now.
Speaker 2: It's been a good year. The year has opened up in a good way.
Speaker 3: The Celtics are winning, and Atalanta's winning, and the markets are doing well.
Speaker 1: So let's get to the NBA first. Cuban getting out of the Mavericks — some people are talking about peak valuations of NBA franchises.
Speaker 4: Can you give us your view on that development?
Speaker 3: You know, I don't think they've peaked at all yet. Basketball is a global sport. I invested in soccer — it's a global sport with streaming, you know. I watched a game actually last night — I watched the game on my phone last night. So we haven't really penetrated all the markets.
Speaker 3: Media rights are still going up. And sports is the only programming — I think the top ten of the ten top-rated shows are sports programming.
Speaker 1: Let's talk about that, and if that is absolutely crushing it, let's build on the NBA. It feels like, in the minds of some — so let me give you that perspective — that we're hitting this ceiling: the broadcasters never truly own the asset, they rent it. They're finding that it's getting really expensive and they're trying to work out where on earth to put it. Do you land it on Peacock, on a streaming service? Do you get it to as many homes as you possibly can through cable?
Speaker 4: Cable's dying. What do we do?
Speaker 1: What do you see further down the road that suggests to you that media rights — TV rights — for sports like basketball can keep climbing?
Speaker 3: Well, we're in a period now — as you guys who have been following media and media investing over the last thirty years know, you've had...
Speaker 2: Bundling, unbundling, rebundling, unbundling.
Speaker 3: We were in a big unbundling phase, you know — streaming coming in alongside cable, alongside linear TV, and...
Speaker 2: What that means is the new Internet players, you...
Speaker 3: ...know — Google, Facebook — they all want programming for people to watch. So content is more and more valuable. NBA content is hugely valuable, and so they're fighting over that. I think you'll see some consolidation. You'll see these streamers consolidating — back to the bundling — because it's inefficient to have multiple software systems, multiple marketing operations. So you'll see a rebundling.
Speaker 2: But the programming will always...
Speaker 3: ...be valuable, because it's the only thing that really aggregates eyeballs. And so I don't think we're through with those rates going up for premier sports like the NBA and the NFL.
Speaker 5: Is buying sports teams better than buying regular companies?
Speaker 3: If you win, it's a better investment. If you lose...
Speaker 2: ...it's an okay investment — fire the owner.
Speaker 4: I'm going to ask this because...
Speaker 5: ...you know, the entire Middle East delegation here is trying to buy a whole host of different football clubs in Europe.
Speaker 5: You see this as sort of the hot investment. At what point is it too hot to handle? This is sort of what happened in certain spheres of private equity during some of the boom times that turned into bust times.
Speaker 3: Well, it's turned out that — you know, when we bought the Celtics, we didn't have a presentation of a seventy percent IRR. It was about banner seventeen. We did it as a labor of love. So I have to say I certainly didn't predict the values — what has happened in sports. But it turns out that sports has been a very good business because of this dynamic of people battling for eyeballs. And then secondly, it is a fun investment. You know, people like to be part of tribes, and sports is a great feeling.
Speaker 2: You know, even for some people at AC Milan, it's...
Speaker 5: ...also a great feeling to get hammered?
Speaker 4: It's a great...
Speaker 2: ...feeling to be part of this club.
Speaker 3: And when we bought the team in two thousand and three, they barely had contact with any of their customers.
Speaker 2: I don't think they even had the emails.
Speaker 3: And there was no Facebook, there was no Twitter — now X. So what's happened is technology has allowed the fans to get right up close to the players, to really be part of the club, to be more engaged. And now you have gambling coming in, betting coming in, which makes...
Speaker 2: ...them further engaged.
Speaker 3: So it's almost an out-of-body experience when you walk into the Boston Garden and you feel that vibe, and you feel it on television, and technology makes it easier.
Speaker 5: I love how Jon opened it up — it seems like you're having so much fun. And you're having fun with the tribe of sports, but you're also having fun with the tribe of artificial intelligence, which is basically permeating every corner of this get-together. And I'm wondering — you've invested in companies for many, many years — what proportion of some of these AI companies and initiatives do you think are actually going to come to fruition and be valuable?
Speaker 3: That's a great question, Lisa. It reminds me of nineteen ninety-nine.
Speaker 3: I was going out to California and people would come in with a term sheet and say, okay, you have two hours to invest in this company that's going to sell medical products on the internet.
Speaker 2: Is there a plan?
Speaker 3: No, it's the internet. And the valuation is one hundred million dollars. We have no sales.
Speaker 2: So I turned many of those down.
Speaker 3: There are now a thousand artificial intelligence companies if you walk around Davos here. I think they should change it to the Artificial Intelligence Forum this year — artificial intelligence seems to be ubiquitous in everything. That being said, I think we are in the experimentation phase.
Speaker 2: The CEOs have not implemented this in a big way yet.
Speaker 3: The technology still has a ways to go, but I think it'll be just as revolutionary as the Internet was.
Speaker 2: Right now...
Speaker 3: ...I don't think people realize — a model like ChatGPT, because it's basically a brute-force transformer model, costs four hundred million dollars just in GPUs to program it, to make the large language model.
Speaker 3: So Liquid AI, a company we've invested in out of MIT — the thesis is they can configure the software more like the brain. So they have, let's say, nine hundred nodes versus one hundred thousand nodes in ChatGPT. So you can program a large language model for probably a fifteenth or a twentieth...
Speaker 2: ...of the cost. So you're going to need that cost to...
Speaker 3: ...come down, because if you think about all these models — if everybody built one of those models, you'd take up half the power in the United States.
Speaker 2: So I think we're...
Speaker 3: ...still in the experimentation stage, and CEOs are thoughtful. CEOs are looking at this and thinking, where can I put it...
Speaker 2: ...in my business? How can I make it secure?
Speaker 3: If you build your own large language model with today's technology, it's three, four hundred million dollars, so that's not going to happen. So new technologies are going to have to come out to make it cheaper, more efficient, and more secure.
Speaker 4: When did you begin to become interested in all of this, Steve?
Speaker 3: Thirty years ago I started a technology fund at Bain Capital called Information Partners. And in fact I wrote a presentation for a conference in Aspen — it looks like caveman stuff now — but it was called Convergence, and I was saying that what's going to happen is the telephone and the television and computers are going to converge, so we'd have a different experience.
Speaker 2: Little did I...
Speaker 3: ...know that Apple would come out with the iPhone and all the rest of it. But I've been interested in tech, and been a tech investor and consultant for many, many years, and been through the Internet boom, the craziness of — you know, literally I'm standing in California in ninety-nine and these term sheets are coming in, and this is crazy. And now that's happening again with AI. But AI is going to have a huge impact on productivity. It's going to be great for the United States because I think it's going to be the next productivity wave. And I'm not one of those who is scared of it. To me, it is a super-sized slide rule.
Speaker 3: You know, when I was studying accounting, I used to use a slide rule and you'd look at the tables, and they said, oh, the computer's coming — the HP-12C.
Speaker 2: It's going to ruin our minds. It doesn't.
Speaker 3: It's a tool that's going to expand our capabilities. And AI is incredible.
Speaker 4: Michael Spence said the same thing.
Speaker 1: Nobel laureate — General Atlantic — he's working over there with Bill Ford. We talked to him about forty minutes ago. Is there anything about it that concerns you at all, whatsoever? Is there something — let's say you get some term sheets — I want you to draw a distinction between the companies you would work with and the companies you'd worry about, because you think they'd do more harm than good. How do you draw that distinction?
Speaker 2: Well, that's a great question. It's a tough one, you know.
Speaker 3: I would compare it to anything else — like nuclear power was great.
Speaker 3: Nuclear bombs can destroy the world, right, if you have bad actors. So artificial intelligence — if you have a bad actor behind it, it's going to be a problem. So what's going to have to happen is you're going to have regulation. You're going to have to have people watching the Internet, people punishing, and thinking about security. So again, I view it as a tool, but it's a tool that can be misused. So there's going to have to be a certain amount of regulation and a certain amount of more security, more watching what is happening, by the world.
Speaker 5: On the flip side, are there any companies that you're seeing — are there areas of artificial intelligence where you think companies are going to become the next Facebook or Apple?
Speaker 2: You know, there are two schools of thought.
Speaker 3: One school of thought would be that one company — maybe ChatGPT — is going to emerge and become the Google, you know, of artificial intelligence.
Speaker 3: Another school of thought is that these models will come down in price — and I'd say Liquid AI is one that can drive that price down — and you will get many large language models that are usable. And secondly, you'll focus on verticals, so there'll be artificial intelligence for medicine. It doesn't make sense for a medical artificial intelligence system to go out and read a Mozart symphony — it has no relevance to medicine. So I'm in the camp that believes there'll be a hybrid system. There'll be a few foundational models people will use — they'll be large, they'll be foundational — and then there'll probably be companies that can capture vertical markets, and so you'll have a number of them.
Speaker 2: It'll be a network effect.
Speaker 3: There'll be a company that captures medical, a company that captures industrial, et cetera.
Speaker 5: Before we let you go, you mentioned nineteen ninety-nine, and we can't...
Speaker 4: ...let that slide.
Speaker 5: If we're looking at another ninety-nine-style Internet-bubble boom in artificial intelligence, is there going to be some kind of bust, and how painful will it be?
Speaker 3: Yeah, there's always a bust after the boom, for sure. The good news is these companies are raising equity, so it's not debt, so it's not going to have a systemic effect.
Speaker 2: But there will be a bust for people who invested...
Speaker 3: ...in every single thing that had the word AI in front of it, and there'll be carnage, and then the companies will emerge — the best companies will emerge, and there'll be several of them. And I think, again, different from Google dominating search, there'll be companies that really dominate vertical markets. And by the way, if there is a disruptor — think about how AT&T used to dominate the phone business and then everything became fragmented. Google looks unassailable, but if there is a disruptor, it'll be an AI company that gets you a better search.
Speaker 2: So when you say, you know, I want to...
Speaker 3: ...know what the best pasta is, it won't give you five articles. It'll give you a recipe for the pasta, why it's the right pasta, and who's the person that invented the recipe. That would be a big disruptor for Google. That's why Google is putting so much money into AI — Microsoft as well. They want to protect their positions. So I think we're in a great era where there will be some disruptors coming in, and life may change.
Speaker 2: It won't be all Google. It will be an artificial intelligence company that gets us better data.
Speaker 1: You don't need to spend four hundred million to find out the answer to that. There is only one answer.
Speaker 4: It's not us. It's always not us, right? That is the correct answer. ChatGPT should know the answer.
Speaker 5: Your grandmother made the perfect pasta.
Speaker 4: Don't even think about it — there is no challenge to that. Wasn't it really good? Steve will say the same thing about his grandmother too, and then we'll fight over it. Steve Pagliuca...
Speaker 3: But my grandmother used to roll it out on the table and, yeah, make the individual pasta every Sunday.
Speaker 2: Oh my gosh, I didn't know. I grew up in a restaurant. Can I come over?
Speaker 1: She used to make the sauces at the back of the apartment building — they'd have these big sort of oil canisters — and make the tomato sauce. Just amazing.
Speaker 3: She used to wake up at six in the morning and make fried pizza dough for me when I was a kid, for breakfast.
Speaker 2: Every Saturday and Sunday.
Speaker 4: You're taking me back.
Speaker 2: It was a good life.
Speaker 5: Yeah, I didn't have that.
Speaker 3: That was good.
Speaker 2: Honestly.
Speaker 1: The food there — that was like a three-Michelin-star restaurant. And like you say, Steve, I had no idea it was that good at the time.
Speaker 4: You know, you try and find that elsewhere — you cannot find it.
Speaker 2: We have to get you one of these. Look at the goose coat on him — is that on the company, Bain Capital? Yes, it's an Orbis coat — the Orbis coat of Davos.
Speaker 4: Okay, this sounds