Speaker 1: Bloomberg Audio Studios: podcasts, radio, news. Now to that special conversation we mentioned, at Bloomberg Invest: Gary Cohn, Vice Chairman of IBM, and Bloomberg's Matt Miller.

Speaker 2: Let's watch and listen live. Yeah, we have it right now.

Speaker 1: During that period of time, you had to pre-program a huge database with all of the answers, because we didn't have the real-time connectivity to the data around the world. Today, if you were going to play Jeopardy, you would do it completely differently. You would just allow the AI machine to go get the answer instantaneously. Forty years ago, you had to sort of put the answers in the room and hope that Jeopardy asked the questions in the categories you wanted. It was easier to program all the chess moves. Today, AI has evolved dramatically, and I think we're all living through this, I would say, evolution. I'm not sure it's a revolution; it's an evolution of AI. We've gone from the concept, and if you look at sort of CEO surveys or corporate surveys a year or two ago, I think about ninety percent of CEOs said, clearly, we're going to be involved in using AI. And of those ninety percent, less than ten percent could actually tell you what they were going to use it for. They just felt like they needed to talk about using it. Today, I think the number of CEOs that are using AI continues to grow, but more importantly, they actually know what they're going to use it for, and the definitional outcomes of where AI can enhance your business are becoming clear. That's where they are today. I think it's going to continue to evolve to where you can see where to use AI in your business, where the paybacks are, and how you create real productivity growth.
Speaker 1: I won't geek out on productivity growth, but at the end of the day, if you want to grow GDP and you want to grow your economy, it is all about getting more productivity, getting more units of work out of your labor force.

Speaker 3: So I think about this a lot in a competitive sense. If you're a big business and you haven't figured out how you're going to use AI yet, are you too late? Have you lost out?

Speaker 1: You're never too late. You're never too late. What you may be is at a competitive disadvantage to your nearest competitor, because they may be able to do certain things more efficiently than you. But you can catch up quickly, so you're not that far behind. So you have to sort of figure out where the opportunities are in your business, and it depends on the size, scope, and scale of your business. A very obvious place, in a bigger company, and this is something we did at IBM, and we've talked about it: we like being client zero for all of our products. We did something relatively simple. It sounds simple, but it was not that simple: an HR chatbot. Think of what most people use HR for after the day-one hiring experience. What they want is, you know, they go buy an apartment, they go rent an apartment, they need a reference letter. What day did I start working? How long have I worked here? What was my last year's compensation? What was my bonus? And you want that sent, like, that minute. A year or two ago, you used to call someone in HR, give them the information, tell them where you wanted it sent, and hope and pray they sent it out quickly. Today at IBM, you type it into an HR chatbot, you hit send, and in a minute the letter goes out. So people are actually very happy: oh my God, I get what I want.
Speaker 1: I get all this information. You have information about your benefits and information about your healthcare. We do it all in real time. We cut six hundred people out of our HR department. Now, we didn't fire them; we redeployed them to places where they can be more useful. And our satisfaction scores for our employees using HR have gone up dramatically, because they get real-time answers. So it depends what you're trying to achieve in your company. There are a lot of people using AI today in code assist, in writing new code. So if you're in a big code-writing shop, or writing new code is important to you, and your competitors are using a code-assist program and you're not, you're at a competitive disadvantage. Take a lot of people here in the financial services industry. Most banks, and I happen to know what banks do, most banks historically built their technology infrastructure on a sort of as-needed basis, and if you go pull back the covers, they've got lots of different systems in a variety of different languages, and when you try to integrate them and have them talk to each other, it's very difficult. So every bank during the course of the last decade or two tried to think: how do I get all of my software, all of my systems, onto a common language? It was an arduous task that was cost-prohibitive. Today it's not cost-prohibitive. Today you can get a code-assist bot to basically port one language over to another language, and it's pretty flawless in how it happens. So there are a lot of places evolving today where AI is very useful, and it depends on, I think, your circumstances.
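To make the porting idea concrete, here is a minimal sketch of how a code-assist bot might translate a legacy codebase one file at a time. The `complete()` function is a stand-in for whatever model API you actually use, and the COBOL-to-Java pairing, the prompt, and the file layout are illustrative assumptions, not a description of IBM's tooling.

```python
from pathlib import Path

def complete(prompt: str) -> str:
    """Stand-in for a call to your code-assist model's completion API."""
    raise NotImplementedError("wire this up to your model provider")

PROMPT = (
    "Translate the following COBOL routine into idiomatic Java.\n"
    "Preserve behavior exactly and keep names recognizable.\n\n{source}"
)

def port_file(src: Path, out_dir: Path) -> Path:
    """Translate one legacy source file and write the result for human review."""
    translated = complete(PROMPT.format(source=src.read_text()))
    dest = out_dir / (src.stem + ".java")
    dest.write_text(translated)
    return dest

def port_tree(src_dir: Path, out_dir: Path) -> list[Path]:
    # Port file by file, so each translation stays small enough to review
    # and to re-run the bank's existing test suite against.
    out_dir.mkdir(parents=True, exist_ok=True)
    return [port_file(p, out_dir) for p in sorted(src_dir.glob("*.cbl"))]
```

The "pretty flawless" part still gets verified the old-fashioned way: running the original and ported systems against the same test cases before anything is switched over.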
Speaker 3: I want to ask you about the cost, because I think the reason it was so daunting in my mind, especially for an information technology company, is I was looking at OpenAI and thinking, you've got to invest a hundred billion dollars to create a large language model, and only a few of these giant hyperscalers in Silicon Valley can do it. Now, obviously, DeepSeek, although there are questions about how exactly they did it and how much it cost, shows that you can do it for far less.

Speaker 1: Well, thank the Chinese and DeepSeek for proving out what a lot of us knew in this country but no one wanted to listen to. If you look at what we've been doing at IBM, what Meta has been doing, what a lot of companies here in the United States have been doing: we've been building small models. What we have found, and DeepSeek sort of proved this out, is that the small models are much more efficient. They compute faster, they take less power, but they're not a model that can do everything; they're sort of targeted models. So for the things that I was talking to you about, we have small models that are used for specific, targeted goals. I think that is the future. We think that is the future of AI. We think most companies will engage with as many small models as they need to get done what they need to get done where they're using AI, and it will be vastly more efficient, vastly more cost-effective for them. And I think this is the future of where we're going. Look, will there be some large language models out there that are effective? Yes, but they will sort of be the catch-all models for everything.
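One way to picture that "many small models plus a catch-all" pattern is a thin router: targeted models handle the tasks they were built for, and everything else falls through to a general model. This is an illustrative sketch; the model names and the `run_model()` dispatch are hypothetical, not a description of IBM's stack.

```python
# Hypothetical registry mapping a task label to a small model tuned for it.
SMALL_MODELS = {
    "hr_question": "hr-assistant-3b",
    "code_port": "code-translate-8b",
    "claims_coding": "medical-codes-3b",
}
CATCH_ALL = "general-llm-70b"

def run_model(model: str, request: str) -> str:
    """Stand-in for dispatching a request to a hosted model."""
    return f"[{model}] would answer: {request!r}"

def route(task: str, request: str) -> str:
    # Use the targeted model when one exists: faster, cheaper, less power.
    # Fall back to the big catch-all model for everything else.
    model = SMALL_MODELS.get(task, CATCH_ALL)
    return run_model(model, request)

print(route("hr_question", "What day did I start working?"))
print(route("tax_advice", "Summarize the new tariff rules."))  # falls through
```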
Speaker 3: It still costs a lot of money to invest, though. I think Moody's estimated it's going to be two billion dollars in investment over the next five years, which doesn't seem that far off when I look at the capex plans of these Mag Seven companies. You and I both lived through the internet bubble and watched so many companies put money into laying down fiber cable and building the infrastructure, the data centers, of Web one point zero, a lot of capex, when they weren't getting a lot of return on investment, at least not for years, if they ever did. Do you see real parallels here?

Speaker 1: Yeah, yes. Look, as you're building technological infrastructure and backbone, you're building it based on what's available today and what you know today. The whole AI backbone infrastructure, the compute technology, is moving very fast. But you can't wait for version two point zero or version three point zero or version four point zero; you have to build based on what's available today. What's available today will be obsolete, a lot of it, before it gets built, and it will get replaced by newer and newer versions. So there is a bunch of money being spent today that will have very low returns to no returns. But if you don't start investing, you'll never start. So you sort of have this prisoner's dilemma: I have to start building this data center, and I think I'm going to be putting XYZ chip in, but by the time it gets built in two years, I may not be, most likely will not be, putting that chip in. And the amount of power I think I need to run that data center may be dramatically less, because the next chip, I bet, is going to be a lot more efficient than the chip we have today. So with a lot of this build, you can't wait for the efficient frontier, because someone's always going to be out there. You have to keep building, and there will be a lot of waste to get there.
Speaker 1: I just think it's inevitable. And then these AI models themselves are going to build upon themselves, and they're going to get smarter and smarter, and they're going to optimize and re-optimize, and everything's going to get better and faster and quicker, and ultimately the AI of the data center is going to tell you how to build a better data center, how to build better connectivity, how to optimize the optimization.

Speaker 3: What do you do, though, as an investor? I mean, an IBM or a Microsoft, as you point out, they don't have any choice; they've got to build now. But what do you do as a private investor, or as someone who's looking at markets? How do you analyze that?

Speaker 1: Well, it's interesting if you look at the history, and the history of technology is not that old. If you look at sort of what's going on here, ultimately the bigger companies are going to have a real big impact in this. And I'll point out that all the bigger companies we talk about today, whether it be the Metas or the Googles: I remember when Facebook started; it was a really small company. And I'm not that old. I'm old enough, but I'm not that old. So all these big companies, I remind people, started out as really small companies. So there are a bunch of small companies out there that are going to be really big companies, unless, along that path, one of these bigger companies buys them out. We've now got companies that can go spend a hundred million dollars and buy out a company, you know, relatively easily today.
Speaker 1: So if you were looking at what you think is the most likely path, I think the most likely path is the big, I don't know if it's the Mag Seven, but the bigger companies who are spending the most in capex are the most likely winners. And many of them will fund some of these startups. Almost all of the big companies have huge venture arms, because they're trying to put money into these startups. They're trying to get opportunities to let creative people create, and then, as those products become really valuable, they're just going to buy them and merge them into the bigger companies. So I guess my investment advice is: you can own some of these small companies, and most are private, and it's probably interesting to own a portfolio of them. But if you ask me where I think this is going to be a decade from now, I think the bigger companies are going to dominate the space.

Speaker 3: Is that part of what you're doing at IBM? I mean, do you help Arvind Krishna maybe zero in on some targets?

Speaker 1: We definitely have a ventures portfolio. We're always looking for something that's on the cutting edge that fits within our parameters: hybrid cloud, AI, quantum. And we're very much in open source, so we're going to buy an open-source company if it fits within our parameters and we think it's got an interesting opportunity. It's in our shareholders' best interest for us to get involved in a company early, not late.

Speaker 3: By the way, a little bit detached from the specific AI conversation: what do you think about the M&A landscape right now?
Speaker 3: When we came into this year, me and Sonali and Katie were joking about an opinion writer who said maybe there won't be that many deals, because everybody said this is going to be the year, you know, that the regulators drop their guard and everybody can get together. Now it's starting to look like that may not be the case.

Speaker 1: Well, that was the total market euphoria from the November election to January twentieth: deregulation, ease of doing business. The FTC, the CFTC, the SEC, every C is going to get easier and be much more pro-business, and we're going to have this huge influx, this huge, massive explosion of deals: M&A deals, IPO deals. Look, we're forty-five or forty-six days into this. I don't want to judge this book by its cover, but in the first forty-five days we've not had an explosion of deals. In fact, based on what's going on in markets right now, in market multiples, we've probably lost deals. So I would say the calendar from January twentieth to today is smaller than it was. That's not a regulatory statement; that's a valuation statement. And you also have to be reminded that some of these commissioners are not in place, heads of the SEC. Even the FTC doesn't have a full Republican slate; it's got two Republicans and two Democrats, so it's hard to get things through on two-to-two. These things take more time. So I think some of that euphoria is still there, but I think more people are taking the approach of: okay, you're going to have to show me. It's a different environment than just trust me. I think we've gone from trust me to show me.

Speaker 3: I want to get back to regulation in a minute as it pertains to AI. But you mentioned productivity; you don't want to geek out on it. We haven't seen much of an impact yet from AI on productivity. I actually have a chatbot now open next to my Bloomberg screens every morning, so it makes me more productive.
Speaker 3: I'm actually getting a lot more detail into my newscast than I would otherwise. I use it as more of a glorified search. But we haven't really seen it boost the numbers economically, and I wonder when you think we're going to see that, and whether it's going to actually have an effect on jobs. You mentioned that you've moved HR people out because you can replace them in that role with AI. Are we going to see an uptick, or a loss, in non-farm payrolls because of it?

Speaker 1: So we've also created, you know, artificial workers. We can now create artificial workers in our consulting business, artificial workers who will work twenty-four-hour days, which is kind of nice.

Speaker 3: Just like DOGE.

Speaker 1: Yeah. I don't know if I'm going to make that, you made that comparison; I will not own that comparison. So let's talk about your question. I think your question is: if AI is going to be successful, are we going to see a real impact to labor and employment numbers around the world, let alone in the United States? So my pretty bold answer is, and I always say this to rooms of people like this, where usually the first question I get is, isn't this devastating for the economy? I say, look, I'll give you the rest of the time I'm up here, which is very short at this point, so it's not a lot of time, to come up with a major technological advancement that destroyed jobs, and you can go back and take any one you want. So let's just take the internal combustion engine. That was supposed to destroy jobs, because all those people who cleaned stalls and put shoes on horses were going to lose their jobs.
Speaker 1: Well, all of a sudden, the next day, when we started putting internal combustion engines on the streets, we needed mechanics, we needed gas stations, we needed different skill sets, different people. And all of a sudden there were more jobs created, because people could go fifty miles a day or a hundred miles a day versus two miles a day, so their landscape, their business model, expanded. It's no different than the internet. I worked at a bank before we had email. Every floor had a person who printed the memos and put them in your mailbox and told you what time the meeting was or what time to go. What happened to that person who delivered the mail on the floor and printed the memos? They sort of disappeared overnight. Now, we didn't have less people. In fact, every bank, I guarantee you, has doubled in size since the invention of email. Why? Because we got more productive. We got email coverage, so we could cover more clients. We could cover more people around the world. We could communicate quicker, we could communicate faster. I think AI is the next leg in this evolution of making us more productive. So look, will there be a change in skill sets? Yes. Will people do jobs that they hate? No. The machine can input data. The machine can take codes out of an operating room in a hospital and input them into the insurance reimbursement system. By the way, that job in a couple of hospitals has, like, forty or fifty percent turnover rates, because people hate doing it; they shouldn't do it. Those people in that hospital can now do a job where they work on recoveries and the hospital gets paid back. So I think you will see this natural evolution of people out of jobs that they don't like, jobs with low satisfaction ratings, into jobs that have higher satisfaction rates, that are more productive, and that grow the economy.
Speaker 1: That's the history of every other technological advancement we've seen. It's not overnight; that's why I call this an evolution, not a revolution. And all of this, by the way, we probably won't have time, but AI, I believe, is just a stepping stone to the real outcome, which is...

Speaker 3: Quantum. Which is, I'm guessing you also think, ten to fifteen years off?

Speaker 1: No. Sooner.

Speaker 3: Sooner? How soon?

Speaker 1: Five years. The strides being made in quantum right now are large. I mean, there's a lot more. Look, again, there's a lot of private capital going into quantum, big companies and small companies. It's all about error correction: how quickly you can correct errors, how quickly you can compute the data. It's pretty interesting right now. So, as everything in this technology space is speeding up, I'll say, you know, five years.

Speaker 3: All right, fascinating.
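As a toy illustration of why "it's all about error correction": the sketch below implements the classical repetition-code idea that quantum error-correcting codes generalize. Store one logical bit redundantly, let noise flip some copies, and recover the value by majority vote. Real quantum codes (surface codes and the like) are far more involved; this is only meant to make the term concrete, and the numbers are made up.

```python
import random

def encode(bit: int, copies: int = 5) -> list[int]:
    """Repetition code: store one logical bit as several physical copies."""
    return [bit] * copies

def add_noise(codeword: list[int], flip_prob: float) -> list[int]:
    # Each physical copy independently flips with probability flip_prob.
    return [b ^ (random.random() < flip_prob) for b in codeword]

def decode(codeword: list[int]) -> int:
    """Majority vote recovers the logical bit unless most copies flipped."""
    return int(sum(codeword) > len(codeword) / 2)

random.seed(0)
trials = 10_000
failures = sum(decode(add_noise(encode(1), flip_prob=0.10)) != 1 for _ in range(trials))
print(f"logical error rate: {failures / trials:.4f} (vs 0.10 per raw physical bit)")
```

With a ten percent physical error rate, five copies and a majority vote push the logical error rate below one percent; the race Cohn alludes to is doing the quantum analogue of this quickly enough, and with few enough extra qubits, to be practical.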
Speaker 3: I wanted to ask about regulation, because you mentioned AI is going to help us create better AI, and I wonder what else AI is going to decide to do for us. Do we need to regulate it? Do we need to have a heavy hand here, or is it okay to have a few South African billionaires just say, like, go?

Speaker 1: I don't know, I don't know what you're talking about. I'll go to ChatGPT and ask what that is. So look, the answer is yes. And I sit on a couple of these big regulatory boards around the world, so I'll do this relatively quickly. If it's a regulated activity today, it should be a regulated activity using AI. An example: if I'm going to call my doctor and tell my doctor that I've got a scratchy throat, a headache, a swelling of this and that, and he or she puts it into a machine, and they're going to diagnose my disease and prescribe me medicine, that's a highly regulated business today. The AI machine that does that, and the program that does it, should be highly regulated. On the flip side, if, when I'm working out tomorrow morning, Spotify chooses a bad song for me and I have to hit skip, which they do a lot, I'm not sure that needs to be a regulated activity. I can get through my workout in the morning listening to one bad song and skipping a bad song. So my view is: look, we've decided what businesses are regulated. We know what businesses should not be regulated. We should not use AI as a wedge to regulate more businesses. On the flip side, we shouldn't use AI to take businesses today that deserve regulation and take them out of regulation.

Speaker 3: All right. Well, we have one audience question here, on tariffs.

Speaker 1: It's amazing how you got another question on your card.

Speaker 3: Does this strategy make sense to you as a way to deter illegal immigration and fentanyl smuggling from Canada? No, seriously: we've seen, for a couple of days in a row, markets really come down. We've seen rates come down from four eighty to four twenty. President Trump has put twenty-five percent tariffs on our two closest trading partners after he negotiated possibly the greatest trade deal in history with them, right? Yeah, yes. Does it make sense to you? I mean, you were there.

Speaker 1: So look, I think what we have to understand, and I'm not saying I understand this, but I think what we have to try and figure out, is: what is the objective of the tariffs? And that, to me, is where a lot of us are struggling. Are the tariffs meant to make foreign products more expensive so the domestic product is cheaper? Okay, that's a good use of a tariff. The problem is, a lot of the things that we import we actually don't make here in the United States. So you can make the foreign product more expensive; it doesn't make the domestic product cheaper, because we don't have it.
Speaker 1: So that's a question I have. But if we put that tariff on, does that incentivize capex to build the capability to make that product? You could argue, okay, that's not a bad long-term outcome. Because, look, if we took nothing else away from COVID, we should have taken away that there are some strategic needs of this country that we are just too dependent on the rest of the world for, and we had better figure out how to be more self-sufficient on necessity goods and items. When it comes to buying personal protective equipment, we have to import it from China; we need sterile gloves and masks from China. That's probably something we should manufacture in the United States. If we're worried about toys and games, we can leave that to them; I don't think those are necessity items. So I think we have to figure out what the objective is. If the objective is to raise money for the government, then make that the objective. I'm not sure how you do that, and how you do that while weighing the effects of inflation on the other side; I think you have to think about that. So if we're just using it as a revenue raiser, it creates a higher cost of goods, and it's a really regressive way to raise revenue. Because, I hate to say it, but the reality of it is, poor people, or less highly paid people, consume one hundred percent of their paychecks, and they have to consume their paychecks. Wealthier people consume a very small percentage of their paychecks. So the things that are being tariffed are things that everyday people buy, and it becomes a really regressive tax system. I don't think we want a regressive tax system. I'm the first guy here that's going to argue for a progressive tax system, that the rich need to pay more in this country. They should pay more, and they do pay more today. I don't think we want to turn that around with a tariff.
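A toy calculation, with made-up numbers, of why a flat tariff acts like a regressive tax, assuming the full tariff passes through to the prices of the goods each household buys: the household that spends its whole paycheck on goods gives up a far larger share of its income than the one that spends a small fraction.

```python
def tariff_burden(income: float, spend_on_goods: float, tariff_rate: float) -> float:
    """Extra cost imposed by the tariff, expressed as a share of income."""
    return (spend_on_goods * tariff_rate) / income

# Hypothetical households: the lower earner consumes the whole paycheck,
# the higher earner consumes a small fraction of it (the rest is saved).
low = tariff_burden(income=40_000, spend_on_goods=40_000, tariff_rate=0.25)
high = tariff_burden(income=400_000, spend_on_goods=80_000, tariff_rate=0.25)

print(f"lower-income household:  {low:.1%} of income")   # 25.0%
print(f"higher-income household: {high:.1%} of income")  # 5.0%
```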
Speaker 1: So to me, the question is: what is the objective here? There may be a real, robust, bona fide reason to have tariffs. I just don't know what we're trying to achieve.

Speaker 3: And you would know, I feel like. Anyway, that's a thoughtful answer, and I appreciate your time. Thank you.

Speaker 2: Thank you. Thanks so much.