Speaker 1: Bloomberg Audio Studios, podcasts, radio news.
Speaker 2: Well, let's get back to the transatlantic tech rout that we saw yesterday, one trillion dollars' worth, of course, over DeepSeek's dark-horse entry into the AI race. The question: is it a turning point for the industry and its investors? Joining us now to discuss is one of the most prominent voices in this sphere, Cathie Wood, founder, CEO and CIO of ARK Invest. Lovely to have you with us, Cathie.
Speaker 3: Thank you, Lucy. Happy to be here.
Speaker 2: Big question: are you buying the dip?
Speaker 3: Well, we're fully invested, and we're very comfortable being fully invested. I think our confidence in the bull market broadening out has increased in the last few days. So sure, the megacaps, we think, will continue to do well. Maybe Nvidia, there are some questions about its exalted position in the training-chip market. We think inference is going to become more important with these new reasoning models, and that is a more competitive part of the market. But we think that what's happening here is the collapse in the cost of innovation.
Speaker 4: DeepSeek is adding to it.
Speaker 3: It has been happening, though. AI training costs have been dropping, led by Nvidia, seventy-five percent per year, and AI inference costs have been dropping eighty-five to ninety percent per year. DeepSeek is just putting those very rapid declines into a bit of overdrive here.
Speaker 1: So I wonder, if you see Nvidia falling much further, is this an attractive price to be buying it, do you think?
Speaker 3: Well, we own it in some of our portfolios.
Speaker 4: We are not buying the dip yet.
Speaker 3: We do want to learn more about DeepSeek, and more about how the market, or the demand for inference chips, might outpace that for training chips.
Speaker 4: We think the whole area is going to be vibrant.
Speaker 3: We just think there might be a little bit more of an adjustment to this new reality.
Speaker 2: When you say that what DeepSeek shows is, kind of, that you can do more with less capex, who do you think that's going to motivate in the US most? As Donald Trump is kind of saying, who's going to be best at doing more with less?
Speaker 4: Well, anyone using AI.
Speaker 3: We are focused on the productivity gains for knowledge workers, which are going to be massive. That's the biggest impact of AI. But if we're looking at sectors and spaces generally where this acceleration in innovation is going to be meaningful, we think autonomous mobility, so robotaxis. We think that's going to scale from essentially nothing now to an eight-to-ten-trillion-dollar global opportunity, including China. But the sleeper here is healthcare, and we're seeing the convergence of sequencing technologies, all kinds of sequencing technologies, artificial intelligence, and then gene-editing technologies like CRISPR-Cas9. That combination, we believe, is going to cure disease. It's already curing disease: sickle cell disease and beta thalassemia. That's CRISPR Therapeutics. But we think they're aiming now for type one and type two diabetes. That would be a category killer, you know. So think about that: curing disease, AI helping us decode the secrets of life, death, health.
Speaker 1: Those are two areas where regulation plays a big part. You've spoken about your optimism about deregulation under Donald Trump.
Speaker 1: Do you think those are two areas where we should expect to see big deregulation around what technology can get into?
Speaker 4: Yes.
Speaker 3: I think in both, safety first, and I think anyone moving into this market, or these markets, would agree with that. So, sensible regulation. But in the US, the robotaxi field is regulated by fifty states. We think that will change to one regulator, the federal government. After all, transportation does cross states, so that makes sense. In the healthcare realm, that is where the thicket of regulations has really strangled the industry. And even more than the outright regulations, it was the FTC, the Federal Trade Commission, not allowing mergers and acquisitions, and therefore not allowing strategic buyers, big biotech companies, not allowing them to buy the smaller companies, so that we had price discovery. Now we're going to see price discovery. How much are these companies that are curing disease worth to these large strategic buyers? Now, in saying that, I don't necessarily want our companies to be taken out. We think they have miles to go, but we do want price discovery back in the market.
Speaker 2: Well, you talk about your excitement about autonomous vehicles. Of course, your biggest holding is Tesla, I think, and we've got Elon Musk's first earnings call tomorrow since Trump's return to the White House in January. Typically a call that's a look ahead. What do you need to hear from Musk?
Speaker 3: We need to hear a continuation of "we're about to launch our autonomous driving system." I mean, in effect, they've launched it. I have full self-driving. I know it's not allowed here in the UK or in Europe, but...
Speaker 1: You feel safe?
Speaker 3: Oh yes, yes, yes, yes. Waymo really has broken through a lot of barriers. I feel, even though a Waymo vehicle is not safer than a human-driven vehicle, it knows its roads. They're narrow roads, meaning there are narrow territories. I think Tesla is going to go national, and it will be able to do so because it has effectively seven million robots roaming the roads right now. I have two of them, a Model Y and a Model 3, and they're learning all about Connecticut's and Florida's roads.
So it has a huge competitive advantage in terms of proprietary data that nobody else has about the roads, not only in the US but in many places around the world.
Speaker 1: Do you worry that Elon Musk is taking on too much? He's got his new job in the US government, he's got, you know, sharp developments coming in Tesla, lots going on with those other businesses as well. Is he spreading himself too thin?
Speaker 3: If you look at the way Elon Musk behaves as a CEO, he's a sharpshooter. He looks for pain points and he solves those, right? For example, in twenty eighteen, where was the biggest pain point? It was manufacturing the Model 3, scaling the Model 3. He famously slept on the factory floor. He's not doing that anymore. He's not on the factory floor now. It's all about autonomous. He is the first CEO, we would say, who really understands the convergence among technologies taking place right now, really catalyzed by artificial intelligence, in each one of his businesses. And he also understands how critically important proprietary data is.
And think about this: each of those companies is generating proprietary data that no one else has.
Speaker 2: Well, maybe we should add to his plate. Should he buy US TikTok?
Speaker 4: Yes?
Speaker 3: I mean, I know there have been rumors about that, and about the government sharing half, and that's kind of anathema to the United States, you know, the government getting involved like this. I'd be surprised if that happens. But I will also say the Trump administration is full of surprises.
Speaker 1: It certainly is, and we're only a week into it. You're here with us in London, and we're delighted to see you. What has you in London? What opportunities are you excited about when you come to the UK? Is this a good place to put your money?
Speaker 3: Well, we have launched three funds. Our Europe team has launched three funds here. One is our flagship that is focused on all five innovation platforms: robotics, energy storage, artificial intelligence, multiomic sequencing, and blockchain technology.
So that's the flagship, ARKK, a special one for Europe which we do not have in the US because of overlaps with other funds. But we have an AI and robotics fund, and that is ARKI. We think this convergence of robotics and AI is going to lead to humanoid robots, and, you know, not just millions of them but perhaps billions of them longer term, which is going to be a key productivity driver both in the home (you know, right now we're not paid for house cleaning; let's get a robot, let's pay a robot to do it) and in manufacturing plants around the world.
Speaker 1: Cathie, the Prime Minister was in the building earlier, Keir Starmer. You're exactly the sort of person that he wants to meet, because he wants investment going into those sorts of businesses in this country. Have you spoken to him? What sort of advice would you be giving him if he's trying to grow the economy?
Speaker 3: Well, half the solution is understanding the problem. And just making that statement suggests to me that he's saying, why hasn't this happened?
And I also, over the weekend, heard one of the finance ministers, or maybe the Finance Minister, say the same thing. This is a mantra, and I think it has been catalyzed by artificial intelligence as well. If you think about the UK, you produced two of the most important AI companies in the world: DeepMind, which Alphabet now owns, and Arm, which SoftBank primarily owns. By all rights, you should be developing a deep venture capital pool here, feeding startups and nourishing them, and deepening your listed equity markets. And so I think that's what they're going to do, which is fantastic, fantastic for the UK.
Speaker 2: Absolutely. They talk very ambitiously, the UK leadership, but so too does the European governments' leadership. And Donald Trump, as you say, is very passionate about deregulation. So yes, everybody's on the same page. But it is a race. Is the UK keeping up enough with the EU and the US when it comes to deregulation for you to be really confident, Cathie, putting your money here?
Speaker 3: I think that Europe is more tied up in regulatory knots than the UK is.
And in fact, some of our companies, Palantir for example, have been very vocal, and we think that's one of the most important AI platform companies out there. Alex Karp, the CEO, has said, I am pulling employees out of Europe because, you know, we're running into all of these obstacles that could really harm us, you know, being hit by a four percent fine, four percent of revenue, and that's global revenue, or seven percent of revenue, I heard recently.
Speaker 4: That's crazy.
Speaker 3: So I think Europe is going to be held back until it gets its regulatory act together. I think the UK is more progressive from a regulatory point of view, and as I said, I really believe that's half the solution: the fact that your Prime Minister and your Finance Minister in the same week are saying, you know, we've got to figure this out. I think that's a very good thing for the UK.