Speaker 1: Bloomberg Audio Studios, podcasts, radio news.

Speaker 2: One of the biggest bulls on this stock for quite a while on the Street, ARK Invest CEO Cathie Wood. Cathie, thanks for making time for us this morning. Let's just talk about what Dan Ives said to us yesterday: that he believes Elon Musk is crossing a line and that the board needs to step in. Do you share that view?

Speaker 1: Well, I have to tell you, we've been dealing with controversy around Elon Musk in one form or another since we first bought the stock when our company was founded in twenty fourteen, and we own Tesla in... it's one of the top holdings in three of our ETFs: ARKK, ARKW, and ARKQ. So we are watching this like a hawk, no question about it. But with the experience of the last eleven years, we turn around today and see Tesla, really not an EV manufacturer anymore, moving into the robotaxi age. We believe successfully, and we believe it will scale much better than most of its competitors.
We see SpaceX, really, with ninety percent of all of the satellites out there; Neuralink transforming the lives of paralyzed people, people with ALS; and probably most surprising of all, xAI. Now, we own all of those again in our venture fund, following them very carefully. xAI is, on some benchmarks... it hit a point that three Pro hit in June; it hit it in February. So, you know, we are very focused on barriers to entry, technology moats, and we believe in the moats that Elon has built. And obviously this is not just Elon; he's attracting the best and the brightest to help solve some of the world's biggest problems. So again, we do trust the board and the board's instincts here, and we stay out of politics.

Speaker 2: Well, we'd love your opinion on the current situation just on Tesla specifically. You mentioned some phenomenal companies doing some incredible things. For Tesla, though, do you believe that Elon can pursue his political ambitions while at the same time pursuing the best interests of Tesla shareholders?

Speaker 1: One of the announcements Elon made recently is that he is going to oversee sales in the US and in Europe.
And when he puts his mind on something, he usually gets the job done. So I think he's much less distracted now than he was, let's say, in the White House twenty-four/seven.

Speaker 3: At what point do you see sort of the political landscape shifting, though, not just for Tesla, but for the haves and the have-nots in some of the big tech space? I know you've had a complicated relationship, say, with Apple, which seems uniquely pegged by some of these tariffs. Is there anything that you could see that would make you like that stock again, or do you think that really it is going to fall out of the Magnificent Seven?

Speaker 1: Yes, we've been watching Apple for a long time with an AI lens, and that started, I'm going to say, about seven years ago, when it was becoming serious about autonomous vehicles. If you think about the ultimate mobile device, it's an autonomous vehicle, and that should have been Apple's to win. And what we've seen there is one turnover of management teams after another. And it's all... autonomous driving
is an AI project, just the largest AI project on Earth, we believe. And so it is losing the talent that it has, and as I understand it, it lost another one today to Mark Zuckerberg's top fifty. So they've had a lot of trouble in this regard, and I think the burden of proof is on them.

Speaker 3: Do you think that it is time to follow the talent, that this gamble that you're seeing over at Meta is going to work, and that that's the path of travel for the other big tech companies: that they're going to just eat some of these companies from the inside out, take the talent, not necessarily buy the company?

Speaker 1: You know, it's a very good question. We're trying to figure out if what Mark Zuckerberg is doing today is much like what he did when he was pivoting hard to the metaverse, because he thought that was the next big thing, and that was incorrect. AI is the next big thing, no question about it, and you do have to have the right DNA, and you have to move fast and break things, as he would say.
So, you know, it'll be interesting to see if he's able to turn his open-source strategy, and we've admired that quite a bit in terms of generative AI, into a leader again. Because, as we were observing it, Meta's Llama 3 at the time was improving at a faster rate than some of OpenAI's models, and that was true of open source generally, and then that has stopped. So, yes, he had to do something. Is this the right thing? Why did that stop?

Speaker 2: I don't know, Cathie. Next time you're with us, we'll have a longer conversation. Two leaders, incredible men doing incredible things. ARK Invest's Cathie Wood.