Speaker 1 [00:00:00]: What's the... what's the top three long-term holdings? I love this question and hate it at the same time. I'm not gonna give you the typical tech answer, and I'm not gonna say Apple, Microsoft. If I have to start today and hold for five years, if I have to pick three mega caps: one would be Amgen, two would be Lilly because of the tech exposure, and three would be Google. Those provide you almost little to no drawdown regardless of what the economy is going to do. Amgen and Lilly are going to print money, but especially Amgen. Amgen's like... I can't say a baby Lilly, because they're in different vocations, but the things that I loved about Lilly, they have the same thing.

Speaker 2 [00:00:48]: They're just not as great.

Speaker 1 [00:00:50]: Like, it's like Jordan and Kobe. If I can put Jordan and Kobe on the same team, I'm good going to war with anybody, at any age.

Speaker 2 [00:01:01]: Lilly and Google.

Speaker 1 [00:01:03]: So those are your top... those are your top three if I have to start new today, today, for five years. Yeah, those are the three. Over Nvidia, over Microsoft.

Speaker 3 [00:01:13]: Who you got?

Speaker 1 [00:01:13]: Before, I said... who you got? Nvidia, Microsoft, TSM. Rashad said XRP. Sure. Well, Bitcoin, if we're not talking just... but if it's just stocks? Yeah, just stocks, just stocks.

Speaker 4 [00:01:34]: If it's just stocks, Nvidia is one. Google would be two, mm-hm. It's just for the next five years, you're just... next five. I don't want to invest in it, but I think... I don't... obviously TSM would be there, because that's always my...

Speaker 2 [00:01:55]: Number one. That's what I've said.

Speaker 3 [00:01:59]: I think, I think that over the next five years, Tesla's going to have a hell of a run.

Speaker 1 [00:02:06]: Even if he leaves? Why would he leave? He just got the billion... he just got the trillion-dollar... yeah, he hits certain...

Speaker 3 [00:02:15]: I think those are the metrics that he's going to meet, the metrics...
Speaker 2 [00:02:18]: He can meet the metrics and leave; to the company it don't really matter.

Speaker 4 [00:02:21]: But he has to do it within a certain timeframe, or he doesn't get the trillion. I think it's about... he has to do it in like eight or nine years, so that falls within the range. I think the autonomous, the robotics, what he's now doing in terms of, like, technology and space, he'll figure out a way to incorporate that. Tesla is just such an interesting proposition for the next five years that, outside of the ones you guys said, I think the return on that would be pretty, pretty interesting, high growth. I wouldn't want to do it, but I think it'll have a hell of a return in the next five years.

Speaker 1 [00:02:53]: And sometimes the person in the space will tell you where the economy is going. When he said, "We're not a car company, I'm going toward robotics," straight out, it tells you the value of software. Damn.

Speaker 2 [00:03:08]: I don't like a lot of his political aspersions.

Speaker 1 [00:03:10]: I don't like how his daddy acted, like how Daddy raised him. What Elon has done well, though: he knows how to pick a category that's going to dominate for fifteen years.

Speaker 3 [00:03:20]: So that's the line, bro. And that's why I thought it.

Speaker 4 [00:03:22]: I'm like, he's telling you straight up, like, yo, this isn't about cars, this isn't about us making cars. In fact, with the whole sector, we're done making that. It was nice, it made sense, but we're done with that. Now this is where we're going.

Speaker 1 [00:03:36]: But even that's problematic. Hit the like button and share; we almost had seven thousand. Even that's problematic to me, because, I mean, they're missing numbers on the cars, and now you shift into an industry once again where it's a guess. Some people can raise money and be off of it. It goes back to the metaverse speculation. They're saying that there's gonna be one hundred million humanoid robots. That's what they're saying. He said that.
They also said that real estate in the metaverse was going to be the new frontier, and Snoop Dogg... somebody bought, like, a million dollars' worth for the house next to Snoop.

Speaker 4 [00:04:18]: That's the thing, because people was killing Zuck for it; they didn't want to go on that ride with him.

Speaker 1 [00:04:24]: Well, a lot of people did, though. A lot of people poured into the metaverse. It wasn't just Zuckerberg; a lot of people got into the metaverse heavy. So, I mean, I don't know, I just feel like you're basing your whole company off of a thesis of what could potentially happen. But what's currently happening? He's disappointing on the cars, and he's got himself in a lot of political turmoil. When the Democrats take over, they're not just going to forget everything that he did. There's gonna have to be some level of retribution and payback for this.

Speaker 4 [00:04:49]: Well, he's still got three years, and that's the crazy part.

Speaker 3 [00:04:53]: So if it's in the next five, he's still got three.

Speaker 1 [00:04:56]: Look at xAI, which is X merging with his other company; it's already done. SpaceX... that will tell you. If I got an asset, and Sam allegedly stole the baby that I helped create and helped get funded, took employees, and you do your Super Bowl commercial, and I make a competitive asset and I move it over to SpaceX... it's interesting he didn't move it over to Tesla. No, exactly. SpaceX... that's the one to me, because I think he's gonna do the same with Starlink: put xAI, SpaceX, and Starlink in the one company that most people cannot touch. My company. Damn, untouchable.

Speaker 4 [00:05:40]: I'm not gonna comment today, but I will comment next week, and this will make sense when I, when I do it.

Speaker 1 [00:05:46]: Got you. SpaceX. But at some point we do have to have that real conversation, though. There's luxuries that some people are afforded that other entrepreneurs are not.

Speaker 2 [00:05:57]: The metaverse.
Speaker 1 [00:05:58]: I love... shout to everybody at HQ at the Hudson Building. But if I delivered that metaverse and those were the losses on it, you're not... there is no second chance. You're not even getting the first chance. The idea is not getting you a first chance.

Speaker 4 [00:06:13]: Mm-hm.

Speaker 1 [00:06:14]: And we're gonna talk about access to capital later in the episode. But some entrepreneurs are afforded to lose hundreds of billions of dollars, mm-hm, and then they get rewarded with hundreds of billions of dollars. Other entrepreneurs have no margin of error at all. You lose a thousand dollars, and your life is over.

Speaker 2 [00:06:34]: Mm-hm.

Speaker 1 [00:06:37]: So, Elon... people like him on Wall Street. But what is he? You really think that humanoid robots are going to take over the way that... like, a hundred million humanoid robots? You don't think that's gonna happen?

Speaker 3 [00:06:50]: Take over in what sense?

Speaker 1 [00:06:51]: One hundred million. I think he said three hundred something.

Speaker 3 [00:06:53]: I don't know. I don't think...

Speaker 1 [00:06:55]: I don't think...

Speaker 3 [00:06:55]: I don't think we're in Terminator.

Speaker 4 [00:06:57]: I think, obviously, in factories, in the workforce, does it change? Yeah. Will everybody have it in their homes? I don't think we're there in the next five years. I think you'll start to see the signs of how it improves some... like I said, the workforce, maybe some forms of life, like some tasks in life.

Speaker 3 [00:07:17]: But can anybody afford it?

Speaker 4 [00:07:19]: Right. The first iteration of anything is usually super expensive, and there's only a limited number of people that can afford it at that price point, until they figure out a model that makes sense that the everyday person can have. We're not there yet. But you could... that's where they're headed. You can see all systems are pointing that way. Although maybe we should stop talking about...

Speaker 1 [00:07:42]: Where are you?

Speaker 2 [00:07:44]: Can you guys hear me?

Speaker 3 [00:07:45]: Trying to get the boy out yet?
You know what, I'm sorry, my bad.

Speaker 1 [00:07:54]: I think there'd be one hundred... one hundred million humanoid robots from one company? The probability... even if you just look at the delivery history of what he said about FSD. But like you said, some creators and founders are allowed to paint a lofty picture that's bigger than the total addressable market.

Speaker 3 [00:08:11]: Yeah, mm-hm.

Speaker 1 [00:08:13]: Like I said, if I would have pitched the metaverse and it fell, I would have been assassinated for it. Same with the stuff that he gets to say. And, man, they criticized the hell out of Obama for putting his feet up on the desk and wearing a tan suit.

Speaker 2 [00:08:29]: There's just some liberties that are given.

Speaker 1 [00:08:31]: And this is a fact. Like, even... I know it may be tough on Tim Cook, but Steve wouldn't have missed this AI wave like this. Any other company that missed the AI wave, they're suffering, they're suffering as a result. So, so is that a model that can really be a pivot for a company that's a car company now? Just to go into uncharted territories and say, okay, wait, this is the play that we're going to do: robots, robotics?

Speaker 2 [00:08:59]: I think because he's...

Speaker 1 [00:08:59]: A visionary, he would be able to do it. But I ultimately think that he is going to railroad Tesla. And to the person in the comments who says Starlink is already under the ownership of SpaceX: they have, of course, a partnership, but if it was completely owned by SpaceX, you wouldn't be able to invest in both rounds. So they...

Speaker 2 [00:09:16]: Don't solely own it.

Speaker 1 [00:09:18]: I do think, upon exit... you can want your shoes to be too big to fill, and his are, but it's also a poison pill so that the company does not thrive. Look at what he's done to the value of the company, of just the cars, in the last five years. At one point, people were paying eighty and ninety thousand dollars for those cars.
Speaker 2 [00:09:40]: You can get one of those cars now for twenty-two grand.

Speaker 1 [00:09:44]: That's a hard pivot while people are sitting in a financed option and the value of the car is less than the...

Speaker 2 [00:09:53]: Loan that you got.

Speaker 4 [00:09:57]: Yeah, the godfather of AI said that robotics is a once-in-a-generation opportunity, and it has a chance to do work that we do now and will no longer want to do in the future.

Speaker 3 [00:10:11]: So he's back in that tent, then.

Speaker 1 [00:10:15]: Yeah... I don't think anybody... it's like, once again, like, hey, I don't think anybody's thinking that robotics is not going to play a major part. But I just don't see three hundred million robots. I don't either.

Speaker 3 [00:10:24]: That's the whole country.

Speaker 1 [00:10:27]: The humanoid robots in factories and stuff like that, yeah, for sure they're gonna play a major part. But, like, just hundreds of millions of robots walking the streets and in everybody's household? I don't see that happening in five years. No, I don't think it's five years.

Speaker 4 [00:10:42]: But you think about how many people work in factories; you think about the global impact of that. You're talking... like, if he's talking about walking outside, it's, like I said, is it Terminator? No. But if you take out factory work, you're talking about millions of jobs.

Speaker 1 [00:10:58]: A lot of those factories have already been replaced by robots. Not in the sense of humanoid robots, but machines that do that work already. That's already taken away a lot of jobs. So humanoid robots specifically...

Speaker 3 [00:11:13]: Different.

Speaker 4 [00:11:14]: What I mean is what we saw with our own eyes when we were, uh, on the tour at Nvidia, and we got to see what they're doing in terms of programming the agentic, like, robots. It was like, damn. It was amazing.

Speaker 1 [00:11:31]: Especially the calibration for different space environments, for sure. Like, this is different.