Speaker 1 [00:00:00]: What's the top three long-term holdings?

Speaker 2 [00:00:05]: I love this question and hate it at the same time. I'm not gonna give you the typical tech picks, and I'm not gonna say Apple, Microsoft. If I have to start today, holding for five years — if I have to pick three mega caps — one would be Amgen, two would be Lilly because of the tech exposure, and three would be Google. Those provide you almost little to no drawdown regardless of what the economy is going to do. Amgen and Lilly are going to print money, but especially Amgen. Amgen's like — I can't say a baby Lilly, because they're in different vocations, but the things that I loved about Lilly, they have the same thing.

Speaker 3 [00:00:48]: They're just not as great. It's like Jordan and Kobe.

Speaker 2 [00:00:51]: Like, if I can put Jordan and Kobe on the same team, I'm good going to war with anybody. Like I said: Amgen, Lilly, and Google. So those are your top three if I have to start new today for five years. Yeah, those are over Nvidia, over Microsoft. Who you got?

Speaker 4 [00:01:13]: Before I say who I got — Nvidia?

Speaker 1 [00:01:16]: Microsoft?

Speaker 3 [00:01:19]: TSM. Rasha, I said XRP.

Speaker 1 [00:01:25]: XRP for sure. Well, Bitcoin, if we're not talking just stocks. But if it's just stocks — yeah, just stocks.

Speaker 5 [00:01:34]: If it's just stocks, Nvidia is one. Google would be two. Just for the next five years — the next five — I don't want to invest in it, but I think... Obviously TSM would be three, because that's always my number one, I'd say.

Speaker 3 [00:01:56]: That's what I've said.

Speaker 4 [00:01:58]: I think—

Speaker 5 [00:01:59]: I think that over the next five years, Tesla's going to have a hell of a run.

Speaker 3 [00:02:05]: Even if he leaves?

Speaker 4 [00:02:10]: Why would he leave?

Speaker 1 [00:02:10]: He just got the billion — he just got the trillion-dollar package. Yeah, if he hits certain—

Speaker 4 [00:02:14]: I think those are the metrics that he's going to meet — the metrics.
Speaker 3 [00:02:18]: He can meet the metrics and leave — would it really matter to the company?

Speaker 5 [00:02:20]: But he has to do it within a certain timeframe, or he doesn't get the trillion. I think he has to do it in like eight or nine years, so that falls within the range. I think the autonomous, the robotics, what he's now doing in terms of technology and space — he'll figure out a way to incorporate that. Tesla is just such an interesting proposition for the next five years that, outside of the ones you guys said, I think the return on that would be pretty interesting — high growth. I wouldn't want to do it, but I think it'll have a hell of a return in the next five years.

Speaker 2 [00:02:53]: And sometimes the person in the space will tell you where the economy is going. When he said "we're not a car company, I'm going into robotics," that straight out tells you the value of software. Damn. I don't like a lot of his political aspersions, I don't like how his daddy acted, how his daddy raised him. What Elon has done well, though: he knows how to pick a category that's going to dominate for fifteen years.

Speaker 4 [00:03:20]: So that's the line, bro, and that's why I caught it.

Speaker 5 [00:03:22]: I'm like, he's telling you straight up: yo, this isn't about cars, this isn't about us making cars. In fact, the whole sector — we're done making that. It was nice, it made sense, but we're done with that. Now this is where we're going.

Speaker 1 [00:03:36]: But even that's problematic. Hit the like button and share — we've almost hit seven thousand. Even that's problematic to me, because, I mean, they're missing numbers on the cars, and now you shift into an industry that, once again, is a guess some people can raise money off of. It goes back to the metaverse speculation. They're saying there's gonna be one hundred million humanoid robots. That's what they're saying.
Speaker 1 [00:04:03]: They've also said that real estate in the metaverse was going to be the new frontier, and Snoop Dogg — somebody paid like a million dollars for the house next to Snoop.

Speaker 5 [00:04:17]: That's the thing — people was killing Zuck for it. They didn't want to go on that ride with him.

Speaker 1 [00:04:23]: Well, a lot of people did, though — a lot of people poured into the metaverse. It wasn't just Zuckerberg; a lot of people were in the metaverse heavy. So, I mean, I don't know, I just feel like you're basing your whole company off a thesis of what could potentially happen. But what's currently happening? He's disappointing on the cars, and he's got himself in a lot of political turmoil. When the Democrats take over, they're not just going to forget everything that he did. There's gonna have to be some level of retribution and payback for this.

Speaker 5 [00:04:49]: Well, he's still got three years, and that's the crazy part. So if it's in the next five, he's still got three.

Speaker 4 [00:04:55]: Level.

Speaker 2 [00:04:56]: xAI — which is X merging with his other company — it's already done with SpaceX. That will tell you: if I got an asset, and Sam allegedly stole the baby that I helped create, helped get funded, forwarded employees — and you do your Super Bowl commercial, and I make a competitive asset and I move it over to SpaceX — it's interesting he didn't move it over to Tesla.

Speaker 1 [00:05:22]: No, exactly. SpaceX — that's the one to me.

Speaker 2 [00:05:28]: Because I think he's gonna do the same with Starlink, and put xAI, SpaceX, and Starlink in the one. That's a company that most people cannot touch — damn near untouchable.

Speaker 5 [00:05:40]: I'm not gonna comment today, but I will comment next week, and this will make sense when I do it — got you — yeah, SpaceX. But at some point we do have to have that real conversation, though: there's luxuries that some people are afforded that other entrepreneurs are not.
Speaker 3 [00:05:56]: The metaverse.

Speaker 2 [00:05:57]: I love it — shout out to everybody at HQ at the Hudson Building. But if I delivered that metaverse and that was the losses on it—

Speaker 5 [00:06:04]: You're not — there is no second chance. You're not even getting the first chance. Yeah, the idea is not getting you a first chance.

Speaker 4 [00:06:12]: Mm-hmm.

Speaker 1 [00:06:13]: And we're gonna talk about access to capital later in the episode. But some entrepreneurs are afforded the room to lose hundreds of billions of dollars, and then they get rewarded with hundreds of billions of dollars. Other entrepreneurs have no margin of error at all. You lose a thousand dollars and your life is over. So Elon — people like him on Wall Street. But you really think that humanoid robots are going to take over, the way that — like, one hundred million humanoid robots? You really think that's gonna happen?

Speaker 4 [00:06:49]: Take over in what sense?

Speaker 1 [00:06:50]: One hundred million — I think he said three hundred-something, I don't know.

Speaker 4 [00:06:53]: I don't think we're in Terminator.

Speaker 5 [00:06:56]: I think, obviously, in factories and the workforce — does it change? Yeah. But everybody having it in their homes? I don't think we're there. In the next five years, I think you'll start to see the signs of how it improves, like I said, the workforce, maybe some forms of life — some tasks in life.

Speaker 4 [00:07:17]: But can anybody afford it?

Speaker 5 [00:07:18]: Right. The first iteration of anything is usually super expensive, and there's only a limited number of people that can afford that price point, until they figure out a model that makes sense—

Speaker 4 [00:07:28]: —that the everyday person can have.

Speaker 5 [00:07:29]: We're not there yet, but you can see that's where they're headed. All systems are pointing that way. Although maybe we should stop talking about it.

Speaker 1 [00:07:42]: Where are you?
Speaker 3 [00:07:44]: Can you guys hear me?

Speaker 4 [00:07:45]: They trying to get the boy out yet? You know what—

Speaker 2 [00:07:47]: Sorry, my bad. But one hundred million humanoid robots from one company? For the probability, just look at the delivery history of what he said about FSD. But like you said, some creators and founders are allowed to paint a lofty picture that's bigger than the total addressable market.

Speaker 4 [00:08:11]: Yeah, mm-hmm.

Speaker 2 [00:08:13]: Like I said, if I would have pitched the metaverse and then fell, I would have been assassinated.

Speaker 4 [00:08:20]: The same with the stuff that he gets to say.

Speaker 2 [00:08:23]: And they — man, they criticized the hell out of Obama for putting his feet up on the desk and wearing a tan suit.

Speaker 3 [00:08:28]: There's just some liberties that are given.

Speaker 2 [00:08:30]: And this is a fact: I know it may be tough on Tim Cook, but Steve wouldn't have missed this AI wave. Like any other company that missed the AI wave, they're suffering as a result.

Speaker 1 [00:08:46]: So is that a model that can really be a pivot — for a company that's a car company now to just go into uncharted territory and say, okay, wait, this is the play: we're going to do robotics?

Speaker 2 [00:08:58]: I think because he's a visionary he would be able to do it, but I ultimately think that he is going to railroad Tesla. And to the person in the comments who said Starlink is already under ownership of SpaceX: they have, of course, a partnership, but if it was completely owned by SpaceX, you wouldn't be able to invest in both rounds, so they don't solely own it. I do think, upon exit — if you want your shoes to be too big to fill, his are — but it's also a poison pill, so that the company does not thrive. Look at what he's done to the value of the company, of just the cars, in the last five years. At one point people were paying eighty and ninety thousand dollars for those cars.
Speaker 3 [00:09:39]: You can get one of those cars now for twenty-two grand.

Speaker 2 [00:09:43]: That's a hard pivot while people are sitting in a financing option and the value of the car is less than the loan—

Speaker 4 [00:09:52]: —that you got.

Speaker 5 [00:09:56]: Yeah, the godfather of AI said that robotics is a once-in-a-generation opportunity, and it has a chance to do work that we do now and will no longer want to do in the future.

Speaker 4 [00:10:10]: So he's backing that, then.

Speaker 1 [00:10:14]: Yeah, I mean — once again, I don't think anybody's saying that robotics is not going to play a major part. But I just don't see three hundred million robots. I don't either.

Speaker 4 [00:10:23]: That's the whole country.

Speaker 1 [00:10:26]: The humanoid robots in factories and stuff like that — yeah, for sure they're gonna play a major part. But just hundreds of millions of robots walking the streets and in everybody's household?

Speaker 4 [00:10:37]: I don't know.

Speaker 5 [00:10:38]: I don't see that happening in five years. No, I don't think it's five years. But you think about how many people work in factories — you think about the global impact of that. If he's talking about them walking outside, like I said — is it Terminator? No. But if they're taking out factory work, you're talking about millions of jobs.

Speaker 1 [00:10:57]: A lot of those factories have already been replaced by robots — not in the sense of humanoid robots, but machines that do the work already. That's already taken away a lot of jobs. So humanoid robots specifically—

Speaker 4 [00:11:12]: Different.
Speaker 5 [00:11:13]: What I mean — what we saw with our own eyes when we were on the tour at Nvidia, when we got to see what they're doing in terms of programming the agentic-type robots — it was like, damn.

Speaker 2 [00:11:29]: It was amazing, especially the calibration for different space environments, for sure. Like, this is different.