Speaker 1: Bloomberg Audio Studios, podcasts, radio news. Joining us now, in snowflake clothing, Dan Ives joins us this morning.

Speaker 2: Thank you for toning it down, Dan. I'm apoplectic this morning.

Speaker 1: Over what I saw as a complete reaffirmation of the Google model.

Speaker 2: It's out there. I'm gonna do a major shout out to Asa Fitch over at Heard on the Street in the Wall Street Journal, who I think really nailed it. I was OMG. Was I right to be OMG at four twenty yesterday afternoon? I mean, look at this.

Speaker 3: I think it's a game changer for the whole story, because it shows, like, Google... there was a view they're on the defensive, AI is going to ruin their search business. Now look at search, massive advertising strength, YouTube, cloud, beating things back. And then the best thing: them increasing capex ten billion. That's them basically beating their chest, saying, hey, we're in this AI arms race.

Speaker 1: I went and looked at it, folks. You can do this with the WACC screen. Good morning, mister Secunda.
Speaker 1: The weighted average cost of capital screen on the Bloomberg. An added ten billion of capex for Google is basically like trying to buy another house out in Menlo Park.

Speaker 2: It's like a rounding error.

Speaker 3: That's it exactly. And look, Tom, the last thing that they could be is on the outside looking in on this AI arms race. And I think the view, look, we think, from a large cap perspective, this has been the most oversold stock, relatively.

Speaker 2: I'll go with it.

Speaker 3: And again, I get, like, DOJ, regulatory. But the view that AI is going to ruin Alphabet's business is dramatically flawed.

Speaker 2: Let's go to a touchy-feely concept here, if I may.

Speaker 1: Lisa, what did we learn yesterday about how we use AI...

Speaker 2: ...with our searches? I think we learned something new yesterday.

Speaker 3: I think what you learned is that as you do those searches, Google's going to play a huge part. And I think if you look at, like, actual advertising, you look at, like, search results, you look at all the underlying metrics, it shows that it's actually an accelerant for their business.
Speaker 3: That's the takeaway, and I think that's the thing.

Speaker 1: Paul, Dan's coat. Okay, for those of you on radio, look at his coat. It's like a chalkboard in a physics lecture.

Speaker 4: It's not just tech, it's very troubling.

Speaker 2: In a Cartesian space.

Speaker 4: Continue. Search revenue is number of clicks times revenue per click. I'm clicking materially less. How does that not bode badly for their search business?

Speaker 3: So you can say it's bad now. But the view is, Paul, given what they're doing in terms of investments in AI, given where they view, ultimately, the future, that when you come back in terms of searching through models, through, actually, LLMs, Google is going to play a role in monetizing that, even on the advertising side. And I think that's really the view: this is not something that's structurally gonna be a threat to their business.

Speaker 2: It's actually the opposite.

Speaker 3: They're gonna turn it into what I view...

Speaker 2: ...as a toll.

Speaker 3: And I think that's what Tom's talking about.
Speaker 3: That's what came out last night: this is not a company on the defensive.

Speaker 2: This is a company on the offensive.

Speaker 4: Okay, all right. And we're seeing that in the stock. The stock's up three point six percent here today, although it's flat for the year, so there's definitely concern out there in the marketplace about this story. More concern is with Tesla. I don't know, man. My hat's off to you hanging with this dude, but making cars is not a good business for anybody. It never has been. So he's trying to redirect us to all this other stuff, which, fine, you know, a lot of people will bank on this guy. Where should we be focusing our attention on the Tesla story? Because I don't want to be bending metal here.

Speaker 3: Look, to me, we've talked about it for years. Like, I don't view this as a car company, and you never have, right? To me, the future is about AI. It's about physical AI in terms of autonomous, robotics.
Speaker 3: Now, this quarter, and the next few quarters, there is nothing to write home about in terms of actual deliveries, demand, gross margins, EPS. But if you look at the story over the long term, I believe ninety percent of the future value is about robotaxis, autonomous, and eventually it's going to be Optimus when it comes to physical AI. And Jensen talks about this all the time: the two best physical AI plays are Tesla and Nvidia.

Speaker 4: Okay. I mean, that's obviously the future here. Give us the mileposts. If we're taking our attention away from the car business, what are some of the mileposts for these other businesses we should pay attention to?

Speaker 3: So, first off, from a robotaxi perspective, they're looking to be in twenty...

Speaker 2: ...five cities in the next year. Okay.

Speaker 3: So our team was in Austin. We saw Robotaxi front and center. So you're seeing that expansion definitely in Austin.
Speaker 3: California's next, then Florida and a bunch of others. Okay. And look, I believe one of the big mileposts that everyone's looking for is, you know, at some point, probably early twenty-six, when Keene's in a robotaxi. And the thing is, I do think that will happen.

Speaker 2: Okay. Yeah, it's interesting.

Speaker 4: I mean, Tom, when Tom's in a robotaxi, yeah, that might be mine.

Speaker 2: Good luck with that.

Speaker 1: Dan Ives is here on your commute across the nation, a particularly good early morning in the sixteen miles between Menlo Park and Cupertino.

Speaker 2: How did Cupertino digest the Google earnings yesterday?

Speaker 1: You know, I can just see the leadership of Google and Apple taking a breakfast or lunch at Mademoiselle Colette, you know, some fancy place out there where everybody eats herbal. And the answer is: how does Apple take this?

Speaker 2: To me, it's a huge deal for Apple. We've...

Speaker 3: I think the view that Google is going to become less important to Apple is the exact opposite.
Speaker 2: I'm in agreement with that. Talk about it. We've said it. It's a two-part thing.

Speaker 3: I think you need to get a significant sort of upping of that partnership with Google, because when you look at partners, that's one that they're going to bet on in terms of Apple. And I continue, as I've talked about, I think for the first time they're going to have to do an acquisition. Like I told you, I think Perplexity continues to be the one that they should go after, because my whole point is Apple should go after Perplexity.

Speaker 2: And let's dig into that, Tom.

Speaker 1: Full disclosure, folks: Mister Secunda, one of the founders of Bloomberg, is hugely influential in AI with the Secunda Foundation, and he leads with Perplexity.

Speaker 2: What is Perplexity?

Speaker 3: When you think about models, ChatGPT, you know... anyone that uses Perplexity, as, you know, I've used, knows it's probably one of the best next-gen AI models out there. The reason it's so important, and some disagree, is Apple needs to make an aggressive move.
Speaker 3: They cannot watch this AI revolution and just see it pass by. That's why I think what you saw from Alphabet was a stepping up. When it comes to Apple, you know, if you go back to WWDC, they really hardly even mentioned AI, and I think the time has come for them to make aggressive moves in AI, and that will be the big focus next week at the conference.

Speaker 1: Lisa, what's the most herbal matcha coffee thing in the morning? Like, what would be the typical West Coast, you know, healthy coffee thing?

Speaker 2: It's called a dirty matcha.

Speaker 1: Cupertino, this morning the team's having their dirty matchas to get the day started. Again, how does Apple respond to what we learned, thirteen percent YouTube growth?

Speaker 2: Thank you, folks, for that. I'm living it every day.

Speaker 1: And the answer is, how does Apple respond to Perplexity, to Gemini, and to Google's, you know, their embedded relationship with Google?

Speaker 3: If you can't beat them, join them. Like, they're basically going to have to double down on the partnership.
Speaker 3: Despite all the noise in terms of DOJ and everything that Google is dealing with. And I think for Apple, look, you have the best global installed base in the world in consumer. You have two point four billion devices, one point five billion iPhones. Now's the time to start to monetize AI. But that's not going to happen internally. It's not going to happen within the walls of Apple Park. It needs to happen through partnerships and, what I believe ultimately, an acquisition, Tom.

Speaker 4: This is courtesy of Google AI: a dirty matcha, also known as a matcha espresso fusion, is a popular beverage that combines the earthy flavor of matcha tea with the bold intensity of a shot or two of espresso.

Speaker 3: Dan...

Speaker 2: In terms of, like, single best buy at Wedbush. I mean, it's great.

Speaker 1: I love the snowflake. Look, you look like Feynman's physics lectures from years ago. Where'd you get that? What's the hat?

Speaker 2: Dan? Yeah, the Majorca hat.

Speaker 3: It's, you know, it's from Majorca. It's just my Majorca hat.
Speaker 3: And look, it's one where, you know, to me, when I look at, like, the Wedbush best buys, the best names, I believe it's Microsoft here. This is the one, because it's not just about four trillion. It's about, as enterprise AI plays out in the backyard of Redmond, it's Nadella that's gonna benefit.

Speaker 2: Think of what Google did yesterday.

Speaker 3: I think what Google did yesterday on Google Cloud just speaks further to what you're going to see with Azure coming next week. But the most important thing is that everything that Google's doing, all the investment, everything in AI and search... guess who eventually benefits?

Speaker 2: Because they're the toll...

Speaker 3: Microsoft's the toll collector. It doesn't matter who's there. One way or another you're paying them as more and more moves to the cloud, right? And it's AI-driven. That continues.

Speaker 1: One final question. This is so important: the gloom crew on Dan Ives.

Speaker 2: Did you know... I understand there's haters out there.

Speaker 3: Now, haters, haters.
Speaker 1: Lisa, and then, you know, thousands of others, the Dan Ives haters. You have been so right, including pounding the table the first week of April. I want you to speak to the haters right now about the X axis of the Ives tech boom.

Speaker 3: Look, I think the haters, you know, right now they're in hibernation mode in those caves. They can't find AI in the spreadsheets. And I think part of the problem is that they're underestimating the Fourth Industrial Revolution. And as I've said...

Speaker 2: What does that mean?

Speaker 3: Look, three million air miles...

Speaker 2: A walking TED speech.

Speaker 3: Look, the point is, three million air miles the last twenty-five years. It's about seeing what's happening in the world. And if you look at what's happened in Asia, and you look at chip demand, it is fractional where we are relative to the opportunity. That's our view. It's ten p.m. at the AI party. It was nine p.m., and the party goes to four a.m.

Speaker 1: I'm gonna say this as simple as I can.
Speaker 2: We keep track.

Speaker 1: We learn from people who are wrong, we learn from people who are right. But it's this thematic idea of where we are, folks, right now. To me, I said this last...

Speaker 2: ...night to Missus Keene. It's like late nineteen ninety-four, early nineteen ninety-five, and Dan...

Speaker 1: ...Ives was in a bar with a Budweiser in his hand saying, gosh, Google's...

Speaker 2: ...a buy, a dot-com buy.

Speaker 3: It's nineteen ninety-five, not nineteen ninety-nine.

Speaker 2: Dan Ives, thank you so much, with Wedbush.