1 00:00:02,720 --> 00:00:14,000 Speaker 1: Bloomberg Audio Studios, Podcasts, Radio News. 2 00:00:18,040 --> 00:00:21,040 Speaker 2: Hello and welcome to another episode of the Odd Lots podcast. 3 00:00:21,079 --> 00:00:23,279 Speaker 3: I'm Joe Weisenthal and I'm Tracy Alloway. 4 00:00:23,360 --> 00:00:26,599 Speaker 2: Today you are going to be listening to or watching a 5 00:00:26,640 --> 00:00:31,480 Speaker 2: special episode of our podcast, recorded live at a conference, 6 00:00:31,520 --> 00:00:32,880 Speaker 2: which you've probably never seen before. 7 00:00:33,280 --> 00:00:33,520 Speaker 4: Yeah. 8 00:00:33,560 --> 00:00:36,280 Speaker 3: This was at the Lazard for Square conference, and we 9 00:00:36,360 --> 00:00:40,559 Speaker 3: interviewed Dmitry Shevelenko. He is the chief business officer of Perplexity, 10 00:00:40,720 --> 00:00:43,479 Speaker 3: which is something you might have used before. We certainly 11 00:00:43,560 --> 00:00:43,919 Speaker 3: use it. 12 00:00:44,040 --> 00:00:46,920 Speaker 2: Yep, one of the leading AI companies, and so this 13 00:00:47,040 --> 00:00:49,879 Speaker 2: is a conference related to media and deal making, and 14 00:00:49,920 --> 00:00:52,400 Speaker 2: so we talked about what the impact of AI is 15 00:00:52,440 --> 00:00:55,320 Speaker 2: going to be on news, media and brands, the Internet 16 00:00:55,360 --> 00:00:55,960 Speaker 2: as we know it. 17 00:00:56,240 --> 00:00:58,520 Speaker 3: We also took questions from the audience, so you will 18 00:00:58,520 --> 00:01:01,400 Speaker 3: hear us referring to questions that we are receiving on 19 00:01:01,440 --> 00:01:04,200 Speaker 3: the app. We hope you enjoy. Take a listen. All right, 20 00:01:04,240 --> 00:01:07,720 Speaker 3: so here's my first question: should we even be talking 21 00:01:07,760 --> 00:01:10,880 Speaker 3: to you?
What I mean by that is I could 22 00:01:10,880 --> 00:01:13,040 Speaker 3: just go on Perplexity and ask all these questions and 23 00:01:13,080 --> 00:01:16,920 Speaker 3: it will give me a pretty decent answer. What's the point? 24 00:01:18,040 --> 00:01:20,280 Speaker 4: So, first of all, thank you for having me. It's 25 00:01:20,280 --> 00:01:26,480 Speaker 4: great to be here. I love that tee 26 00:01:26,560 --> 00:01:29,119 Speaker 4: up, and you guys were scheming backstage about what's 27 00:01:29,160 --> 00:01:31,680 Speaker 4: gonna be the hard opening question. But I love it, 28 00:01:31,720 --> 00:01:37,280 Speaker 4: because the thing that humans will always be uniquely exceptional 29 00:01:37,319 --> 00:01:42,160 Speaker 4: at is asking questions. So Perplexity may have the answer, 30 00:01:43,360 --> 00:01:47,720 Speaker 4: but Perplexity does not have an innate desire to be curious. 31 00:01:48,840 --> 00:01:51,000 Speaker 4: And that's what you guys have. And that's why journalism 32 00:01:51,080 --> 00:01:54,240 Speaker 4: is so important, and that's why having these conversations is important. 33 00:01:54,240 --> 00:01:57,840 Speaker 4: It's about the questions, not necessarily the answers. 34 00:01:58,760 --> 00:02:01,040 Speaker 2: Kind of related to this, I guess. But here's the 35 00:02:01,120 --> 00:02:04,320 Speaker 2: thing that I always think about, which is: I enjoy 36 00:02:04,440 --> 00:02:08,280 Speaker 2: using Perplexity, et cetera. Let's say, you know, you have 37 00:02:08,360 --> 00:02:11,480 Speaker 2: great relationships with all the news publishers, et cetera, and they're 38 00:02:11,480 --> 00:02:13,880 Speaker 2: really cool with it all, and all those deals get solved, 39 00:02:13,960 --> 00:02:17,200 Speaker 2: et cetera. Why do we need the internet as 40 00:02:17,200 --> 00:02:20,360 Speaker 2: we know it, that has things called websites built for humans?
41 00:02:20,400 --> 00:02:22,440 Speaker 2: Now that we're in this age where we can go 42 00:02:22,520 --> 00:02:24,880 Speaker 2: to a chatbot or something like it, and the 43 00:02:24,960 --> 00:02:28,400 Speaker 2: answer is right there, do we need to have what 44 00:02:28,440 --> 00:02:32,520 Speaker 2: are traditionally called websites anymore? Why would a publisher continue 45 00:02:32,560 --> 00:02:35,320 Speaker 2: to invest in design and all of these things when 46 00:02:35,320 --> 00:02:38,959 Speaker 2: I could just get the answer from Perplexity or 47 00:02:39,000 --> 00:02:39,880 Speaker 2: ChatGPT or whatever else? 48 00:02:40,880 --> 00:02:44,000 Speaker 4: So again it goes back to: I think the reason 49 00:02:44,080 --> 00:02:48,519 Speaker 4: we're drawn to media properties is because of the questions 50 00:02:48,600 --> 00:02:52,080 Speaker 4: the journalists and editors are asking. Right, it's 51 00:02:52,160 --> 00:02:55,480 Speaker 4: the framing. It is the what is actually worth 52 00:02:55,600 --> 00:02:59,680 Speaker 4: asking about, what's worth reading about. Perplexity doesn't have answers 53 00:02:59,720 --> 00:03:02,919 Speaker 4: to that. You have to come to Perplexity already armed 54 00:03:03,280 --> 00:03:07,360 Speaker 4: with a question. And oftentimes people, you know, read news 55 00:03:07,880 --> 00:03:10,520 Speaker 4: and there's something they didn't understand fully, and they come 56 00:03:10,560 --> 00:03:13,800 Speaker 4: to Perplexity to get that deeper understanding. But the spark, 57 00:03:14,720 --> 00:03:18,440 Speaker 4: that comes from editorial judgment, that comes from the intuition 58 00:03:18,800 --> 00:03:22,240 Speaker 4: of a world-class reporter. And that's why we're excited, 59 00:03:22,320 --> 00:03:27,240 Speaker 4: through programs like Comet Plus, to find new business models 60 00:03:27,240 --> 00:03:27,520 Speaker 4: for that.
61 00:03:28,000 --> 00:03:31,160 Speaker 3: How do you actually judge editorial judgment or quality of 62 00:03:31,240 --> 00:03:34,240 Speaker 3: news sources? Because I can imagine a future where a 63 00:03:34,280 --> 00:03:37,400 Speaker 3: lot of our news consumption is through something like a 64 00:03:37,440 --> 00:03:40,760 Speaker 3: Perplexity model, which means that you're basically the arbiter of 65 00:03:40,760 --> 00:03:47,800 Speaker 3: what information is reliable. So how do you make those decisions? 66 00:03:45,640 --> 00:03:50,480 Speaker 4: It's not me at the wheel. You know, ultimately, 67 00:03:51,200 --> 00:03:54,600 Speaker 4: this is where we focus a big part of our technology. 68 00:03:54,640 --> 00:03:58,680 Speaker 4: It's building accurate AI. And the thing that we believe 69 00:03:58,720 --> 00:04:01,080 Speaker 4: is going to be most scarce in the future 70 00:04:01,640 --> 00:04:05,760 Speaker 4: is not intelligence, it's trust. I think the bedrock of 71 00:04:05,920 --> 00:04:10,080 Speaker 4: trust is transparency. That's why Perplexity pioneered showing you exactly 72 00:04:10,680 --> 00:04:14,240 Speaker 4: which source the information is coming from and empowering users 73 00:04:14,280 --> 00:04:17,760 Speaker 4: to apply their own judgment of whether that is a 74 00:04:17,800 --> 00:04:21,360 Speaker 4: good source or a not-so-good source. But we deploy 75 00:04:21,440 --> 00:04:28,040 Speaker 4: many different algorithmic strategies. Oftentimes, if many publications are saying 76 00:04:28,080 --> 00:04:30,919 Speaker 4: the same thing, you know, that's a strong signal that 77 00:04:30,920 --> 00:04:33,720 Speaker 4: they're onto something, and, you know, the outliers you're 78 00:04:33,800 --> 00:04:37,599 Speaker 4: kind of more sensitive to.
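[Editor's note: the agreement signal Shevelenko describes, trusting claims that many independent publications corroborate and flagging outliers for extra scrutiny, could be sketched roughly as below. This is a hypothetical illustration only; Perplexity's actual ranking is not public and is certainly far more involved.]

```python
from collections import Counter

def consensus_scores(claims_by_source):
    """Score each claim by the fraction of distinct sources reporting it.

    claims_by_source: dict mapping source name -> set of claims it makes.
    Scores near 1.0 mean a claim is widely corroborated; low scores mark
    outlier claims that deserve extra scrutiny.
    """
    counts = Counter()
    for claims in claims_by_source.values():
        counts.update(claims)
    n_sources = len(claims_by_source)
    return {claim: count / n_sources for claim, count in counts.items()}

# Toy example: three hypothetical outlets, one outlier claim.
sources = {
    "outlet_a": {"rates held steady", "cpi rose 0.2%"},
    "outlet_b": {"rates held steady", "cpi rose 0.2%"},
    "outlet_c": {"rates held steady", "cpi fell 0.5%"},  # outlier
}
scores = consensus_scores(sources)
# "rates held steady" is corroborated by all three sources (score 1.0);
# "cpi fell 0.5%" by only one, so it would get more scrutiny.
```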
But this isn't, you know, 79 00:04:37,640 --> 00:04:40,039 Speaker 4: it's not a simple problem, and this is why, you know, 80 00:04:40,080 --> 00:04:42,440 Speaker 4: we raise a lot of capital and hire the world's 81 00:04:42,440 --> 00:04:44,400 Speaker 4: best technologists to build accurate AI. 82 00:04:44,760 --> 00:04:47,839 Speaker 2: So I think anyone could sort of intuitively understand why 83 00:04:47,880 --> 00:04:51,839 Speaker 2: publishers might be anxious about Perplexity being a destination. 84 00:04:52,400 --> 00:04:56,680 Speaker 2: Evidently it's not just news publishers. Literally, since we 85 00:04:56,800 --> 00:05:00,440 Speaker 2: got on stage three minutes ago, news broke that 86 00:05:00,720 --> 00:05:05,080 Speaker 2: Amazon demands Perplexity stop its AI 87 00:05:05,160 --> 00:05:10,080 Speaker 2: agent from making purchases. Just broke three minutes ago, from Bloomberg. 88 00:05:10,520 --> 00:05:11,839 Speaker 4: Good job, Bloomberg. 89 00:05:11,880 --> 00:05:14,760 Speaker 2: It's not just news that's anxious about the idea of 90 00:05:14,920 --> 00:05:18,960 Speaker 2: Perplexity being the destination. Clearly, it's all kinds of businesses. 91 00:05:19,440 --> 00:05:22,960 Speaker 4: So I think, and I haven't read the article yet, 92 00:05:23,000 --> 00:05:30,960 Speaker 4: but I think I have an understanding of the issue. Ultimately, 93 00:05:30,960 --> 00:05:34,719 Speaker 4: the issue with Amazon is tied to Perplexity's web browser 94 00:05:34,800 --> 00:05:38,680 Speaker 4: called Comet, and Comet is a web browser that has 95 00:05:38,720 --> 00:05:41,800 Speaker 4: an AI assistant built into it. And what we think 96 00:05:41,880 --> 00:05:46,160 Speaker 4: is important is that consumers and end users have access 97 00:05:46,200 --> 00:05:52,040 Speaker 4: to powerful AI when they are accessing publications, accessing websites, 98 00:05:52,320 --> 00:05:56,520 Speaker 4: accessing the Internet.
The first thirty years of the Internet, 99 00:05:57,040 --> 00:06:02,200 Speaker 4: we had lopsided AI, where big tech, you know, Google, Netflix, Facebook, 100 00:06:02,520 --> 00:06:06,040 Speaker 4: they used AI to control what we see, to get 101 00:06:06,120 --> 00:06:07,919 Speaker 4: us to click on certain things and not click on 102 00:06:08,000 --> 00:06:12,600 Speaker 4: other things. And what is changing now is Perplexity is 103 00:06:12,600 --> 00:06:16,360 Speaker 4: focusing on giving powerful AI to the end user, and 104 00:06:16,400 --> 00:06:18,040 Speaker 4: we think this is the right thing to do. We 105 00:06:18,080 --> 00:06:20,680 Speaker 4: think it's a principled viewpoint that when you're on Amazon, 106 00:06:21,120 --> 00:06:24,360 Speaker 4: it's not Amazon's business whether you're using AI to help 107 00:06:24,400 --> 00:06:27,279 Speaker 4: you evaluate: is this the best thing to buy? You know, 108 00:06:27,400 --> 00:06:30,240 Speaker 4: can I find it cheaper somewhere else? Do I actually 109 00:06:30,240 --> 00:06:31,920 Speaker 4: need to buy the other thing they're telling me to 110 00:06:31,960 --> 00:06:35,839 Speaker 4: buy with it? The world's wealthiest people have personal 111 00:06:35,880 --> 00:06:40,120 Speaker 4: shoppers that help them do this, and we're democratizing that 112 00:06:40,200 --> 00:06:43,000 Speaker 4: sort of intelligence, that sort of assistant access, to everyone. 113 00:06:43,760 --> 00:06:47,400 Speaker 3: Just in terms of competition, is the argument basically that technically, 114 00:06:47,640 --> 00:06:51,000 Speaker 3: you know, a Google or an Amazon could create similar 115 00:06:51,040 --> 00:06:54,560 Speaker 3: products to what you guys are doing, but their own 116 00:06:54,640 --> 00:06:58,400 Speaker 3: business model basically prohibits them from doing what you're doing?
117 00:06:59,120 --> 00:07:01,679 Speaker 4: I should have you join our next investor pitch. It's exactly 118 00:07:01,720 --> 00:07:07,120 Speaker 4: that. There is a big opening for an independent, 119 00:07:07,560 --> 00:07:12,800 Speaker 4: neutral AI player that is aligned with users. You know, 120 00:07:12,880 --> 00:07:16,160 Speaker 4: Google is still, you know, they're generating, you know, one 121 00:07:16,240 --> 00:07:19,960 Speaker 4: hundred billion in revenue per quarter, which is amazing, but 122 00:07:20,240 --> 00:07:24,200 Speaker 4: a lot of that is coming from, you know, 123 00:07:24,320 --> 00:07:27,160 Speaker 4: their customers, advertisers. You know, Amazon, 124 00:07:28,360 --> 00:07:30,520 Speaker 4: they're trying to get you to buy as many things 125 00:07:30,560 --> 00:07:31,200 Speaker 4: as possible. 126 00:07:31,600 --> 00:07:31,720 Speaker 1: Uh. 127 00:07:32,240 --> 00:07:35,640 Speaker 4: Our success metric is: are users willing to 128 00:07:35,680 --> 00:07:39,360 Speaker 4: pay for a Perplexity subscription? And so we're purely aligned 129 00:07:39,360 --> 00:07:41,040 Speaker 4: with our users. 130 00:07:42,720 --> 00:07:46,600 Speaker 2: Speaking of competition, what is the differentiator? Okay, maybe 131 00:07:46,960 --> 00:07:50,600 Speaker 2: Amazon has some obvious angles, but some of the others, 132 00:07:51,520 --> 00:07:54,760 Speaker 2: OpenAI, seem sort of independent, in the same bucket. 133 00:07:54,840 --> 00:07:57,520 Speaker 2: It's really huge. It's one of the biggest websites in 134 00:07:57,600 --> 00:08:01,320 Speaker 2: the world. It does strike me that many of 135 00:08:01,360 --> 00:08:04,360 Speaker 2: the things that they offer seem sort of similar 136 00:08:04,400 --> 00:08:07,520 Speaker 2: to Perplexity. You must get this.
I know you probably have a stock 137 00:08:07,600 --> 00:08:10,480 Speaker 2: answer to this, but what is the differentiator between you and 138 00:08:10,840 --> 00:08:11,480 Speaker 2: an OpenAI? 139 00:08:12,240 --> 00:08:15,080 Speaker 4: So, you know, I'm fortunate to get to speak with 140 00:08:15,120 --> 00:08:18,520 Speaker 4: a lot of curious people that have to answer a 141 00:08:18,520 --> 00:08:22,480 Speaker 4: lot of hard questions every day, and they tell 142 00:08:22,520 --> 00:08:27,480 Speaker 4: me they use Perplexity. And it comes back to our 143 00:08:27,560 --> 00:08:32,120 Speaker 4: focus on accuracy above all else. ChatGPT is an 144 00:08:32,120 --> 00:08:36,200 Speaker 4: amazing product, but they built a product that is optimized 145 00:08:36,200 --> 00:08:38,800 Speaker 4: for engagement. It's trying to be your cheerleader or your coach. 146 00:08:39,040 --> 00:08:42,640 Speaker 4: It's trying to get you to keep going. And they've 147 00:08:42,679 --> 00:08:46,080 Speaker 4: also built a product that is designed not to use 148 00:08:46,200 --> 00:08:50,240 Speaker 4: the best LLM in service of your question, but is 149 00:08:50,240 --> 00:08:53,920 Speaker 4: designed to use their LLMs. And so part of being neutral, 150 00:08:53,960 --> 00:08:57,480 Speaker 4: part of being independent, trustworthy, accurate, is using the best 151 00:08:57,480 --> 00:09:01,440 Speaker 4: possible technology to answer the question the best possible way. 152 00:09:01,760 --> 00:09:04,840 Speaker 4: And so being an aggregator of all the best models, 153 00:09:05,160 --> 00:09:08,360 Speaker 4: having our own independent search infrastructure that isn't the Bing 154 00:09:08,760 --> 00:09:12,600 Speaker 4: search API, which is what ChatGPT uses, those are 155 00:09:12,640 --> 00:09:16,840 Speaker 4: all the building blocks of a more useful, accurate AI.
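[Editor's note: the "aggregator of all the best models" idea, picking whichever underlying LLM best serves a given question rather than defaulting to one vendor's model, might look schematically like this. The model names, tags, and scoring here are invented for the sketch; a real router would use learned classifiers plus cost and latency budgets, and this is not Perplexity's actual system.]

```python
def route_query(query, models):
    """Pick the model whose declared strengths best match the query.

    models: dict mapping model name -> set of task tags it is strong at.
    Routing is a toy keyword overlap; ties go to the first model listed.
    """
    words = set(query.lower().split())

    def overlap(name):
        return len(models[name] & words)

    return max(models, key=overlap)

# Hypothetical catalog of interchangeable backend models.
catalog = {
    "general-model": {"news", "summary", "summarize"},
    "code-model": {"code", "python", "debug"},
    "math-model": {"math", "proof"},
}
choice = route_query("help me debug this python code", catalog)
# "code-model" wins here: it matches "debug", "python", and "code".
```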
156 00:09:17,360 --> 00:09:20,000 Speaker 3: What's your actual revenue source at this point? Because I 157 00:09:20,040 --> 00:09:24,000 Speaker 3: know you're big on subscriptions, you experimented with ads. I 158 00:09:24,040 --> 00:09:26,280 Speaker 3: think you might have paused that at this point, but 159 00:09:26,320 --> 00:09:28,880 Speaker 3: it does seem, if you're trying to be different to Google, 160 00:09:29,280 --> 00:09:31,520 Speaker 3: you probably don't want to be as ad driven as 161 00:09:31,559 --> 00:09:31,920 Speaker 3: they are. 162 00:09:32,559 --> 00:09:36,560 Speaker 4: Yeah. So we're seeing tremendous traction with both consumer subscriptions 163 00:09:36,600 --> 00:09:40,080 Speaker 4: and then companies buying the enterprise version of Perplexity for 164 00:09:40,160 --> 00:09:46,240 Speaker 4: their employees. Ultimately, it's in that enterprise context where having accurate, 165 00:09:46,480 --> 00:09:51,679 Speaker 4: useful knowledge is incredibly valuable. And so, you know, we've 166 00:09:51,840 --> 00:09:55,800 Speaker 4: been fortunate, with an enterprise go-to-market 167 00:09:55,840 --> 00:09:58,480 Speaker 4: team of just five people, to see that business 168 00:09:58,559 --> 00:10:01,760 Speaker 4: more than 8x this year, and to see 169 00:10:01,840 --> 00:10:05,280 Speaker 4: many multiples of increase in our consumer subscriptions. And 170 00:10:05,480 --> 00:10:11,560 Speaker 4: we're excited about building products that are user-aligned, that 171 00:10:11,760 --> 00:10:15,959 Speaker 4: also create marketing value for brands and advertisers. But this 172 00:10:16,000 --> 00:10:19,160 Speaker 4: isn't going to be rebuilding AdWords and click-based systems.
173 00:10:19,640 --> 00:10:23,200 Speaker 4: We think it's important to create real value, so that 174 00:10:23,240 --> 00:10:27,880 Speaker 4: when your personal AI agent powered by Perplexity is evaluating 175 00:10:28,280 --> 00:10:32,400 Speaker 4: a potential commercial opportunity, something you might buy, imagine, you know, 176 00:10:32,720 --> 00:10:35,920 Speaker 4: alongside the answer you get an offer for an exclusive discount. 177 00:10:36,600 --> 00:10:39,640 Speaker 4: That would be something where your agent would say: hey, 178 00:10:39,679 --> 00:10:42,480 Speaker 4: even though someone's paying for this to 179 00:10:42,520 --> 00:10:45,040 Speaker 4: show up here, this is actually a good deal. If 180 00:10:45,040 --> 00:10:47,160 Speaker 4: it's just an ad that says click me, click me, 181 00:10:47,280 --> 00:10:50,160 Speaker 4: click me, we don't think that's going to be sustainable, 182 00:10:50,960 --> 00:10:53,320 Speaker 4: and that's not the type of marketing products we're looking 183 00:10:53,360 --> 00:10:53,679 Speaker 4: to build. 184 00:11:09,960 --> 00:11:12,400 Speaker 2: Can we talk a little bit about the fundraising environment 185 00:11:12,480 --> 00:11:15,160 Speaker 2: for AI? I mean, obviously we know how crazy, you know, 186 00:11:15,240 --> 00:11:16,120 Speaker 2: intense things are in the world. 187 00:11:16,040 --> 00:11:18,080 Speaker 3: Are people throwing cash at you as you 188 00:11:18,120 --> 00:11:18,920 Speaker 3: walk down the street? 189 00:11:19,160 --> 00:11:21,640 Speaker 2: But also it feels like every day. It 190 00:11:21,720 --> 00:11:23,920 Speaker 2: used to be, when I paid attention to startups, 191 00:11:23,960 --> 00:11:25,760 Speaker 2: it felt like there were discrete rounds. There's a seed 192 00:11:26,040 --> 00:11:27,760 Speaker 2: and then an A, and then a B and maybe 193 00:11:27,760 --> 00:11:30,079 Speaker 2: a C, and then an IPO.
When I read, 194 00:11:30,200 --> 00:11:33,280 Speaker 2: it feels like every day there is just... are 195 00:11:33,320 --> 00:11:35,560 Speaker 2: the rounds blending into each other, so that 196 00:11:35,640 --> 00:11:38,840 Speaker 2: it's just this sort of ongoing stream of, oh, here's 197 00:11:38,840 --> 00:11:41,520 Speaker 2: a new raise, et cetera, here we got some new chips 198 00:11:41,520 --> 00:11:43,520 Speaker 2: and then we are giving them some equity, et cetera? 199 00:11:43,760 --> 00:11:46,360 Speaker 2: Is the line between rounds blurring? 200 00:11:47,240 --> 00:11:50,120 Speaker 4: Yeah. So before Perplexity, I mean, I spent most of 201 00:11:50,120 --> 00:11:52,960 Speaker 4: my career at consumer internet companies, with a stint as 202 00:11:53,080 --> 00:11:56,720 Speaker 4: a founder of a robotics startup, and AI is definitely 203 00:11:56,760 --> 00:12:00,920 Speaker 4: a different fundraising environment than that was, which was very challenging. 204 00:12:01,800 --> 00:12:06,400 Speaker 4: I think, at its core, a lot of investors believe, 205 00:12:06,440 --> 00:12:09,800 Speaker 4: and we agree with this belief, that what's happening with 206 00:12:09,840 --> 00:12:12,880 Speaker 4: AI now is as profound as the original introduction of 207 00:12:12,920 --> 00:12:16,480 Speaker 4: the Internet, and so there are going to be 208 00:12:16,520 --> 00:12:19,120 Speaker 4: a wave of new generational companies that are being built 209 00:12:19,160 --> 00:12:23,360 Speaker 4: right now. And, you know, a lot of the value 210 00:12:23,400 --> 00:12:27,600 Speaker 4: is actually, you know, compounding while these companies are 211 00:12:27,600 --> 00:12:30,920 Speaker 4: privately held, and so there's 212 00:12:30,960 --> 00:12:34,360 Speaker 4: a lot of conviction by private investors in that.
213 00:12:35,040 --> 00:12:39,040 Speaker 4: And I think a second piece is that investors themselves 214 00:12:39,360 --> 00:12:43,160 Speaker 4: are power users of products like Perplexity. They are people 215 00:12:43,200 --> 00:12:45,560 Speaker 4: that have a lot of questions, do a lot of research, 216 00:12:45,960 --> 00:12:49,600 Speaker 4: and so they have not just an abstract understanding of 217 00:12:49,640 --> 00:12:52,880 Speaker 4: a balance sheet, they have their own intuition, you know, 218 00:12:52,880 --> 00:12:54,840 Speaker 4: which is very different than, like, B2B SaaS, 219 00:12:54,880 --> 00:12:57,000 Speaker 4: where it's like, you know, you have to go do 220 00:12:57,320 --> 00:13:01,120 Speaker 4: thirty customer interviews and you're kind of getting all 221 00:13:01,160 --> 00:13:04,480 Speaker 4: the second-order information. You know, with Perplexity, you use it yourself. 222 00:13:05,160 --> 00:13:07,840 Speaker 4: You understand the value, the time that you just saved, 223 00:13:08,200 --> 00:13:11,320 Speaker 4: the creative expansion you just had, and that gives 224 00:13:11,360 --> 00:13:15,520 Speaker 4: investors conviction. And so that is, you know, fueling 225 00:13:15,559 --> 00:13:19,880 Speaker 4: this environment. Is it at times, you know, too frothy? 226 00:13:20,480 --> 00:13:20,800 Speaker 3: Sure. 227 00:13:21,600 --> 00:13:23,160 Speaker 4: You know, are there going to be some investors that 228 00:13:23,480 --> 00:13:29,400 Speaker 4: make the wrong bets? Absolutely. But the core human need 229 00:13:29,800 --> 00:13:33,880 Speaker 4: of getting good answers to good questions, you know, we 230 00:13:33,880 --> 00:13:38,280 Speaker 4: think that is boundless, and so we're confident 231 00:13:38,320 --> 00:13:39,240 Speaker 4: in the path ahead.
232 00:13:39,480 --> 00:13:42,240 Speaker 3: Well, speaking of froth, is there anyone you turn away 233 00:13:42,400 --> 00:13:43,680 Speaker 3: in terms of being an investor? 234 00:13:44,559 --> 00:13:48,800 Speaker 4: Yeah. So there's this phenomenon that tends to happen 235 00:13:49,160 --> 00:13:56,360 Speaker 4: in bubbly periods of multi-layered SPVs. SPVs are 236 00:13:56,360 --> 00:14:01,000 Speaker 4: special purpose vehicles, and so you basically have many layers 237 00:14:01,040 --> 00:14:05,839 Speaker 4: of outsourced fundraising, where you have, you know, principal-agent 238 00:14:05,920 --> 00:14:09,720 Speaker 4: misalignment, where the fundraiser is not making money off the investment, 239 00:14:09,800 --> 00:14:12,560 Speaker 4: they're just making money off the management fee. And so 240 00:14:12,840 --> 00:14:16,520 Speaker 4: we often get approached by players in that space, and, 241 00:14:16,880 --> 00:14:19,440 Speaker 4: you know, whenever we can detect that's what it is, 242 00:14:19,960 --> 00:14:22,440 Speaker 4: we avoid taking on funds like that, because we 243 00:14:22,480 --> 00:14:26,800 Speaker 4: want investors who are getting direct access to Perplexity and 244 00:14:26,840 --> 00:14:28,880 Speaker 4: who are in it for the long haul, who aren't 245 00:14:28,920 --> 00:14:31,120 Speaker 4: just trying to make a quick buck on a management 246 00:14:31,160 --> 00:14:31,680 Speaker 4: fee flip. 247 00:14:32,000 --> 00:14:34,040 Speaker 2: I want to talk about this more. I'm really fascinated. 248 00:14:34,080 --> 00:14:37,440 Speaker 2: We actually did an episode with Vlad Tenev, 249 00:14:37,520 --> 00:14:40,520 Speaker 2: the Robinhood CEO, who was talking about tokenization of 250 00:14:40,640 --> 00:14:43,600 Speaker 2: private assets, and I guess this sort of actually relates 251 00:14:43,640 --> 00:14:46,440 Speaker 2: to the question about discrete rounds.
But does it feel like, 252 00:14:47,240 --> 00:14:50,120 Speaker 2: you know, in theory, if some sort of slice of 253 00:14:50,120 --> 00:14:53,960 Speaker 2: equity in Perplexity may be sold via forwards or these vehicles that 254 00:14:54,000 --> 00:14:56,240 Speaker 2: are linked to it, does it feel like the line 255 00:14:56,280 --> 00:14:58,160 Speaker 2: between what it even means to be a private and 256 00:14:58,200 --> 00:15:02,040 Speaker 2: a public company is inevitably going to disappear? Especially add 257 00:15:02,040 --> 00:15:03,520 Speaker 2: in the fact that we seem to be on this 258 00:15:03,640 --> 00:15:07,440 Speaker 2: road where public companies will not be expected or required 259 00:15:07,480 --> 00:15:11,280 Speaker 2: to report information as timely as they used to. 260 00:15:12,600 --> 00:15:15,000 Speaker 4: I think it's important, when you're a privately held company, 261 00:15:15,520 --> 00:15:19,680 Speaker 4: to control the sale of your equity. And I think 262 00:15:19,720 --> 00:15:22,840 Speaker 4: that's the challenge, you know, why we haven't, you know, 263 00:15:22,880 --> 00:15:25,400 Speaker 4: though we've been approached by a lot of these tokenization schemes. 264 00:15:26,040 --> 00:15:28,800 Speaker 4: And, you know, again, if you have folks on your 265 00:15:28,880 --> 00:15:32,040 Speaker 4: cap table that are not aligned for the long term, 266 00:15:32,480 --> 00:15:35,040 Speaker 4: they may be looking for quick flips, and then that 267 00:15:35,080 --> 00:15:37,440 Speaker 4: creates a weird signal in the market, and that may 268 00:15:37,640 --> 00:15:41,400 Speaker 4: harm, you know, a primary raise, right? And so there's the 269 00:15:41,400 --> 00:15:45,160 Speaker 4: noise that comes with that. It's not like we're 270 00:15:45,200 --> 00:15:50,520 Speaker 4: short on other, you know, investment avenues. So 271 00:15:50,560 --> 00:15:52,760 Speaker 4: it's just an unnecessary kind of noise risk.
272 00:15:53,520 --> 00:15:57,520 Speaker 3: So, speaking of throwing money around, the thirty five billion 273 00:15:57,560 --> 00:16:01,560 Speaker 3: dollar bid for Chrome, what was that? Was that 274 00:16:01,600 --> 00:16:02,080 Speaker 3: a troll? 275 00:16:05,320 --> 00:16:08,400 Speaker 4: When we made that bid, the DOJ 276 00:16:08,560 --> 00:16:13,480 Speaker 4: had still not decided whether Chrome would be divested from Google. 277 00:16:15,000 --> 00:16:19,320 Speaker 4: The web browser, as evidenced by, you know, Amazon's behavior, 278 00:16:19,480 --> 00:16:24,080 Speaker 4: is clearly incredibly valuable surface area, and, you know, 279 00:16:24,120 --> 00:16:27,560 Speaker 4: thirty five billion was a starting bid. You know, at 280 00:16:27,560 --> 00:16:29,280 Speaker 4: this point I can safely say I think Chrome is 281 00:16:29,320 --> 00:16:33,960 Speaker 4: worth more than that. But what I can confidently say 282 00:16:34,040 --> 00:16:37,400 Speaker 4: is we had more than thirty five billion of investor 283 00:16:37,480 --> 00:16:40,200 Speaker 4: interest in making that bid, so it was not a 284 00:16:40,960 --> 00:16:46,160 Speaker 4: constraint on, you know, the funding side. Obviously, Google does 285 00:16:46,200 --> 00:16:48,480 Speaker 4: not want to sell Chrome unless they're forced to, 286 00:16:48,640 --> 00:16:50,800 Speaker 4: and it doesn't look like the government will be forcing them. 287 00:16:51,080 --> 00:16:53,080 Speaker 3: So you're serious about that one. It's not just a 288 00:16:53,160 --> 00:16:55,760 Speaker 3: sort of antitrust flag-waving exercise? 289 00:16:55,880 --> 00:16:59,520 Speaker 4: No, we had, you know, very large sovereign wealth funds 290 00:16:59,520 --> 00:17:02,080 Speaker 4: that, you know, were kind of lined up and 291 00:17:02,120 --> 00:17:02,600 Speaker 4: ready to go.
292 00:17:03,320 --> 00:17:07,320 Speaker 2: Here's a great question from someone anonymous via the app. 293 00:17:08,080 --> 00:17:11,199 Speaker 2: Can you talk about unit margins? You often hear this, 294 00:17:11,320 --> 00:17:14,080 Speaker 2: it's like, oh, these AI companies, they're not even on 295 00:17:14,119 --> 00:17:17,800 Speaker 2: a unit basis making money, given the compute costs, et cetera. 296 00:17:17,880 --> 00:17:21,040 Speaker 2: Can you talk about gross margins? And right 297 00:17:21,080 --> 00:17:23,880 Speaker 2: now, whether, like, you know, a given query is profitable, 298 00:17:23,920 --> 00:17:25,560 Speaker 2: given the cost of delivering it? 299 00:17:26,320 --> 00:17:31,959 Speaker 4: So, a paying Pro subscriber to Perplexity 300 00:17:32,160 --> 00:17:35,560 Speaker 4: is profitable for us. Okay, so the unit margins are 301 00:17:35,600 --> 00:17:40,119 Speaker 4: strong. The free users, right now, because we do not 302 00:17:40,359 --> 00:17:45,639 Speaker 4: monetize them, you know, we lose money on 303 00:17:45,680 --> 00:17:50,840 Speaker 4: those. For AI companies, I think something that is underappreciated: they 304 00:17:50,920 --> 00:17:54,720 Speaker 4: generally, actually, when they do the accounting of free users, 305 00:17:55,280 --> 00:17:58,600 Speaker 4: attribute that to research and development, because you're still 306 00:17:58,720 --> 00:18:03,639 Speaker 4: using those prompts to improve the model. So, you know, 307 00:18:04,000 --> 00:18:07,760 Speaker 4: it does allow for your unit economics to potentially appear 308 00:18:07,840 --> 00:18:09,560 Speaker 4: more pristine than they are. 309 00:18:09,680 --> 00:18:12,800 Speaker 2: Are all Pro users profitable?
Because this also strikes me 310 00:18:12,800 --> 00:18:15,560 Speaker 2: as something unique with AI, which is that, you know, 311 00:18:15,560 --> 00:18:17,960 Speaker 2: back when Twitter was in its infancy, power users 312 00:18:17,960 --> 00:18:19,880 Speaker 2: were good. It feels like, in the case with AI, 313 00:18:20,000 --> 00:18:22,720 Speaker 2: sometimes you get these power users where you're losing money 314 00:18:22,720 --> 00:18:24,879 Speaker 2: on them. Are all Pro users profitable, or do some 315 00:18:25,040 --> 00:18:26,600 Speaker 2: use it so much that it is more than the 316 00:18:26,640 --> 00:18:27,440 Speaker 2: subscription fee? 317 00:18:27,800 --> 00:18:30,879 Speaker 4: I mean, it's not a material amount. And 318 00:18:31,000 --> 00:18:34,000 Speaker 4: we've also introduced a Max tier. So we have a 319 00:18:34,000 --> 00:18:37,520 Speaker 4: twenty-dollar-a-month Pro subscription, and for the folks that want, 320 00:18:37,640 --> 00:18:41,560 Speaker 4: you know, the most expensive reasoning models and higher limits 321 00:18:41,560 --> 00:18:43,600 Speaker 4: on them, there's a two-hundred-dollar-a-month product called 322 00:18:43,640 --> 00:18:46,600 Speaker 4: Max, and that also gets you access to our email assistant. 323 00:18:47,280 --> 00:18:50,240 Speaker 4: And so, I think, you know, other companies 324 00:18:50,240 --> 00:18:53,640 Speaker 4: have also introduced kind of higher tiers of subscription. So 325 00:18:54,000 --> 00:18:57,600 Speaker 4: this is not a problem that's necessarily unique to AI, 326 00:18:57,800 --> 00:19:01,080 Speaker 4: and you kind of introduce tiers of service to 327 00:19:01,080 --> 00:19:02,280 Speaker 4: support it at scale. 328 00:19:03,040 --> 00:19:05,639 Speaker 3: What does your expenditure actually look like at this point? Like, 329 00:19:05,680 --> 00:19:07,600 Speaker 3: what are you spending the most money on?
Is it... 330 00:19:08,080 --> 00:19:12,000 Speaker 3: we read about these insane salaries for engineers going to 331 00:19:12,320 --> 00:19:15,880 Speaker 3: Claude or OpenAI or whatever. Is it talent retention, 332 00:19:16,320 --> 00:19:19,960 Speaker 3: or is it compute power? Is it the content deals 333 00:19:20,040 --> 00:19:22,119 Speaker 3: with media organizations? What are you spending on? 334 00:19:22,600 --> 00:19:25,480 Speaker 4: So the biggest expense is inference, right? So it is 335 00:19:25,520 --> 00:19:29,240 Speaker 4: the cost of compute. Uh, when you're asking a 336 00:19:29,320 --> 00:19:32,760 Speaker 4: question and running it through an LLM, you know, 337 00:19:32,840 --> 00:19:37,560 Speaker 4: that is, uh, that is a material expenditure. 338 00:19:37,920 --> 00:19:41,920 Speaker 4: You know, Perplexity doesn't do pre-training. So the sorts 339 00:19:41,960 --> 00:19:45,760 Speaker 4: of AI PhDs that other companies are paying, you know, 340 00:19:46,480 --> 00:19:50,160 Speaker 4: reportedly hundreds of millions for, we're not that. 341 00:19:50,160 --> 00:19:52,000 Speaker 4: We're not in the market for that. We 342 00:19:52,040 --> 00:19:55,919 Speaker 4: have incredible AI application engineers, and that's what we 343 00:19:55,960 --> 00:20:01,080 Speaker 4: recruit for. Ultimately, engineers that are joining Perplexity are joining because 344 00:20:01,400 --> 00:20:04,399 Speaker 4: they believe we're just getting started, right, and they're coming 345 00:20:04,520 --> 00:20:09,560 Speaker 4: for an equity appreciation story that's, you know, already bearing fruit. 346 00:20:10,160 --> 00:20:12,280 Speaker 4: But we're just at the beginning of that journey. 347 00:20:12,440 --> 00:20:15,840 Speaker 2: Maybe this is sort of a way of restating Tracy's question, 348 00:20:15,920 --> 00:20:19,159 Speaker 2: but it feels like in any AI conversation.
It's like, 349 00:20:19,200 --> 00:20:20,680 Speaker 2: what is the bottleneck right now? 350 00:20:20,760 --> 00:20:20,920 Speaker 1: Right? 351 00:20:21,200 --> 00:20:24,480 Speaker 2: Is it talent? Is it capital? Is it compute? Is 352 00:20:24,560 --> 00:20:25,320 Speaker 2: it power? 353 00:20:25,480 --> 00:20:25,680 Speaker 3: Etc? 354 00:20:26,080 --> 00:20:29,240 Speaker 2: Like, if you could have unlimited amounts of any one 355 00:20:29,240 --> 00:20:31,920 Speaker 2: of those things, what would you want more of right now? 356 00:20:33,320 --> 00:20:35,840 Speaker 4: So I think it's important to bifurcate. There's 357 00:20:35,920 --> 00:20:39,480 Speaker 4: the companies that do the training of foundation models, and 358 00:20:39,560 --> 00:20:42,359 Speaker 4: those are the ones that are really pushing for the 359 00:20:42,440 --> 00:20:47,600 Speaker 4: massive infrastructure buildouts. Yeah, we're not constrained on compute for 360 00:20:47,680 --> 00:20:52,760 Speaker 4: inference at the current usage levels. But as, you know, 361 00:20:52,960 --> 00:20:55,520 Speaker 4: behavior shifts... I mean, by a show of hands in 362 00:20:55,560 --> 00:20:58,680 Speaker 4: the room, like, how many people have stopped primarily using 363 00:20:58,720 --> 00:21:03,600 Speaker 4: Google when they have an informational question? So, you know, 364 00:21:03,880 --> 00:21:07,000 Speaker 4: about eighty percent of the room, you know, raised 365 00:21:07,040 --> 00:21:10,280 Speaker 4: their hands, and we see the early adopter behavior in, 366 00:21:10,680 --> 00:21:12,760 Speaker 4: you know, audiences like this, you know, folks that work 367 00:21:12,760 --> 00:21:16,320 Speaker 4: in financial services, people, again, for whom quickly getting 368 00:21:16,440 --> 00:21:21,399 Speaker 4: high-quality information matters. As, you know, this behavior becomes 369 00:21:21,440 --> 00:21:24,240 Speaker 4: more mainstream, we obviously will need more compute for inference.
370 00:21:26,160 --> 00:21:30,240 Speaker 4: But the folks that are kind of talking 371 00:21:30,240 --> 00:21:34,399 Speaker 4: about trillions of dollars in investment, it's really a 372 00:21:34,400 --> 00:21:39,560 Speaker 4: different universe than ours. It's unclear to me why some 373 00:21:39,640 --> 00:21:42,000 Speaker 4: of those investments are needed. Sometimes it feels like a 374 00:21:42,160 --> 00:21:46,080 Speaker 4: strategy of, you know, kind of what the US 375 00:21:46,200 --> 00:21:47,840 Speaker 4: was trying to do with the Soviet Union, of just 376 00:21:47,880 --> 00:21:52,000 Speaker 4: getting your competitors to go broke by, you know, 377 00:21:52,040 --> 00:21:54,280 Speaker 4: being able to raise the most capital. And we'll see 378 00:21:54,320 --> 00:21:55,040 Speaker 4: how that plays out. 379 00:21:55,920 --> 00:21:58,560 Speaker 3: Since you mentioned financial services, we have a good question 380 00:21:58,840 --> 00:22:03,000 Speaker 3: from the audience. And we've done some media navel-gazing, 381 00:22:03,040 --> 00:22:04,600 Speaker 3: I'm sure we're going to do some more, but for 382 00:22:04,640 --> 00:22:07,840 Speaker 3: now, financial services navel-gazing. What do you think the 383 00:22:07,880 --> 00:22:10,639 Speaker 3: impact of AI will be on places like Lazard and 384 00:22:10,720 --> 00:22:11,800 Speaker 3: other Wall Street firms? 385 00:22:13,680 --> 00:22:17,159 Speaker 4: I just think this is going to be an incredible 386 00:22:17,760 --> 00:22:23,719 Speaker 4: flowering of creativity and productivity lift. I think the value of using 387 00:22:23,760 --> 00:22:26,560 Speaker 4: a product like Perplexity and other AI tools in the 388 00:22:26,640 --> 00:22:28,640 Speaker 4: day-to-day of somebody that works in an investment bank 389 00:22:28,760 --> 00:22:32,960 Speaker 4: is really twofold.
The first is saving you eighty percent 390 00:22:33,000 --> 00:22:36,600 Speaker 4: of the time to get to the eighty percent first 391 00:22:36,680 --> 00:22:39,879 Speaker 4: draft of a work product. Right? In investment banks, that's 392 00:22:39,920 --> 00:22:43,679 Speaker 4: kind of putting together pitch decks, doing modeling, just 393 00:22:43,800 --> 00:22:47,720 Speaker 4: gaining tremendous efficiency. And it's important to distinguish, it's not 394 00:22:47,880 --> 00:22:52,000 Speaker 4: about making that workflow autonomous. You still absolutely need human 395 00:22:52,080 --> 00:22:56,359 Speaker 4: judgment on what the end output should be, how to 396 00:22:56,400 --> 00:22:59,520 Speaker 4: approach it. This is not about replacing investment bankers. It's 397 00:22:59,560 --> 00:23:03,560 Speaker 4: about giving them tremendous leverage. I think the more disruptive 398 00:23:03,600 --> 00:23:07,879 Speaker 4: impact is, now that it got a lot easier to 399 00:23:07,920 --> 00:23:11,600 Speaker 4: begin a new initiative, to begin a new prospect, how 400 00:23:11,640 --> 00:23:15,320 Speaker 4: many more new engagements are you going to be able 401 00:23:15,320 --> 00:23:19,040 Speaker 4: to spin up? Right? Because the biggest constraint on new 402 00:23:19,080 --> 00:23:22,600 Speaker 4: business is an employee feeling they have the bandwidth to 403 00:23:22,600 --> 00:23:27,479 Speaker 4: take it on. And, you know, experienced leaders now have 404 00:23:27,760 --> 00:23:29,640 Speaker 4: much more leverage, and I think that's going to play 405 00:23:29,640 --> 00:23:33,240 Speaker 4: a profound role at Lazard, and, you know, it's part of why 406 00:23:33,280 --> 00:23:34,840 Speaker 4: I'm excited to be part of the organization.
407 00:23:51,280 --> 00:23:54,880 Speaker 2: I love these audience questions, because they're saying things that 408 00:23:55,119 --> 00:23:57,399 Speaker 2: I would think, that are reminding me, and that I 409 00:23:57,440 --> 00:23:59,680 Speaker 2: would have been rude to say. You didn't answer the 410 00:23:59,720 --> 00:24:01,800 Speaker 2: first question about, are you going to kill journalism? But 411 00:24:01,880 --> 00:24:04,600 Speaker 2: someone says, you didn't answer the first question: how would 412 00:24:04,640 --> 00:24:08,320 Speaker 2: journalism survive when AI has skinned it? I get, oh, yeah, 413 00:24:08,480 --> 00:24:11,000 Speaker 2: you know, AI isn't interested in asking questions, but 414 00:24:11,440 --> 00:24:15,080 Speaker 2: it's not... the question is, like, why would brands continue 415 00:24:15,080 --> 00:24:17,480 Speaker 2: to, like, build out these presences for, like, a human 416 00:24:17,560 --> 00:24:21,360 Speaker 2: reading and scrolling, when I can just go to one 417 00:24:21,440 --> 00:24:24,000 Speaker 2: destination that doesn't necessarily click me anywhere else? 418 00:24:24,560 --> 00:24:27,960 Speaker 4: Well, I mean, when I look at the earnings numbers 419 00:24:27,960 --> 00:24:30,119 Speaker 4: for the Wall Street Journal, the New York Times, for 420 00:24:30,720 --> 00:24:34,400 Speaker 4: Bloomberg, media companies seem to be doing fine. 421 00:24:34,440 --> 00:24:36,840 Speaker 2: Well, the New York Times is doing well. Games. And I don't 422 00:24:36,880 --> 00:24:38,760 Speaker 2: know if you get earnings numbers, but... 423 00:24:38,640 --> 00:24:41,639 Speaker 4: What I hear, you know, like, some subscription numbers. 424 00:24:41,640 --> 00:24:44,600 Speaker 2: But by the way, it is doing a lot more 425 00:24:44,680 --> 00:24:47,520 Speaker 2: than Games, actually. No, it's a very impressive company.
426 00:24:47,520 --> 00:24:50,080 Speaker 2: But by and large, it does not look like the 427 00:24:50,119 --> 00:24:51,119 Speaker 2: actual news business. 428 00:24:51,200 --> 00:24:53,639 Speaker 3: It's the lifestyle company that's doing particularly well. 429 00:24:53,680 --> 00:24:56,240 Speaker 4: Well, but I think it's part of the value of 430 00:24:56,240 --> 00:24:58,919 Speaker 4: the bundle. I mean, this is 431 00:24:59,119 --> 00:25:03,280 Speaker 4: very analogous to when the Internet was first introduced. You 432 00:25:03,400 --> 00:25:06,920 Speaker 4: had traditional brick-and-mortar retailers, and the best brick-and-mortar 433 00:25:06,960 --> 00:25:11,199 Speaker 4: retailers adapted to the digital environment, and the 434 00:25:11,240 --> 00:25:16,040 Speaker 4: best publishers are adapting and gaining leverage with AI. I mean, 435 00:25:16,160 --> 00:25:21,840 Speaker 4: last night we had the CEO of Axel Springer talk 436 00:25:21,920 --> 00:25:24,800 Speaker 4: about how he's leaning into AI and it's actually an 437 00:25:24,800 --> 00:25:29,840 Speaker 4: incredible opportunity for productivity. And so I don't think this 438 00:25:29,920 --> 00:25:32,639 Speaker 4: is zero-sum. I do think there's going to be 439 00:25:32,640 --> 00:25:37,240 Speaker 4: some publishers that thrive and others that won't. And the media business 440 00:25:37,240 --> 00:25:40,639 Speaker 4: was already facing challenges, you know, in certain segments of it, 441 00:25:40,720 --> 00:25:44,000 Speaker 4: well before AI. So I don't see this as being 442 00:25:44,080 --> 00:25:45,359 Speaker 4: kind of the fundamental tension. 443 00:25:46,359 --> 00:25:48,560 Speaker 3: But if you fast forward five or ten years from now, 444 00:25:48,680 --> 00:25:51,000 Speaker 3: what does the front page of the Internet actually look 445 00:25:51,160 --> 00:25:53,679 Speaker 3: like to you?
And what exactly is, I guess, the 446 00:25:53,760 --> 00:25:58,160 Speaker 3: economic pact between media and platforms such as yourself? What 447 00:25:58,160 --> 00:26:03,040 Speaker 3: would the ideal pact agreement actually look like? Basically, 448 00:26:03,080 --> 00:26:05,880 Speaker 3: give me your vision of the Internet in five 449 00:26:05,760 --> 00:26:11,240 Speaker 4: years. Sure, two seconds. We're introducing something called Comet Plus, 450 00:26:12,119 --> 00:26:15,760 Speaker 4: which we already have The Washington Post and Condé Nast, 451 00:26:15,880 --> 00:26:18,679 Speaker 4: the LA Times, many great publishers signed on for, and it 452 00:26:18,720 --> 00:26:23,600 Speaker 4: was really inspired by Apple News Plus, where if you 453 00:26:23,880 --> 00:26:26,120 Speaker 4: are a Comet Plus subscriber, which is five dollars a month, 454 00:26:26,160 --> 00:26:29,080 Speaker 4: and we bundle it into Perplexity Pro, so even people 455 00:26:29,119 --> 00:26:31,639 Speaker 4: that didn't necessarily, you know, want to pay for it, 456 00:26:31,680 --> 00:26:35,399 Speaker 4: we're taking money out of our cut and feeding it 457 00:26:35,440 --> 00:26:39,360 Speaker 4: into this program to seed it. As you consume premium 458 00:26:39,400 --> 00:26:43,000 Speaker 4: content in the Comet browser, we're then sharing that revenue 459 00:26:43,000 --> 00:26:46,879 Speaker 4: back proportionally with publishers. And it's actually accounting for not 460 00:26:47,119 --> 00:26:50,639 Speaker 4: just page views of premium content, but agent actions on 461 00:26:50,720 --> 00:26:55,200 Speaker 4: that content and citations in the Comet browser. And so, ultimately, 462 00:26:56,000 --> 00:27:00,840 Speaker 4: I think subscriptions are a very powerful business model for media. 463 00:27:01,680 --> 00:27:04,200 Speaker 4: I think there's a lot more to do there, and 464 00:27:04,440 --> 00:27:06,000 Speaker 4: we want to play a leading role in doing that.
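[Editor's note: For readers curious about the mechanics, the proportional revenue share described above can be sketched in a few lines of Python. Every number, weight, and publisher name below is hypothetical; Perplexity has not published the actual Comet Plus formula, only that payouts account for page views, agent actions, and citations.]

```python
# Illustrative sketch of a proportional revenue-share pool like the one
# described for Comet Plus: each publisher receives a share of the pool
# proportional to a weighted engagement score built from page views,
# agent actions, and citations. Weights and figures are hypothetical.

def share_pool(pool, engagement, weights):
    """Split `pool` dollars across publishers in proportion to each
    publisher's weighted engagement score."""
    scores = {
        pub: sum(weights[metric] * counts.get(metric, 0) for metric in weights)
        for pub, counts in engagement.items()
    }
    total = sum(scores.values())
    return {pub: pool * score / total for pub, score in scores.items()}

# Hypothetical weights: an agent action counts double a page view.
weights = {"page_views": 1.0, "agent_actions": 2.0, "citations": 0.5}
engagement = {
    "publisher_a": {"page_views": 800, "agent_actions": 50, "citations": 400},
    "publisher_b": {"page_views": 200, "agent_actions": 150, "citations": 100},
}
payouts = share_pool(10_000.0, engagement, weights)
print(payouts)  # publisher_a's higher weighted score earns the larger share
```

Because the split is purely proportional, the pool is always paid out in full; the interesting design question is how to weight an agent action or a citation relative to a page view, which is exactly the accounting choice described above.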
465 00:27:06,160 --> 00:27:11,439 Speaker 2: I'm very impressed by the finance module within Perplexity. You know, 466 00:27:11,480 --> 00:27:14,800 Speaker 2: there's a whole graveyard of companies that have tried to 467 00:27:14,800 --> 00:27:20,440 Speaker 2: take down our employer at Bloomberg. Do you see yourself 468 00:27:20,560 --> 00:27:24,480 Speaker 2: as a potential challenger company? Tracy asked about, you know, 469 00:27:24,600 --> 00:27:26,960 Speaker 2: financial services, et cetera. Do you see yourself, at some 470 00:27:26,960 --> 00:27:29,840 Speaker 2: point down the road, as a challenger to the terminal? 471 00:27:30,119 --> 00:27:32,760 Speaker 4: Yeah? I mean, how many people have access to a Bloomberg Terminal? 472 00:27:32,760 --> 00:27:35,520 Speaker 4: It's like, I mean, you guys... like, it's 473 00:27:35,520 --> 00:27:39,800 Speaker 4: not millions, right? So yeah, it's not millions. So for us, 474 00:27:39,840 --> 00:27:42,639 Speaker 4: the success of Perplexity Finance, it's similar to what 475 00:27:42,680 --> 00:27:45,439 Speaker 4: I was saying about with Amazon and people that have 476 00:27:45,480 --> 00:27:48,520 Speaker 4: personal shoppers. Like, Bloomberg Terminal is, like... this 477 00:27:48,640 --> 00:27:51,719 Speaker 4: is, like, company. 478 00:27:51,800 --> 00:27:53,879 Speaker 2: And so when you're thinking about the Lazards of the 479 00:27:53,880 --> 00:27:55,800 Speaker 2: world and all the other banks, could you at some 480 00:27:55,840 --> 00:27:59,760 Speaker 2: point see yourself pitching Perplexity, or a Perplexity Terminal, in a 481 00:27:59,800 --> 00:28:01,439 Speaker 2: way all these other ones had failed to do it? 482 00:28:02,119 --> 00:28:05,480 Speaker 4: I think for us, the bigger 483 00:28:05,480 --> 00:28:08,840 Speaker 4: opportunity is not to go after the Bloomberg Terminal.
It's giving 484 00:28:09,200 --> 00:28:13,560 Speaker 4: everyone who can't afford a Bloomberg Terminal a version of it 485 00:28:13,720 --> 00:28:17,000 Speaker 4: that is useful for them, and when they're making retail 486 00:28:17,080 --> 00:28:20,280 Speaker 4: stock investment decisions, when they're understanding the interplay of the 487 00:28:20,320 --> 00:28:23,399 Speaker 4: markets and the news cycle, that they have access to 488 00:28:23,520 --> 00:28:27,520 Speaker 4: powerful technology. Right? So it's not... I don't think we 489 00:28:27,600 --> 00:28:31,200 Speaker 4: can kill Bloomberg Terminal. Like, people are there for the chatting. 490 00:28:31,280 --> 00:28:33,639 Speaker 4: Like, there's all kinds of features there that we're not 491 00:28:34,080 --> 00:28:37,960 Speaker 4: trying to replace. Thank you. And it's an amazing product 492 00:28:38,080 --> 00:28:41,040 Speaker 4: and an amazing brand. I think for us this is 493 00:28:41,080 --> 00:28:44,760 Speaker 4: about democratizing, expanding the scope of who has access to 494 00:28:44,960 --> 00:28:46,440 Speaker 4: premium financial intelligence. 495 00:28:47,320 --> 00:28:50,560 Speaker 3: Another question from the audience. This one's from Oliviate Chastan: 496 00:28:51,280 --> 00:28:53,400 Speaker 3: What do you think will be the key skill sets 497 00:28:53,400 --> 00:28:57,440 Speaker 3: for media entertainment executives to manage the impact of AI 498 00:28:57,600 --> 00:28:58,920 Speaker 3: on content industries? 499 00:29:02,000 --> 00:29:04,600 Speaker 4: I mean, I think my answer to this is the 500 00:29:04,640 --> 00:29:07,160 Speaker 4: answer for, I think, the key skill set for any 501 00:29:07,200 --> 00:29:10,040 Speaker 4: professional in the age of AI: you have to 502 00:29:10,040 --> 00:29:13,880 Speaker 4: get really good at asking the right questions.
And so 503 00:29:14,000 --> 00:29:18,000 Speaker 4: that means asking your teams why they're working on 504 00:29:18,000 --> 00:29:21,040 Speaker 4: what they're working on, what's the leverage, what's the unique differentiation. 505 00:29:21,720 --> 00:29:24,600 Speaker 4: And so it is approaching their business with a profound 506 00:29:24,640 --> 00:29:29,840 Speaker 4: curiosity and an understanding that, you know, there are new 507 00:29:30,440 --> 00:29:34,080 Speaker 4: disruption and leverage opportunities, and to kind of lead with 508 00:29:34,120 --> 00:29:34,920 Speaker 4: that curiosity. 509 00:29:35,320 --> 00:29:37,520 Speaker 3: Everyone always says that, but what does a good prompt 510 00:29:37,600 --> 00:29:39,320 Speaker 3: actually look like to you? Can you give us a 511 00:29:39,320 --> 00:29:42,040 Speaker 3: specific example of, like, a prompt that really impressed you? 512 00:29:43,280 --> 00:29:45,680 Speaker 4: I think, to me, asking a good question, it's, like, 513 00:29:45,720 --> 00:29:49,160 Speaker 4: a level deeper than, like, the prompt engineering, which 514 00:29:49,240 --> 00:29:52,400 Speaker 4: is kind of... that in some ways will get abstracted 515 00:29:52,520 --> 00:29:55,120 Speaker 4: very soon. When I say asking a good question, it 516 00:29:55,200 --> 00:29:59,560 Speaker 4: means a question that embodies and embeds all the known 517 00:30:00,200 --> 00:30:03,640 Speaker 4: knowledge on a given topic. And the question you're asking is 518 00:30:03,680 --> 00:30:06,640 Speaker 4: then, you know, where do you push the frontier? Like, 519 00:30:06,680 --> 00:30:10,240 Speaker 4: what is the thing that is not yet knowable that 520 00:30:10,360 --> 00:30:13,360 Speaker 4: in your question you're pushing your organization on, you know?
521 00:30:13,400 --> 00:30:16,840 Speaker 2: Going back to the browser question, about the bid for 522 00:30:16,960 --> 00:30:20,720 Speaker 2: Chrome from Google itself earlier this year. There are a lot 523 00:30:20,720 --> 00:30:23,520 Speaker 2: of people, oh, Google is going to get disrupted by AI, 524 00:30:23,640 --> 00:30:26,280 Speaker 2: and it's a classic innovator's dilemma, they invented all this 525 00:30:26,360 --> 00:30:29,240 Speaker 2: tech and they're not making... Now, the stock market view is 526 00:30:29,400 --> 00:30:32,960 Speaker 2: Google's done phenomenally, or Alphabet, they've done phenomenally well, and 527 00:30:33,000 --> 00:30:35,240 Speaker 2: a lot of people have really changed their tune on them. 528 00:30:35,600 --> 00:30:38,000 Speaker 2: If the browser is such an important part of the 529 00:30:38,040 --> 00:30:40,960 Speaker 2: real estate here, does that just give them a massive 530 00:30:41,400 --> 00:30:43,080 Speaker 2: leg up? I mean, I know you're going to launch, 531 00:30:43,200 --> 00:30:45,600 Speaker 2: have your own, and OpenAI is going to have 532 00:30:45,720 --> 00:30:48,719 Speaker 2: its own, but Chrome is everywhere. Like, that sounds 533 00:30:48,760 --> 00:30:50,080 Speaker 2: like a very big moat. 534 00:30:51,000 --> 00:30:55,520 Speaker 4: It's a very powerful distribution leverage point. But their greatest 535 00:30:55,520 --> 00:30:58,840 Speaker 4: strength is also a weakness, in that they built it 536 00:30:58,960 --> 00:31:03,840 Speaker 4: around an AdWords system where they're reliant on people 537 00:31:03,880 --> 00:31:07,760 Speaker 4: clicking on links, and the behavior of clicking on promoted 538 00:31:07,760 --> 00:31:10,480 Speaker 4: links will just matter less in the future of the 539 00:31:10,520 --> 00:31:12,040 Speaker 4: Internet than it has in the past.
540 00:31:12,280 --> 00:31:14,800 Speaker 2: Speaking of promoted links, this is another question I've been 541 00:31:14,800 --> 00:31:18,000 Speaker 2: wondering about. Sometimes I see these companies and they advertise 542 00:31:18,000 --> 00:31:20,200 Speaker 2: on LinkedIn, and they offer what I think would be, sort 543 00:31:20,200 --> 00:31:22,680 Speaker 2: of, the AI equivalent 544 00:31:22,680 --> 00:31:25,400 Speaker 2: of SEO. Is that all snake oil? Is there, 545 00:31:25,520 --> 00:31:25,680 Speaker 4: Like? 546 00:31:25,920 --> 00:31:29,200 Speaker 2: whatever, does that exist? Are there these games that 547 00:31:29,280 --> 00:31:31,880 Speaker 2: publishers or any brands can play to do well in 548 00:31:31,960 --> 00:31:33,600 Speaker 2: AI, or is that sort of a fraud? 549 00:31:34,560 --> 00:31:36,720 Speaker 4: I don't want to call it a fraud. I have 550 00:31:36,880 --> 00:31:43,360 Speaker 4: not uncovered... those companies provide analytics, but they are not actionable. Okay, 551 00:31:44,120 --> 00:31:47,720 Speaker 4: and so I have not found an actionable output there. 552 00:31:47,800 --> 00:31:52,440 Speaker 4: And in some ways our purpose is to prevent those 553 00:31:53,200 --> 00:31:56,560 Speaker 4: insights from being actionable, because if you can game a 554 00:31:56,600 --> 00:32:00,239 Speaker 4: Perplexity answer, it means it's not accurate and trustworthy. And 555 00:32:00,320 --> 00:32:03,640 Speaker 4: so, you know, if they do find a hack, like, 556 00:32:04,080 --> 00:32:07,480 Speaker 4: we have better engineers that will fix that hack. 557 00:32:08,280 --> 00:32:11,200 Speaker 3: I have a slightly philosophical question, but is there any 558 00:32:11,280 --> 00:32:15,080 Speaker 3: way to inject a tiny bit of randomness or creativity 559 00:32:15,240 --> 00:32:18,320 Speaker 3: in some of Perplexity's output?
And what I mean by 560 00:32:18,320 --> 00:32:20,280 Speaker 3: that is, I think at this point we're all aware 561 00:32:20,400 --> 00:32:26,760 Speaker 3: of algorithmic content moderation and the idea that if you 562 00:32:27,000 --> 00:32:30,480 Speaker 3: start your Netflix account and you watch one romantic comedy, 563 00:32:30,520 --> 00:32:32,600 Speaker 3: for the rest of your life Netflix is going to 564 00:32:32,600 --> 00:32:35,920 Speaker 3: be throwing, like, Matthew McConaughey films at you forever and 565 00:32:35,960 --> 00:32:38,760 Speaker 3: ever and ever. How do you deal with that aspect 566 00:32:38,800 --> 00:32:42,239 Speaker 3: of it, the risk that people are funneled into, like, 567 00:32:42,360 --> 00:32:46,160 Speaker 3: one particular answer stream or one corner of the Internet? 568 00:32:46,480 --> 00:32:50,160 Speaker 4: So memory will be an important feature of products like Perplexity, 569 00:32:50,640 --> 00:32:54,920 Speaker 4: and the way we avoid that trap is transparency, just 570 00:32:54,920 --> 00:32:57,320 Speaker 4: like we do with sources, where we actually show you 571 00:32:57,840 --> 00:33:00,600 Speaker 4: all of the things that we've added to your memory. 572 00:33:00,960 --> 00:33:03,120 Speaker 4: And if you don't want to be pigeonholed as a 573 00:33:03,240 --> 00:33:07,840 Speaker 4: romantic comedy lover, you just delete that memory. And I mean, 574 00:33:07,880 --> 00:33:11,360 Speaker 4: I'm kind of... I recently got into health supplements, and 575 00:33:11,400 --> 00:33:15,200 Speaker 4: now every time I ask about health supplements, my decision 576 00:33:15,240 --> 00:33:20,840 Speaker 4: to begin, you know, pomegranate supplements is getting mentioned in the answer.
Yeah, 577 00:33:20,960 --> 00:33:23,560 Speaker 4: and I, you know, I deleted that, because I'm like, 578 00:33:23,600 --> 00:33:26,000 Speaker 4: you know what, I don't want it, like, coming back 579 00:33:26,000 --> 00:33:30,120 Speaker 4: to the pomegranates, as great as they are. And that 580 00:33:30,280 --> 00:33:32,440 Speaker 4: was... you know, I just went and deleted 581 00:33:32,440 --> 00:33:34,200 Speaker 4: that memory. I saw where it was, and, you know, 582 00:33:34,960 --> 00:33:37,240 Speaker 4: I'm pomegranate-free in my answers. 583 00:33:41,280 --> 00:33:44,160 Speaker 2: There's a question here about, let's see, the comparison 584 00:33:44,160 --> 00:33:46,600 Speaker 2: to the first wave of the Internet. So this person 585 00:33:46,760 --> 00:33:50,160 Speaker 2: is talking about content creation, AI agent studios, pace of 586 00:33:50,240 --> 00:33:55,360 Speaker 2: change moving very fast. I mean, what's really 587 00:33:55,360 --> 00:33:57,480 Speaker 2: extraordinary to me about the last several years is the 588 00:33:57,520 --> 00:34:00,360 Speaker 2: speed with which we seem to get new paradigms. Right? 589 00:34:00,800 --> 00:34:02,920 Speaker 2: Blogs were once going to be a big deal. 590 00:34:03,080 --> 00:34:04,960 Speaker 2: That was twenty years ago. They had a lifespan of 591 00:34:05,000 --> 00:34:08,359 Speaker 2: a few years. Then, you know, social media, Twitter, etc. 592 00:34:08,640 --> 00:34:12,400 Speaker 2: The speed... things get overturned 593 00:34:12,440 --> 00:34:14,799 Speaker 2: so fast these days. How do you think about making 594 00:34:14,840 --> 00:34:18,440 Speaker 2: investment decisions under such conditions of uncertainty?
595 00:34:19,560 --> 00:34:22,800 Speaker 4: Yeah, this is, I mean, a funny time to drop 596 00:34:22,840 --> 00:34:24,560 Speaker 4: his name, but this is where we love the Jeff 597 00:34:24,600 --> 00:34:27,960 Speaker 4: Bezos line of, you know, the hard thing to 598 00:34:28,000 --> 00:34:30,920 Speaker 4: predict is not what's going to change, but what's going 599 00:34:30,960 --> 00:34:34,320 Speaker 4: to remain the same. I think from our vantage point, 600 00:34:34,719 --> 00:34:38,840 Speaker 4: the bedrock is accuracy and trust, right? And that's actually... 601 00:34:39,400 --> 00:34:44,960 Speaker 4: as intelligence gets much more powerful, it gets more powerful 602 00:34:44,960 --> 00:34:47,360 Speaker 4: at manipulating us, and so that's why kind of 603 00:34:47,360 --> 00:34:53,600 Speaker 4: that trust bedrock is paramount. And that said, one of 604 00:34:53,640 --> 00:34:58,080 Speaker 4: my, you know, management mantras is that I have absolute 605 00:34:58,120 --> 00:35:00,360 Speaker 4: conviction that six months from now I'm going to have 606 00:35:00,360 --> 00:35:03,239 Speaker 4: a top-three priority that today I don't know what 607 00:35:03,360 --> 00:35:07,600 Speaker 4: it is. And embedded in that is an understanding 608 00:35:07,680 --> 00:35:11,200 Speaker 4: that even the model makers themselves, when they release a 609 00:35:11,239 --> 00:35:14,800 Speaker 4: new model out into the world, they don't know what emergent 610 00:35:14,840 --> 00:35:17,960 Speaker 4: capabilities and what the useful applications of that will be. Yeah, 611 00:35:18,120 --> 00:35:21,880 Speaker 4: and so kind of, you know, having core foundational principles 612 00:35:22,120 --> 00:35:26,200 Speaker 4: while at the same time preserving agility, adaptability, and 613 00:35:26,239 --> 00:35:29,439 Speaker 4: being the best at applying the new gifts we get 614 00:35:30,160 --> 00:35:32,320 Speaker 4: in the service of users.
That's going to be 615 00:35:32,320 --> 00:35:33,360 Speaker 4: a key operating skill. 616 00:35:34,000 --> 00:35:36,840 Speaker 3: There's a second part of that audience question that Joe 617 00:35:36,880 --> 00:35:39,120 Speaker 3: just asked, and I think it's probably the one 618 00:35:39,120 --> 00:35:41,600 Speaker 3: that's on all of our minds, and it's the bluntest one. 619 00:35:41,640 --> 00:35:43,440 Speaker 3: But is AI in a bubble? 620 00:35:46,000 --> 00:35:47,880 Speaker 4: There are certainly parts of it that are. 621 00:35:47,760 --> 00:35:50,960 Speaker 3: Not you, obviously, but your competitors, presumably. 622 00:35:51,760 --> 00:35:55,160 Speaker 4: I mean, I think of all of the time and 623 00:35:55,280 --> 00:36:02,279 Speaker 4: leverage that professionals and knowledge workers are getting, and to me, 624 00:36:02,400 --> 00:36:04,560 Speaker 4: the ROI on that... like, I can see the trillions 625 00:36:04,560 --> 00:36:08,960 Speaker 4: of dollars in economic productivity happening from it. But are 626 00:36:09,040 --> 00:36:11,400 Speaker 4: there certain companies that have just raised too much capital, 627 00:36:11,440 --> 00:36:14,480 Speaker 4: and is that going to create imbalances in the ecosystem? 628 00:36:14,600 --> 00:36:17,640 Speaker 2: Yes. All right, but we really only have forty-five seconds. 629 00:36:17,680 --> 00:36:20,479 Speaker 2: My issue is a lot of things still don't work 630 00:36:20,480 --> 00:36:23,920 Speaker 2: in AI. I tried to ask ChatGPT yesterday how 631 00:36:23,960 --> 00:36:26,440 Speaker 2: many full school weeks are in New York City public schools. 632 00:36:26,480 --> 00:36:28,240 Speaker 2: It's like, oh, I'd have to count it by hand, 633 00:36:28,600 --> 00:36:31,160 Speaker 2: and it seriously said that. I posted a screenshot of it. 634 00:36:31,320 --> 00:36:33,960 Speaker 2: A lot of these agent things, like, they don't work 635 00:36:34,080 --> 00:36:36,759 Speaker 2: very well. Like a lot of things.
I once asked, 636 00:36:36,760 --> 00:36:38,880 Speaker 2: who are the most common guests on Odd Lots over 637 00:36:38,960 --> 00:36:40,759 Speaker 2: the years? It just gave me a bunch of... this 638 00:36:40,840 --> 00:36:42,680 Speaker 2: was not your site, but I'm just saying, 639 00:36:42,719 --> 00:36:45,799 Speaker 2: like, a lot of AI agents and so forth, they 640 00:36:45,880 --> 00:36:47,319 Speaker 2: sort of give up about. 641 00:36:47,040 --> 00:36:50,120 Speaker 4: Halfway through. Yeah, I'm tempted to do a live check 642 00:36:50,120 --> 00:36:51,840 Speaker 4: of whether Perplexity would give you a good answer on that. 643 00:36:51,920 --> 00:36:56,120 Speaker 4: But this goes back to what I was saying about 644 00:36:56,200 --> 00:36:59,320 Speaker 4: emergent capabilities. The fail case, and this comes up actually 645 00:36:59,320 --> 00:37:02,880 Speaker 4: in many enterprise conversations, is just because you tried something 646 00:37:03,280 --> 00:37:06,200 Speaker 4: two months ago and it didn't work, 647 00:37:06,239 --> 00:37:08,440 Speaker 4: it doesn't mean it's not working today. And so we 648 00:37:08,560 --> 00:37:11,400 Speaker 4: all need to keep this open mind that these technologies 649 00:37:11,440 --> 00:37:15,239 Speaker 4: are getting smarter and better at a rapid clip, and, 650 00:37:15,680 --> 00:37:17,520 Speaker 4: you know, those are solvable problems. 651 00:37:18,680 --> 00:37:21,239 Speaker 3: All right, Dmitry, thank you so much. Thank you, 652 00:37:22,040 --> 00:37:24,399 Speaker 3: thank you for the conversation. Really appreciate it. 653 00:37:38,280 --> 00:37:41,880 Speaker 2: That was our conversation with Dmitry Shevelenko, chief business officer 654 00:37:42,000 --> 00:37:42,799 Speaker 2: at Perplexity. 655 00:37:43,120 --> 00:37:44,960 Speaker 3: Shall we leave it there? Let's leave it there. All right.
656 00:37:45,040 --> 00:37:47,280 Speaker 3: This has been another episode of the Odd Lots podcast. 657 00:37:47,320 --> 00:37:50,360 Speaker 3: I'm Tracy Alloway. You can follow me at Tracy Alloway. 658 00:37:50,120 --> 00:37:52,760 Speaker 2: And I'm Joe Weisenthal. You can follow me at The Stalwart. 659 00:37:52,920 --> 00:37:56,640 Speaker 2: Follow our guest Dmitry Shevelenko. He's at Dmitri one. Follow 660 00:37:56,680 --> 00:38:00,120 Speaker 2: our producers: Carmen Rodriguez at Carman Armann, Dashiell Bennett 661 00:38:00,160 --> 00:38:03,560 Speaker 2: at Dashbot, and Kail Brooks at Kailbrooks. For more Odd Lots content, 662 00:38:03,600 --> 00:38:05,719 Speaker 2: go to Bloomberg dot com slash odd lots, where we have the 663 00:38:05,800 --> 00:38:08,600 Speaker 2: daily newsletter and all of our episodes, and you can 664 00:38:08,600 --> 00:38:10,680 Speaker 2: chat about all of these subjects twenty-four seven in 665 00:38:10,800 --> 00:38:14,360 Speaker 2: our discord: Discord dot gg slash odd lots. 666 00:38:14,200 --> 00:38:15,960 Speaker 3: And if you enjoy Odd Lots, if you like it 667 00:38:16,000 --> 00:38:18,279 Speaker 3: when we do these live recordings, please leave us a 668 00:38:18,320 --> 00:38:21,640 Speaker 3: positive review on your favorite podcast platform. And remember, if 669 00:38:21,680 --> 00:38:24,120 Speaker 3: you are a Bloomberg subscriber, you can listen to all 670 00:38:24,120 --> 00:38:26,759 Speaker 3: of our episodes absolutely ad-free. All you need to 671 00:38:26,760 --> 00:38:29,239 Speaker 3: do is find the Bloomberg channel on Apple Podcasts and 672 00:38:29,320 --> 00:38:31,720 Speaker 3: follow the instructions there. Thanks for listening.