1 00:00:00,080 --> 00:00:03,800 Speaker 1: I'm off my game today. No, you're not. People are 2 00:00:03,800 --> 00:00:05,640 Speaker 1: going to have to start making better content. I think 3 00:00:05,640 --> 00:00:07,160 Speaker 1: we're gonna be talking about this for a long time. 4 00:00:07,240 --> 00:00:10,200 Speaker 1: When you program for everyone, you program for no one. 5 00:00:10,240 --> 00:00:12,319 Speaker 1: I think it's that we're a purpose-driven platform, like we're 6 00:00:12,320 --> 00:00:15,600 Speaker 1: trying to get to substance. How was that? Are you 7 00:00:15,680 --> 00:00:19,200 Speaker 1: happy with that? This is marketing therapy right now? It 8 00:00:19,480 --> 00:00:29,280 Speaker 1: really is. What's up? I'm Laura Correnti, and I'm Alexa Christon. 9 00:00:29,320 --> 00:00:35,680 Speaker 1: Welcome back to Adlandia, soon to be Blockchainlandia, with returning 10 00:00:35,720 --> 00:00:39,680 Speaker 1: guest and fan favorite Jarrod Dicker, the new CEO of 11 00:00:39,760 --> 00:00:44,159 Speaker 1: blockchain company Po.et. That's po dot et. Jarrod, who came from 12 00:00:44,159 --> 00:00:46,959 Speaker 1: the Washington Post. He led ad tech at the Washington 13 00:00:47,040 --> 00:00:49,720 Speaker 1: Post, and the whole ad lab went over, took a 14 00:00:49,720 --> 00:00:52,120 Speaker 1: big leap, went over to the startup Po.et, and he's 15 00:00:52,120 --> 00:00:55,760 Speaker 1: going to explain to us why blockchain matters, what it 16 00:00:55,840 --> 00:00:59,720 Speaker 1: really is, why it matters for media, and what is 17 00:00:59,760 --> 00:01:04,120 Speaker 1: going to happen to creators and the whole buying system 18 00:01:04,160 --> 00:01:20,040 Speaker 1: in media and content. We'll be right back. And we're 19 00:01:20,040 --> 00:01:23,119 Speaker 1: back in the studio. I'm Jarrod Dicker, taking over. This 20 00:01:23,200 --> 00:01:28,080 Speaker 1: is Adlandia, and here we go. Band is back together.
21 00:01:28,200 --> 00:01:33,480 Speaker 1: Jarrod Dicker, CEO of Po.et, po dot et. Yeah, for 22 00:01:33,520 --> 00:01:36,720 Speaker 1: the past six months, I have transitioned into the blockchain, 23 00:01:36,840 --> 00:01:43,640 Speaker 1: which has been, um, it kind of is, 24 00:01:43,720 --> 00:01:48,240 Speaker 1: you know... Um, every time I switch jobs, you 25 00:01:48,280 --> 00:01:50,360 Speaker 1: look back and it's like, wow, I can't believe I've 26 00:01:50,360 --> 00:01:53,880 Speaker 1: been here for so long. But, uh, in the blockchain 27 00:01:53,960 --> 00:01:56,160 Speaker 1: space and at Po.et, yes, I definitely feel like I've 28 00:01:56,200 --> 00:01:58,080 Speaker 1: been here for this long, because it is a lot 29 00:01:58,080 --> 00:02:00,520 Speaker 1: of work and it's very fast. And I'm sure as 30 00:02:00,560 --> 00:02:04,320 Speaker 1: you guys know, um, through following the trades and conversations 31 00:02:04,360 --> 00:02:07,320 Speaker 1: that you've had, it's the thing to talk about, right? 32 00:02:07,360 --> 00:02:09,520 Speaker 1: You have cannabis, you have blockchain, and then you have 33 00:02:10,440 --> 00:02:15,040 Speaker 1: AI/ML, like, yeah, artificial intelligence, machine learning. But I 34 00:02:15,040 --> 00:02:17,320 Speaker 1: feel like we're getting into like another winter of AI 35 00:02:17,400 --> 00:02:20,680 Speaker 1: and ML, and we're probably going into a winter 36 00:02:20,760 --> 00:02:23,639 Speaker 1: of blockchain, because people don't really think it's real, and 37 00:02:23,720 --> 00:02:25,040 Speaker 1: in a lot of cases, there are a 38 00:02:25,080 --> 00:02:29,320 Speaker 1: lot of mainstream folks in advertising, marketing, and media who 39 00:02:29,480 --> 00:02:32,399 Speaker 1: maybe are enamored with the idea and the concept of blockchain, 40 00:02:32,560 --> 00:02:38,359 Speaker 1: but they don't see the reality.
So, uh, you're right, um, 41 00:02:38,400 --> 00:02:41,040 Speaker 1: when you say that blockchain is kind of used as 42 00:02:41,200 --> 00:02:43,640 Speaker 1: a marketing tool right now. I'd equate it to, 43 00:02:44,440 --> 00:02:47,400 Speaker 1: uh, these strategies behind ad tech, where you have something 44 00:02:47,440 --> 00:02:50,320 Speaker 1: that's somewhat simple, and when you break it down and 45 00:02:50,400 --> 00:02:53,600 Speaker 1: understand what it does, it's very simple, but the 46 00:02:53,680 --> 00:02:56,080 Speaker 1: people that control it are purposely making it confusing to 47 00:02:56,160 --> 00:02:58,880 Speaker 1: keep others out. And that's kind of what you're seeing 48 00:02:59,040 --> 00:03:04,760 Speaker 1: with blockchain, particularly in the media space, right? Um, 49 00:03:04,960 --> 00:03:07,080 Speaker 1: note that this has been around for, you know, almost 50 00:03:07,080 --> 00:03:09,440 Speaker 1: a decade, and that there's been a lot of work 51 00:03:09,440 --> 00:03:12,080 Speaker 1: and focus going into it. But as it's impacted media, 52 00:03:12,280 --> 00:03:14,680 Speaker 1: you know, especially, you see blockchain in the ad space 53 00:03:14,720 --> 00:03:17,320 Speaker 1: and what it could do for advertising and transparency and 54 00:03:17,360 --> 00:03:19,480 Speaker 1: trustless systems and all of these things, and a lot 55 00:03:19,520 --> 00:03:22,920 Speaker 1: of that is absolutely just lip service. So blockchain is 56 00:03:23,000 --> 00:03:27,120 Speaker 1: just a distributed ledger. So the main difference between that 57 00:03:27,200 --> 00:03:29,600 Speaker 1: and how people use a database today is that it's 58 00:03:29,639 --> 00:03:32,440 Speaker 1: just controlled by a consensus.
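The consensus-controlled ledger Jarrod describes can be sketched in a few lines of Python: entries are hash-chained, so any tampering is detectable, and an amendment only takes effect when a majority of participants approves it. This is a toy illustration of the concept, not Po.et's actual design; every class, method, and name here is invented for the example.

```python
import hashlib
import json

class ConsensusLedger:
    """Toy distributed ledger: entries are hash-chained, and an
    amendment lands only if a strict majority of participants votes
    for it. Illustrative only; not any real blockchain's design."""

    def __init__(self, participants):
        self.participants = set(participants)
        self.entries = []  # each: {"speaker", "text", "prev_hash", "hash"}

    def _hash(self, speaker, text, prev_hash):
        payload = json.dumps([speaker, text, prev_hash]).encode()
        return hashlib.sha256(payload).hexdigest()

    def record(self, speaker, text):
        prev_hash = self.entries[-1]["hash"] if self.entries else "genesis"
        h = self._hash(speaker, text, prev_hash)
        self.entries.append({"speaker": speaker, "text": text,
                             "prev_hash": prev_hash, "hash": h})
        return h

    def amend(self, index, new_text, votes):
        """Change an entry only if a strict majority approves."""
        approvals = self.participants & set(votes)
        if len(approvals) * 2 <= len(self.participants):
            return False  # no consensus: the record stays as it was
        self.entries[index]["text"] = new_text
        # re-chain every hash from the amended entry forward, so a
        # rewrite that skipped consensus would be detectable downstream
        prev = self.entries[index - 1]["hash"] if index else "genesis"
        for e in self.entries[index:]:
            e["prev_hash"] = prev
            e["hash"] = self._hash(e["speaker"], e["text"], prev)
            prev = e["hash"]
        return True

ledger = ConsensusLedger({"Laura", "Alexa", "Jarrod"})
ledger.record("Laura", "Blockchain is a marketing tool right now.")
assert not ledger.amend(0, "I never said that.", votes={"Laura"})       # 1 of 3: rejected
assert ledger.amend(0, "Agreed correction.", votes={"Laura", "Alexa"})  # 2 of 3: applied
```

One participant alone cannot rewrite what they said; two of three can, and the amendment forces every later hash to be recomputed, which is exactly the "group of us agreeing collectively" property described next.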
So basically, if we were 59 00:03:32,440 --> 00:03:35,280 Speaker 1: having this conversation on the blockchain, all of the information 60 00:03:35,360 --> 00:03:37,720 Speaker 1: is recorded, and Laura can't change what she said, and 61 00:03:37,760 --> 00:03:39,920 Speaker 1: Alexa can't change what she said, and I can't change 62 00:03:39,920 --> 00:03:42,160 Speaker 1: what I said, without a group of us, you know, 63 00:03:42,240 --> 00:03:45,520 Speaker 1: agreeing collectively on what should be changed and what should 64 00:03:45,560 --> 00:03:50,160 Speaker 1: not be changed. And that's really it. So when people say, right, um, 65 00:03:50,280 --> 00:03:52,920 Speaker 1: the blockchain, like, we are using the blockchain to solve 66 00:03:52,960 --> 00:03:56,440 Speaker 1: for this, um, they're most likely able to use a 67 00:03:56,520 --> 00:03:58,960 Speaker 1: database to have that solution be better than what they 68 00:03:59,000 --> 00:04:03,520 Speaker 1: do today. And that's the biggest, um, kind of argument 69 00:04:03,600 --> 00:04:07,200 Speaker 1: that's coming from people, whether they're pro or anti blockchain, 70 00:04:07,280 --> 00:04:09,600 Speaker 1: particularly in the media space. It's kind of like, well, 71 00:04:09,600 --> 00:04:11,320 Speaker 1: why do you need a blockchain? Now, the cool part 72 00:04:11,320 --> 00:04:14,600 Speaker 1: about blockchain is how you can best leverage this 73 00:04:14,720 --> 00:04:18,680 Speaker 1: distributed ledger and put incentivized systems against it. Early HuffPost 74 00:04:18,720 --> 00:04:21,839 Speaker 1: days, we did HuffPost badging and comment moderation, 75 00:04:21,880 --> 00:04:26,280 Speaker 1: where we were self-governing the comment moderation system at 76 00:04:26,320 --> 00:04:29,520 Speaker 1: the Huffington Post against the technology that we call JuLiA.
77 00:04:29,600 --> 00:04:32,360 Speaker 1: But we were basically able to incentivize people within our 78 00:04:32,400 --> 00:04:36,040 Speaker 1: community to find the best comments, curate the best conversations, 79 00:04:36,320 --> 00:04:39,919 Speaker 1: highlight that content, give them intrinsic rewards, so that 80 00:04:40,000 --> 00:04:41,760 Speaker 1: they were kind of doing the job that the Huffington 81 00:04:41,760 --> 00:04:44,280 Speaker 1: Post should be doing, but it was controlled and powered by 82 00:04:44,279 --> 00:04:47,279 Speaker 1: the network. That's the coolest part about what blockchain is 83 00:04:47,320 --> 00:04:49,920 Speaker 1: today and what the best technologies will enable it to do. 84 00:04:50,000 --> 00:04:52,920 Speaker 1: It's not just about this distributed ledger and being able 85 00:04:52,960 --> 00:04:56,440 Speaker 1: to categorize and have information that is immutable and cannot 86 00:04:56,440 --> 00:04:59,120 Speaker 1: be changed. It's about how you can incentivize people to 87 00:04:59,160 --> 00:05:02,400 Speaker 1: do things against that information and against that data. So 88 00:05:02,480 --> 00:05:04,239 Speaker 1: other than it being a ledger, when you talk about 89 00:05:04,279 --> 00:05:06,640 Speaker 1: media solutions, right, one of the words that you just 90 00:05:07,240 --> 00:05:09,720 Speaker 1: said around some of the solves that brands are thinking 91 00:05:09,760 --> 00:05:13,000 Speaker 1: about is transparency, right? So obviously, if this is an 92 00:05:13,080 --> 00:05:17,280 Speaker 1: uneditable ledger, brands are then able to go in and 93 00:05:17,360 --> 00:05:22,360 Speaker 1: identify... What we often do in this space is find a new technology, 94 00:05:22,480 --> 00:05:25,360 Speaker 1: get excited about it, and just slap it on things, right? 95 00:05:25,440 --> 00:05:29,080 Speaker 1: And you all know this best.
But, um, that's 96 00:05:29,080 --> 00:05:31,880 Speaker 1: exactly what's happening again here with the blockchain, um, which 97 00:05:31,920 --> 00:05:34,040 Speaker 1: is a huge mistake. What we really need, when we're 98 00:05:34,040 --> 00:05:36,680 Speaker 1: thinking about these technologies and the opportunities, is how can 99 00:05:36,680 --> 00:05:39,880 Speaker 1: we actually create something new, right? Not backing into existing 100 00:05:40,000 --> 00:05:43,440 Speaker 1: systems, where these technologies didn't exist when these problems first 101 00:05:43,440 --> 00:05:45,760 Speaker 1: came about, and shoving them in there to try to 102 00:05:45,760 --> 00:05:47,919 Speaker 1: put a band-aid on it, but actually create something 103 00:05:47,960 --> 00:05:51,760 Speaker 1: new and create new behaviors. So identifying things like transparency: 104 00:05:52,360 --> 00:05:54,440 Speaker 1: How can we know where our money is being spent? 105 00:05:54,600 --> 00:05:58,600 Speaker 1: How can we best engage users? Um, and is blockchain 106 00:05:58,640 --> 00:06:01,440 Speaker 1: a solution for that? How can we be better with transacting, 107 00:06:01,600 --> 00:06:05,320 Speaker 1: right, with influencers, and being able to identify influencers based 108 00:06:05,360 --> 00:06:09,040 Speaker 1: on the content they've created and their reputation, and things that, again, 109 00:06:09,880 --> 00:06:13,640 Speaker 1: a ledger and an incentivized system can help bring forward, 110 00:06:13,800 --> 00:06:16,920 Speaker 1: right, more information behind that to do so. The biggest 111 00:06:16,920 --> 00:06:20,200 Speaker 1: one today, which is what we're focusing on at Po.et 112 00:06:20,200 --> 00:06:21,880 Speaker 1: and a lot of others in the space, is kind 113 00:06:21,880 --> 00:06:25,480 Speaker 1: of like, how can you provide information behind what you're 114 00:06:25,480 --> 00:06:29,440 Speaker 1: about to consume?
So, um, being able to expose more 115 00:06:29,480 --> 00:06:32,000 Speaker 1: information so that people can make better decisions. So when 116 00:06:32,000 --> 00:06:35,640 Speaker 1: you drink a can of Coke or you smoke a cigarette, um, 117 00:06:35,680 --> 00:06:37,880 Speaker 1: you're free to do whatever you want, but there is 118 00:06:37,920 --> 00:06:40,239 Speaker 1: more information that tells you, like, these are the ingredients 119 00:06:40,320 --> 00:06:42,400 Speaker 1: behind what you're about to put into your body. And 120 00:06:42,520 --> 00:06:45,279 Speaker 1: yes, it's FDA regulated, but it helps you make better 121 00:06:45,279 --> 00:06:48,680 Speaker 1: informed decisions towards what makes most sense for you. Every 122 00:06:48,720 --> 00:06:52,080 Speaker 1: single piece of information, whether it's content, whether it's a 123 00:06:52,080 --> 00:06:56,560 Speaker 1: transaction like bitcoin, um, it's a public, exposed ledger of 124 00:06:56,600 --> 00:07:00,840 Speaker 1: information that everyone could see, right? And then the beauty 125 00:07:00,880 --> 00:07:03,560 Speaker 1: of that, which is what we're seeing with the incentive systems, 126 00:07:03,880 --> 00:07:06,960 Speaker 1: is like, what can we do to spark behaviors against that? 127 00:07:07,080 --> 00:07:10,200 Speaker 1: So if we are looking for who an author is, 128 00:07:10,440 --> 00:07:12,840 Speaker 1: or if something has been fact-checked, or how has 129 00:07:12,880 --> 00:07:16,160 Speaker 1: something been sourced, or where has your money gone, you 130 00:07:16,200 --> 00:07:18,760 Speaker 1: could have the ledger and have that information, and then 131 00:07:18,800 --> 00:07:22,120 Speaker 1: give incentive systems for people to make things verifiable, like, 132 00:07:22,240 --> 00:07:25,560 Speaker 1: be able to say, is this actually Laura Correnti who 133 00:07:25,560 --> 00:07:28,400 Speaker 1: wrote this?
Well, yes it is, because we found that 134 00:07:28,440 --> 00:07:31,160 Speaker 1: this is the source of the article, right, through where 135 00:07:31,200 --> 00:07:34,320 Speaker 1: it was originally published, and that claim that Laura is 136 00:07:34,360 --> 00:07:38,360 Speaker 1: the author is now verifiable. Um, but someone may say, 137 00:07:38,600 --> 00:07:41,800 Speaker 1: I created this article, um, my name is Laura Correnti, 138 00:07:41,840 --> 00:07:43,440 Speaker 1: and then you find out that it was actually written 139 00:07:43,480 --> 00:07:46,520 Speaker 1: five years ago by Alexa Christon. If you do choose 140 00:07:46,600 --> 00:07:49,800 Speaker 1: to read Breitbart, right, or Huffington Post or The 141 00:07:49,880 --> 00:07:53,960 Speaker 1: Daily Caller, you can so choose to. But this is 142 00:07:54,000 --> 00:07:56,800 Speaker 1: a technology and opportunity for you to learn more information 143 00:07:56,800 --> 00:07:59,400 Speaker 1: about what you're about to consume: that the source of 144 00:07:59,400 --> 00:08:02,320 Speaker 1: this content may not be verified, right? Like, we think 145 00:08:02,360 --> 00:08:05,640 Speaker 1: of being able to label information like, this is sponsored 146 00:08:05,680 --> 00:08:08,120 Speaker 1: and this isn't sponsored, but it's not binary. 147 00:08:08,160 --> 00:08:11,280 Speaker 1: It's not up to the publishers anymore, right? The conversation 148 00:08:11,360 --> 00:08:15,240 Speaker 1: with publishers today is, we're valuable and we're premium, and, 149 00:08:15,320 --> 00:08:20,320 Speaker 1: yes, it's premium. And I don't subscribe to, like, that 150 00:08:20,360 --> 00:08:25,680 Speaker 1: whole premium content industry.
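The verification flow just walked through, where the earliest timestamped registration of a piece of content settles who the author is, can be sketched roughly as follows. The registry class, its method names, and the integer clock standing in for block timestamps are all hypothetical, not Po.et's real API.

```python
import hashlib
from itertools import count

_clock = count(1)  # stand-in for real block timestamps

class AuthorshipRegistry:
    """Toy sketch of ledger-backed authorship claims: the first
    timestamped registration of a content fingerprint wins, so a
    later claim under a different name is flagged as disputed."""

    def __init__(self):
        self.claims = {}  # content fingerprint -> (author, timestamp)

    @staticmethod
    def fingerprint(text):
        # hash the content itself, not the byline, so reposts match
        return hashlib.sha256(text.encode()).hexdigest()

    def register(self, author, text):
        h = self.fingerprint(text)
        if h not in self.claims:  # only the earliest registration sticks
            self.claims[h] = (author, next(_clock))
        return h

    def verify(self, claimed_author, text):
        claim = self.claims.get(self.fingerprint(text))
        if claim is None:
            return "unverified"  # nothing on the ledger at all
        original_author, _ = claim
        return "verified" if original_author == claimed_author else "disputed"

reg = AuthorshipRegistry()
article = "Why blockchain matters for media."
reg.register("Alexa Christon", article)  # the actual author, years earlier
assert reg.verify("Alexa Christon", article) == "verified"
assert reg.verify("Laura Correnti", article) == "disputed"
```

Because a second `register` call for the same content is ignored, a later claim under a different name surfaces as disputed instead of silently overwriting the original record, which is the scenario described in the conversation.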
It is. But the industry, right, 151 00:08:25,800 --> 00:08:29,600 Speaker 1: has premium content basically stacked, right? Right, a charge 152 00:08:29,640 --> 00:08:32,000 Speaker 1: against that, right. And it's proving my point, exactly what 153 00:08:32,000 --> 00:08:34,920 Speaker 1: you said, because we say premium, but that's an insider 154 00:08:34,960 --> 00:08:37,760 Speaker 1: definition that is not conveyed at all to the consumers. 155 00:08:37,600 --> 00:08:40,280 Speaker 1: That, like, hasn't really been able to be quantified in 156 00:08:40,360 --> 00:08:44,080 Speaker 1: any particular way. Thank you. So that is what we're building. 157 00:08:44,200 --> 00:08:46,200 Speaker 1: We want to be able to convey the value behind 158 00:08:46,240 --> 00:08:49,120 Speaker 1: the creator's work and what that really means, right? We 159 00:08:49,360 --> 00:08:54,040 Speaker 1: constantly emphasize, um, things that happen in the results, right, 160 00:08:54,040 --> 00:08:56,040 Speaker 1: like how many page views do you have, how many, 161 00:08:56,160 --> 00:08:59,880 Speaker 1: uh, followers do you have, how much engagement. And that has 162 00:09:00,080 --> 00:09:01,680 Speaker 1: nothing to do with the reason why you went and 163 00:09:01,760 --> 00:09:05,320 Speaker 1: engaged with that musician or publisher or creator in the 164 00:09:05,320 --> 00:09:09,280 Speaker 1: first place, because we have no way of identifying or 165 00:09:09,400 --> 00:09:12,559 Speaker 1: verifying that information, right, and being able to expose it. 166 00:09:12,679 --> 00:09:15,040 Speaker 1: So the last thing I'll say on this rant is, 167 00:09:15,440 --> 00:09:18,960 Speaker 1: the other notion is, why are publishers premium, to Alexa's point? Like, 168 00:09:19,640 --> 00:09:22,640 Speaker 1: what is premium?
And is premium stacked in the system? 169 00:09:22,679 --> 00:09:27,000 Speaker 1: And why is the Washington Post's content more premium than 170 00:09:27,120 --> 00:09:30,080 Speaker 1: Ben Thompson's Stratechery, right, who is operating on his own? 171 00:09:30,280 --> 00:09:34,600 Speaker 1: And that's because we don't have a way to quantify 172 00:09:34,679 --> 00:09:36,560 Speaker 1: it or to show it, that you don't have a 173 00:09:36,640 --> 00:09:42,240 Speaker 1: way to verify it by the buyer. That's the point here, right? 174 00:09:42,320 --> 00:09:44,280 Speaker 1: And I think that's, for me at least, that's what's 175 00:09:44,280 --> 00:09:47,440 Speaker 1: exciting about blockchain. Blockchain for me is, at the end 176 00:09:47,480 --> 00:09:50,880 Speaker 1: of the day, we're putting the verification 177 00:09:51,280 --> 00:09:56,120 Speaker 1: and the validation into a public, transparent setting for the 178 00:09:56,160 --> 00:09:59,360 Speaker 1: community to decide, right? And that's the point. And a 179 00:09:59,360 --> 00:10:01,320 Speaker 1: lot of the projects play in the gray, right? Like, we 180 00:10:01,400 --> 00:10:03,480 Speaker 1: live in a black and white world where you're 181 00:10:03,480 --> 00:10:05,559 Speaker 1: either on one side or the other, and no one 182 00:10:05,640 --> 00:10:08,560 Speaker 1: is focusing on the gray. So today we're in 183 00:10:08,600 --> 00:10:11,520 Speaker 1: a world where there's fake news, right, and what's real 184 00:10:11,640 --> 00:10:14,480 Speaker 1: and what's fake and what's truth and what's false. However, 185 00:10:14,840 --> 00:10:17,600 Speaker 1: while I believe that everything should be objective, we live 186 00:10:17,600 --> 00:10:20,520 Speaker 1: in a world that's extremely subjective.
You're talking about very 187 00:10:20,640 --> 00:10:26,160 Speaker 1: hard, objective things about verification, transparency, data, right? You're talking 188 00:10:26,160 --> 00:10:31,360 Speaker 1: about data, information-backed shit. But you're also talking about 189 00:10:32,040 --> 00:10:37,719 Speaker 1: major movements in society and culture, in thinking, in brands, 190 00:10:37,720 --> 00:10:43,439 Speaker 1: in preferences and personalization, right? Because a new infrastructure is 191 00:10:43,480 --> 00:10:47,280 Speaker 1: built that actually exposes all of that. Yeah, the coolest 192 00:10:47,280 --> 00:10:49,200 Speaker 1: part about what could happen in blockchain and media 193 00:10:49,280 --> 00:10:51,400 Speaker 1: is that the web today is built off two Rs, 194 00:10:51,559 --> 00:10:55,400 Speaker 1: right? Recency and relevancy, right? So when you search 195 00:10:55,480 --> 00:10:57,960 Speaker 1: content, right, or you're in your timeline on Facebook or 196 00:10:58,080 --> 00:11:01,760 Speaker 1: Twitter or Google, you search based on when something was published, right, 197 00:11:02,000 --> 00:11:05,960 Speaker 1: or, um, being able to sort it, right, based 198 00:11:05,960 --> 00:11:08,600 Speaker 1: on published time. Or you want something that's relevant: I 199 00:11:08,920 --> 00:11:10,920 Speaker 1: want to search for something, and I want those 200 00:11:10,960 --> 00:11:13,960 Speaker 1: results to give back verbatim the information that I have. 201 00:11:14,240 --> 00:11:15,680 Speaker 1: And that is the way, for the past ten to 202 00:11:15,720 --> 00:11:18,520 Speaker 1: twenty years, we have discovered and looked for content. Now, 203 00:11:18,559 --> 00:11:21,360 Speaker 1: what's happened is that we've actually changed the way creators 204 00:11:21,360 --> 00:11:24,760 Speaker 1: create based on those results.
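Ranking by the two Rs described here, plus the reputation signal the conversation turns to next, can be sketched as a simple scoring function. The fields, weights, and decay rate are all hypothetical, purely to show how a third signal could let a well-sourced piece outrank fresher clickbait.

```python
from dataclasses import dataclass
import math

@dataclass
class Item:
    title: str
    age_hours: float   # recency: how long ago it was published
    relevance: float   # 0..1 query-match score, however it's computed
    reputation: float  # 0..1 creator/source reputation, the third "R"

def score(item, w_recency=0.3, w_relevance=0.4, w_reputation=0.3):
    # hypothetical weights; recency decays to ~37% after a day
    recency = math.exp(-item.age_hours / 24.0)
    return (w_recency * recency
            + w_relevance * item.relevance
            + w_reputation * item.reputation)

items = [
    Item("Michael Jackson dead", age_hours=1, relevance=0.9, reputation=0.1),
    Item("Michael Jackson dies; doctor investigated", age_hours=6,
         relevance=0.8, reputation=0.9),
]
# with a reputation term, the sourced piece outranks the fresher headline
ranked = sorted(items, key=score, reverse=True)
```

With the reputation weight set to zero, the fresher clickbait-style headline wins; with the weights above, the reported piece ranks first, which is the rebalancing the third R is meant to capture.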
So instead of, when Michael 205 00:11:24,840 --> 00:11:27,719 Speaker 1: Jackson passed away, saying "Michael Jackson passes away due to 206 00:11:27,800 --> 00:11:31,280 Speaker 1: drug overdose, doctor investigated," we did "Michael Jackson dead," right, 207 00:11:31,280 --> 00:11:33,040 Speaker 1: because that's what people are going to search. And that 208 00:11:33,080 --> 00:11:36,079 Speaker 1: actually takes the value out of why I read an 209 00:11:36,280 --> 00:11:39,000 Speaker 1: author's work or creator's work, because now they're actually creating 210 00:11:39,040 --> 00:11:41,720 Speaker 1: based on how they're valued in the results. So blockchain 211 00:11:41,800 --> 00:11:46,679 Speaker 1: brings reputation, the third R. So recency, relevancy, reputation. How 212 00:11:46,679 --> 00:11:49,480 Speaker 1: do we put emphasis, and how do we put information 213 00:11:49,520 --> 00:11:53,520 Speaker 1: in quantifiable metrics, against the work that goes into it, 214 00:11:53,600 --> 00:11:56,080 Speaker 1: the effort that goes into it, and not just the results, 215 00:11:56,200 --> 00:11:58,640 Speaker 1: not just how fast you published or when it was published, 216 00:11:58,880 --> 00:12:02,480 Speaker 1: or how relevant you can make your headline, or clickbait, 217 00:12:02,600 --> 00:12:04,720 Speaker 1: or all of these things that are driving us nuts? 218 00:12:04,760 --> 00:12:07,680 Speaker 1: But how do we now put focus back on the value? 219 00:12:07,960 --> 00:12:10,680 Speaker 1: So, as somebody who's spent the entirety of her career 220 00:12:10,720 --> 00:12:16,400 Speaker 1: evaluating and buying media and content, and having to associate 221 00:12:16,640 --> 00:12:21,120 Speaker 1: value or cost against, um, what that was being pitched 222 00:12:21,120 --> 00:12:23,760 Speaker 1: as is worth, or having done homework to understand the 223 00:12:23,800 --> 00:12:27,440 Speaker 1: market rates, etcetera.
What you're basically alluding to is that 224 00:12:27,480 --> 00:12:31,040 Speaker 1: blockchain is going to be able to democratize premium and 225 00:12:31,160 --> 00:12:35,560 Speaker 1: level the playing field, to say consumer choice is what's 226 00:12:35,600 --> 00:12:39,479 Speaker 1: going to drive premium, and what they decide is valuable, 227 00:12:40,720 --> 00:12:44,000 Speaker 1: whether it's person A or entity B, is up to them. 228 00:12:44,120 --> 00:12:48,800 Speaker 1: That's right. Yeah, we have not defined premium, right, to 229 00:12:48,880 --> 00:12:53,640 Speaker 1: Alexa's point, and premium is in the eyes of the 230 00:12:53,720 --> 00:12:57,880 Speaker 1: buyer or the consumer or the beholder. And what's happened 231 00:12:57,920 --> 00:12:59,880 Speaker 1: to date is that we've had a fox-guarding-the- 232 00:13:00,000 --> 00:13:02,880 Speaker 1: henhouse scenario, right? We've had the New York Times 233 00:13:02,960 --> 00:13:05,560 Speaker 1: and even Netflix and Spotify and all of these platforms 234 00:13:05,600 --> 00:13:08,520 Speaker 1: say, we're premium because of this, you should subscribe because 235 00:13:08,520 --> 00:13:11,160 Speaker 1: of this. And now we're seeing, media first, 236 00:13:11,240 --> 00:13:14,560 Speaker 1: pivotal shifts, right? We're seeing consumers say, I don't get it. 237 00:13:14,720 --> 00:13:16,160 Speaker 1: I don't get why I'm paying for this. Why do 238 00:13:16,240 --> 00:13:18,360 Speaker 1: I need to pay for all these subscriptions? There's too 239 00:13:18,360 --> 00:13:20,319 Speaker 1: many media companies, right? Like, it's not that there's not 240 00:13:20,440 --> 00:13:24,280 Speaker 1: enough advertising dollars; there's too many publishers, right? So 241 00:13:24,320 --> 00:13:28,080 Speaker 1: now the same publishers, right, that kind of infiltrated 242 00:13:28,080 --> 00:13:30,960 Speaker 1: and took the ad revenue are now saying, we're doing subscriptions.
243 00:13:31,120 --> 00:13:32,880 Speaker 1: So now they're gonna take more of the subscription pie. 244 00:13:32,920 --> 00:13:34,400 Speaker 1: Do you wonder what's gonna happen? There's not gonna be 245 00:13:34,480 --> 00:13:37,040 Speaker 1: enough money. So, like, we kind of need things to 246 00:13:37,120 --> 00:13:39,760 Speaker 1: die in order for new things to exist. And what 247 00:13:39,920 --> 00:13:45,160 Speaker 1: you're saying is true, because the blockchain is a distributed ledger. 248 00:13:45,280 --> 00:13:47,640 Speaker 1: It is a network driven by the network. It is 249 00:13:47,679 --> 00:13:50,800 Speaker 1: consumer driven. It's not the fox guarding the henhouse. These are 250 00:13:50,840 --> 00:13:53,120 Speaker 1: people that are gonna put value on things based on 251 00:13:53,200 --> 00:13:57,120 Speaker 1: what they feel, right, is verifiable, interesting, right? What they 252 00:13:57,160 --> 00:14:01,959 Speaker 1: could then associate with, what they think is good content 253 00:14:02,000 --> 00:14:03,959 Speaker 1: that's relevant to them. So how can they curate their 254 00:14:03,960 --> 00:14:06,560 Speaker 1: own experiences? How could they pay and give value to, 255 00:14:06,960 --> 00:14:10,440 Speaker 1: you know, a Ben Thompson versus the Washington Post? Do authors 256 00:14:10,440 --> 00:14:15,400 Speaker 1: and creators need to be with media companies, or are they 257 00:14:15,480 --> 00:14:19,000 Speaker 1: media companies? Yeah, or are they media companies? And, like, I 258 00:14:19,040 --> 00:14:21,600 Speaker 1: do believe media companies are valuable, but their value may be, 259 00:14:21,680 --> 00:14:24,440 Speaker 1: we could get you distribution, right? We could get you 260 00:14:24,600 --> 00:14:27,160 Speaker 1: PR and publicity.
They're kind of like the labels, right, 261 00:14:27,240 --> 00:14:30,080 Speaker 1: for music. Like, you don't necessarily need them to be 262 00:14:30,120 --> 00:14:32,680 Speaker 1: a creator, but they could help you. They're like an agency, right? 263 00:14:32,760 --> 00:14:35,920 Speaker 1: And you often forget. Like, oh, a publisher is 264 00:14:35,920 --> 00:14:38,080 Speaker 1: a place where I could create content, right, and they're 265 00:14:38,080 --> 00:14:41,000 Speaker 1: making me, me? No, like, you make you, you, and 266 00:14:41,000 --> 00:14:43,120 Speaker 1: a publisher could help distribute you. But when you're ready 267 00:14:43,120 --> 00:14:44,520 Speaker 1: to go out on your own, or when you want 268 00:14:44,560 --> 00:14:47,600 Speaker 1: to go to a different label or agency or test 269 00:14:47,640 --> 00:14:50,040 Speaker 1: new waters, you should go, and there should be systems 270 00:14:50,040 --> 00:14:51,880 Speaker 1: in place that allow you to enable that. So who 271 00:14:51,920 --> 00:14:54,240 Speaker 1: are you working with? What media companies are you working 272 00:14:54,240 --> 00:14:56,680 Speaker 1: with right now? So we're speaking to a ton 273 00:14:56,680 --> 00:14:59,800 Speaker 1: of media companies. To be completely honest, I'm really 274 00:15:00,000 --> 00:15:03,120 Speaker 1: focusing on the long tail, talking to many brands. With some 275 00:15:03,200 --> 00:15:06,040 Speaker 1: brands, I have to be, like, honest: we're a fifteen- 276 00:15:06,080 --> 00:15:08,680 Speaker 1: person team and fourteen are engineers, so it's literally me 277 00:15:08,840 --> 00:15:11,480 Speaker 1: just running around and having these convos.
We have a 278 00:15:11,560 --> 00:15:16,000 Speaker 1: twenty-person community, um, that's actually leveraging and using Po.et, 279 00:15:16,040 --> 00:15:19,320 Speaker 1: and about fifty large across all of our different systems 280 00:15:19,360 --> 00:15:23,120 Speaker 1: that are putting information onto Po.et. These are like screenwriters, independents. 281 00:15:23,560 --> 00:15:26,040 Speaker 1: Who are they? Like, the majority, are they digital artists, 282 00:15:26,080 --> 00:15:29,000 Speaker 1: are they physical artists? Like, I want to get these partnerships right. 283 00:15:29,040 --> 00:15:31,760 Speaker 1: I want to get the big publishers and media companies 284 00:15:31,760 --> 00:15:34,040 Speaker 1: that I have great relationships with that want to jump 285 00:15:34,120 --> 00:15:35,960 Speaker 1: on board and be a part of it too. But 286 00:15:36,080 --> 00:15:38,840 Speaker 1: I don't want to distract from this big mission, because what 287 00:15:38,920 --> 00:15:41,440 Speaker 1: Laura described is exactly what we're looking to do, and 288 00:15:41,480 --> 00:15:44,240 Speaker 1: that's a shit ton of work, and it's highly ambitious 289 00:15:44,600 --> 00:15:47,160 Speaker 1: and a lot of education, right? And to take on 290 00:15:47,200 --> 00:15:49,200 Speaker 1: a new brand or a new publisher that is going 291 00:15:49,240 --> 00:15:51,840 Speaker 1: to look for, what's our revenue strategy off of this 292 00:15:51,880 --> 00:15:54,120 Speaker 1: thing in three months? We don't need that yet. That's right.
293 00:15:54,400 --> 00:15:57,680 Speaker 1: But will you eventually need salespeople to sit on top 294 00:15:57,720 --> 00:16:01,560 Speaker 1: of your consortium of assets that are going out 295 00:16:01,600 --> 00:16:06,080 Speaker 1: and saying, like, here's the Po.et top one this week, 296 00:16:06,600 --> 00:16:10,440 Speaker 1: and here's why? Like, because then it transitions from ad 297 00:16:10,520 --> 00:16:13,160 Speaker 1: tech. Like, does Po.et actually become a media company? Right, 298 00:16:13,200 --> 00:16:15,480 Speaker 1: that's the next question. Yeah, that's a good question. So 299 00:16:15,480 --> 00:16:18,600 Speaker 1: Po.et is the underlying technology with all of these things. 300 00:16:18,600 --> 00:16:22,120 Speaker 1: So think of HTTP, right, where you type HTTPS into 301 00:16:22,120 --> 00:16:24,520 Speaker 1: your browser, right? Like, that is what Po.et is doing 302 00:16:24,520 --> 00:16:27,280 Speaker 1: at a protocol level. How could people build reputation? How 303 00:16:27,320 --> 00:16:31,160 Speaker 1: could they license content, right? How could they build verifiable content? 304 00:16:31,400 --> 00:16:34,080 Speaker 1: And then there's all these apps, right, that are going 305 00:16:34,120 --> 00:16:36,200 Speaker 1: to be built on top. What we're really looking to 306 00:16:36,240 --> 00:16:39,760 Speaker 1: do is invest in that protocol so that creators can not 307 00:16:39,800 --> 00:16:42,640 Speaker 1: just leverage it for their own IP, but also 308 00:16:42,640 --> 00:16:45,200 Speaker 1: start building tools and technologies on top of it.
Like, 309 00:16:45,240 --> 00:16:47,720 Speaker 1: not to necessarily say, like, the App Store, but to 310 00:16:47,760 --> 00:16:51,880 Speaker 1: make an analogy: Apple as the device and Po.et 311 00:16:51,920 --> 00:16:54,320 Speaker 1: as the protocol, and what could be built on top 312 00:16:54,320 --> 00:16:56,400 Speaker 1: of that is endless, right? How do I build the 313 00:16:56,440 --> 00:16:59,800 Speaker 1: hooks that make it so accessible and so obvious for 314 00:16:59,840 --> 00:17:02,080 Speaker 1: them to do it that it takes no additional effort, so 315 00:17:02,120 --> 00:17:04,960 Speaker 1: that we build this repository of creators like you that say, 316 00:17:05,000 --> 00:17:08,119 Speaker 1: you know what? I love what Po.et is doing. 317 00:17:08,400 --> 00:17:10,720 Speaker 1: I love what this person is building on Po.et. But 318 00:17:10,800 --> 00:17:12,199 Speaker 1: I have an idea that I'm gonna build on my 319 00:17:12,200 --> 00:17:14,320 Speaker 1: own and commercialize on top of Po.et that I'm taking 320 00:17:14,359 --> 00:17:16,600 Speaker 1: for myself. Yeah. I mean, the whole point is getting 321 00:17:16,600 --> 00:17:20,280 Speaker 1: your content out there to an audience and getting it 322 00:17:20,480 --> 00:17:26,600 Speaker 1: monetized as IP that is valuable. Who are Po.et's competitors? Yeah, yeah, 323 00:17:26,800 --> 00:17:29,800 Speaker 1: so the cool part about blockchain right now 324 00:17:30,080 --> 00:17:33,000 Speaker 1: is that, because a lot of the projects are open source, 325 00:17:33,080 --> 00:17:35,679 Speaker 1: you know, meaning that, like, you don't really own the 326 00:17:35,680 --> 00:17:38,680 Speaker 1: information to the IP, anyone could use it. It's 327 00:17:38,680 --> 00:17:43,720 Speaker 1: more collaborative than competitive at this point.
There's companies like 328 00:17:44,040 --> 00:17:46,280 Speaker 1: uh, Steemit, which is kind of like a 329 00:17:46,320 --> 00:17:49,680 Speaker 1: Reddit for blockchain, but there's not so much like a 330 00:17:49,800 --> 00:17:54,119 Speaker 1: one-to-one type comparison. I think what we're fortunate 331 00:17:54,240 --> 00:17:57,120 Speaker 1: in doing and where our investment lies is can we 332 00:17:57,200 --> 00:18:00,639 Speaker 1: build something that could breed an ecosystem, right, that 333 00:18:00,640 --> 00:18:04,600 Speaker 1: could breed competitors and breed people to start 334 00:18:04,600 --> 00:18:06,560 Speaker 1: thinking and building on top of it. What we could 335 00:18:06,600 --> 00:18:10,040 Speaker 1: all agree on is that, again, I'm very 336 00:18:10,080 --> 00:18:13,560 Speaker 1: good at non-apples-to-apples comparisons, but hopefully they 337 00:18:13,600 --> 00:18:17,160 Speaker 1: make sense. Um, it's like we're in a Napster moment 338 00:18:17,920 --> 00:18:21,880 Speaker 1: with creators and media. It's not necessarily that people want 339 00:18:22,040 --> 00:18:25,000 Speaker 1: things for free. It's just that there's really no good 340 00:18:25,080 --> 00:18:28,760 Speaker 1: systems for people to see the value, to get verifiable, 341 00:18:28,880 --> 00:18:31,240 Speaker 1: reputable content, and to be able to know that they're 342 00:18:31,240 --> 00:18:34,399 Speaker 1: rewarding creators directly. When Napster was around, like, the music 343 00:18:34,440 --> 00:18:37,840 Speaker 1: and information you got wasn't great, right? Like, songs were 344 00:18:37,880 --> 00:18:41,879 Speaker 1: cut off, the quality was shitty.
But you had 345 00:18:42,000 --> 00:18:44,040 Speaker 1: no other option to get digital music, right? That was 346 00:18:44,040 --> 00:18:47,280 Speaker 1: the only platform until people saw what needed to be 347 00:18:47,320 --> 00:18:49,479 Speaker 1: done in order to make this a real viable product. 348 00:18:49,520 --> 00:18:52,399 Speaker 1: People wanted digital music. So how can you make it reputable? 349 00:18:52,480 --> 00:18:54,520 Speaker 1: How can you make it verifiable? Right? How can you 350 00:18:54,560 --> 00:18:57,080 Speaker 1: make sure that when you're purchasing something or streaming or 351 00:18:57,119 --> 00:18:59,760 Speaker 1: licensing something, that it's the best quality and that it 352 00:18:59,800 --> 00:19:01,560 Speaker 1: came from the artists, and that it's the work that 353 00:19:01,600 --> 00:19:04,560 Speaker 1: you paid for and it's paying the artists? Right. That's 354 00:19:04,600 --> 00:19:06,440 Speaker 1: kind of where we are right now in media, which 355 00:19:06,480 --> 00:19:08,360 Speaker 1: is kind of like, people are using 356 00:19:08,359 --> 00:19:11,439 Speaker 1: ad blockers, or people don't want to subscribe, and it's like, no, 357 00:19:11,880 --> 00:19:14,159 Speaker 1: it's not that people don't want to do these things. One, 358 00:19:14,200 --> 00:19:16,959 Speaker 1: there's no better system, and two, we're really bad at 359 00:19:17,000 --> 00:19:19,480 Speaker 1: conveying the value to these consumers. There are a lot 360 00:19:19,520 --> 00:19:21,920 Speaker 1: of people who actually know that the situation from a 361 00:19:21,960 --> 00:19:25,360 Speaker 1: consumer side is shitty.
They're annoyed, but they don't actually... 362 00:19:25,560 --> 00:19:28,480 Speaker 1: right, we don't think about it because it's just been 363 00:19:28,760 --> 00:19:33,359 Speaker 1: this behavior, right, and we've been trained by institutions and 364 00:19:33,440 --> 00:19:36,960 Speaker 1: by the media companies to have the behavior versus going 365 00:19:36,960 --> 00:19:39,280 Speaker 1: into a more natural behavior. This is like when we 366 00:19:39,280 --> 00:19:42,159 Speaker 1: talked to Charlie Melcher two years ago, right, and it 367 00:19:42,240 --> 00:19:43,920 Speaker 1: was powerful and he was saying, you know, at the 368 00:19:44,000 --> 00:19:45,840 Speaker 1: end of the day, and it was so obvious, but we 369 00:19:45,880 --> 00:19:47,760 Speaker 1: were like, oh. He's like, at the end of the day, 370 00:19:47,840 --> 00:19:51,800 Speaker 1: screens aren't natural. He's like, we should not have screens, right? 371 00:19:51,920 --> 00:19:53,119 Speaker 1: But at the end of the day, we should be 372 00:19:53,200 --> 00:19:55,240 Speaker 1: living in, like, three sixty, should be living in the 373 00:19:55,320 --> 00:19:57,760 Speaker 1: natural world. I mean, this is the same kind of 374 00:19:57,960 --> 00:20:01,359 Speaker 1: idea, right? It's that there are behaviors that are natural 375 00:20:01,400 --> 00:20:04,600 Speaker 1: to us that we've now kind of stopped, right, and 376 00:20:04,640 --> 00:20:07,560 Speaker 1: built other things around that have diverted our behaviors to 377 00:20:07,680 --> 00:20:10,439 Speaker 1: something else. So I don't think, if you went to, 378 00:20:11,240 --> 00:20:15,600 Speaker 1: you know, a consumer, a USA Today subscriber, right, 379 00:20:15,720 --> 00:20:17,440 Speaker 1: would they say this is a pain in the ass? 380 00:20:17,480 --> 00:20:19,920 Speaker 1: They're like, no, I'm getting my news. Yeah.
I mean 381 00:20:20,480 --> 00:20:22,320 Speaker 1: I actually have a question for you guys, which is, 382 00:20:23,000 --> 00:20:27,200 Speaker 1: we've seen the evolution of how we deliver content change 383 00:20:27,240 --> 00:20:29,639 Speaker 1: so much, right? Whether it's how you look at the 384 00:20:29,680 --> 00:20:33,639 Speaker 1: formats, right, from text to video to audio, or 385 00:20:33,760 --> 00:20:36,640 Speaker 1: you look at the platforms, like how it's distributed through 386 00:20:36,720 --> 00:20:40,840 Speaker 1: a Facebook or a Twitter, Google or Snapchat. But the 387 00:20:40,880 --> 00:20:46,040 Speaker 1: revenue models never changed, right? And we keep reconstructing 388 00:20:46,080 --> 00:20:48,560 Speaker 1: these things. And in my mind, and like I'm curious 389 00:20:48,600 --> 00:20:51,640 Speaker 1: your take, because you're both in this space, is don't 390 00:20:51,640 --> 00:20:53,879 Speaker 1: we kind of have to blow the whole thing up, right? Like, 391 00:20:53,880 --> 00:20:56,520 Speaker 1: we haven't followed suit, so coming up with a 392 00:20:56,600 --> 00:21:00,800 Speaker 1: new subscription bundle or a new ad unit might supplement 393 00:21:00,840 --> 00:21:05,280 Speaker 1: or be sufficient enough to sustain what we're 394 00:21:05,320 --> 00:21:07,239 Speaker 1: looking to do, but it's not going to make this 395 00:21:07,320 --> 00:21:12,240 Speaker 1: industry better. Well, what you're basically creating is an opportunity for 396 00:21:12,440 --> 00:21:18,040 Speaker 1: buyers to think about how they do direct deals with creators.
397 00:21:18,480 --> 00:21:20,720 Speaker 1: I mean this is, I think, one 398 00:21:20,720 --> 00:21:22,959 Speaker 1: of the big points and something you and I had 399 00:21:22,960 --> 00:21:25,639 Speaker 1: talked about, Jared, a while ago, is like, this is DTC, 400 00:21:25,840 --> 00:21:28,800 Speaker 1: but it's direct-to-creator, right? I mean, that's the 401 00:21:28,800 --> 00:21:32,320 Speaker 1: whole... like, that is it. And 402 00:21:32,359 --> 00:21:35,240 Speaker 1: the thing is, this is getting philosophical for a second, but 403 00:21:35,280 --> 00:21:37,600 Speaker 1: the thing is, I don't even know if you 404 00:21:37,720 --> 00:21:41,200 Speaker 1: need media companies anymore. That's right. That's the whole 405 00:21:41,200 --> 00:21:45,280 Speaker 1: point: does everyone become their own agent? What 406 00:21:45,680 --> 00:21:48,880 Speaker 1: blows my mind on this, and I'd love to talk 407 00:21:48,960 --> 00:21:51,320 Speaker 1: more about it, is that this gets down to 408 00:21:51,400 --> 00:21:55,160 Speaker 1: IP at a very, very micro level, and it doesn't 409 00:21:55,160 --> 00:21:58,600 Speaker 1: matter for whom or from whom, right? So meaning it 410 00:21:58,640 --> 00:22:01,840 Speaker 1: doesn't matter who the creator is, whether it's a brand, 411 00:22:02,040 --> 00:22:06,680 Speaker 1: a person, a community, an institution, it is truly saying 412 00:22:06,800 --> 00:22:11,480 Speaker 1: this IP is something that has been created, originated 413 00:22:11,600 --> 00:22:17,160 Speaker 1: somewhere, right, from someone, from somebody, and it therefore exists 414 00:22:17,160 --> 00:22:20,720 Speaker 1: in these forms because they have, right, they have done X. 415 00:22:21,160 --> 00:22:23,359 Speaker 1: That is mind-blowing. And so I'm actually going to 416 00:22:23,440 --> 00:22:24,879 Speaker 1: go back to something you said.
You said there are 417 00:22:24,880 --> 00:22:29,280 Speaker 1: too many publishers. Actually, maybe there are too few publishers. 418 00:22:29,280 --> 00:22:32,280 Speaker 1: Like, if you think about too many versus too few, 419 00:22:32,800 --> 00:22:35,840 Speaker 1: does this actually mean that the personalities that were driving 420 00:22:35,880 --> 00:22:41,080 Speaker 1: publishers now transcend them? And so now it actually weeds 421 00:22:41,119 --> 00:22:44,959 Speaker 1: out fluff pieces, etcetera. And now, sort of, whatever 422 00:22:45,119 --> 00:22:50,399 Speaker 1: floats to the top just completely reconstructs the entire system. 423 00:22:50,520 --> 00:22:52,600 Speaker 1: I think we're going to start to really understand the 424 00:22:52,720 --> 00:22:56,280 Speaker 1: value of a media company and publisher, not in terms 425 00:22:56,320 --> 00:22:58,960 Speaker 1: of creators, but a media company's value through the 426 00:22:59,040 --> 00:23:02,520 Speaker 1: lens of what's consumed, you know, what publishers deem it, right, 427 00:23:02,680 --> 00:23:05,159 Speaker 1: similar to the record labels: the artists on the labels, 428 00:23:05,240 --> 00:23:07,000 Speaker 1: or who you go and read, or who you go 429 00:23:07,240 --> 00:23:10,080 Speaker 1: and listen to, who you purchase, 430 00:23:10,119 --> 00:23:12,120 Speaker 1: who you want to meet, who you want to see, 431 00:23:12,440 --> 00:23:14,479 Speaker 1: and you understand what the labels bring to them. They 432 00:23:14,520 --> 00:23:18,640 Speaker 1: bring marketing, they bring pop-ups, they bring merch, right, 433 00:23:19,119 --> 00:23:22,040 Speaker 1: they have the connections, they get them on stage.
That's 434 00:23:22,080 --> 00:23:25,359 Speaker 1: what these media companies will be. And to your point, 435 00:23:25,400 --> 00:23:28,040 Speaker 1: I think, yes, there will be more creators, but there 436 00:23:28,080 --> 00:23:31,160 Speaker 1: may not need to be all these mammoth, at least 437 00:23:31,160 --> 00:23:34,119 Speaker 1: new emerging, media companies, if all they're really doing is 438 00:23:34,200 --> 00:23:36,959 Speaker 1: providing a service, right, which is, we could grow your 439 00:23:36,960 --> 00:23:39,240 Speaker 1: business and we could grow your exposure. If you want 440 00:23:39,240 --> 00:23:41,439 Speaker 1: to go and do it indie, on your own, go 441 00:23:41,560 --> 00:23:43,040 Speaker 1: for it, and there will be tools to do that. 442 00:23:43,240 --> 00:23:44,439 Speaker 1: But if you want to come here, then we 443 00:23:44,440 --> 00:23:47,680 Speaker 1: will help expose you. But you are right. The 444 00:23:47,680 --> 00:23:51,960 Speaker 1: seesaw, right, is like completely tilting the other way, 445 00:23:52,040 --> 00:23:55,040 Speaker 1: where it's impact and attention versus reach and frequency, and 446 00:23:55,080 --> 00:23:57,760 Speaker 1: all of a sudden, what you're saying is that scale 447 00:23:57,800 --> 00:24:00,520 Speaker 1: doesn't matter if only a small percentage of what's in 448 00:24:00,640 --> 00:24:04,520 Speaker 1: that scale is right. And scale fucked up this whole thing. 449 00:24:04,640 --> 00:24:07,359 Speaker 1: I mean, like, scale did two things, right?
Scale changed 450 00:24:07,400 --> 00:24:09,880 Speaker 1: the way that we create content, right? Scale, literally going 451 00:24:09,960 --> 00:24:13,680 Speaker 1: back to, like, clickbait, and, like, creators now creating content 452 00:24:13,720 --> 00:24:15,720 Speaker 1: based off how fast they can publish or the time 453 00:24:15,760 --> 00:24:17,719 Speaker 1: to publish, and how relevant they can make a headline 454 00:24:17,800 --> 00:24:21,560 Speaker 1: or the content within it. And two, it normalized content. Right, 455 00:24:21,600 --> 00:24:24,040 Speaker 1: it kind of set a level that said news content 456 00:24:24,280 --> 00:24:27,840 Speaker 1: is equal to lifestyle content is equal to video content 457 00:24:27,920 --> 00:24:30,120 Speaker 1: is equal to AR. Even though all of these 458 00:24:30,160 --> 00:24:33,840 Speaker 1: things cost differently, the investments are different, the time is different, 459 00:24:34,000 --> 00:24:35,919 Speaker 1: yet the way they get paid on them, because of 460 00:24:36,080 --> 00:24:39,400 Speaker 1: scale, is the same. So what replaces the CPM? 461 00:24:39,800 --> 00:24:42,240 Speaker 1: So I think, like... look, I don't think 462 00:24:42,280 --> 00:24:44,840 Speaker 1: it's the platform model. And again, you probably have better 463 00:24:44,880 --> 00:24:47,080 Speaker 1: perspective on this than me. I think for brands and 464 00:24:47,119 --> 00:24:49,760 Speaker 1: purchasing, and if you're looking to, like, sell a product, 465 00:24:49,880 --> 00:24:51,439 Speaker 1: right, or get a product in front of people, look 466 00:24:51,520 --> 00:24:54,280 Speaker 1: at the Nike Kaepernick campaign. Like, the value of 467 00:24:54,320 --> 00:24:58,560 Speaker 1: that is huge. So I think what we need to 468 00:24:58,600 --> 00:25:02,080 Speaker 1: be doing, again going back to, like, we're constantly repairing 469 00:25:02,119 --> 00:25:05,400 Speaker 1: old models and trying to sustain and fix them.
So 470 00:25:05,640 --> 00:25:07,840 Speaker 1: what happens to the CPM? What happens to the ad? 471 00:25:07,920 --> 00:25:11,240 Speaker 1: What happens to subscriptions, what happens to bundles, um, you 472 00:25:11,280 --> 00:25:13,639 Speaker 1: see cable, OTT, all of these things? What 473 00:25:13,720 --> 00:25:16,159 Speaker 1: if there's, like, new models or ideas where, like, you 474 00:25:16,160 --> 00:25:19,159 Speaker 1: actually hold... like, your IP is your asset? What 475 00:25:19,200 --> 00:25:21,239 Speaker 1: if people bid, like, what if people bid on your 476 00:25:21,280 --> 00:25:24,000 Speaker 1: IP? But then you could resell that, right? You're 477 00:25:24,080 --> 00:25:26,240 Speaker 1: holding that. That is an asset that you bought at 478 00:25:26,320 --> 00:25:29,159 Speaker 1: a million dollars that is now worth more, right? So, like, 479 00:25:29,480 --> 00:25:32,159 Speaker 1: this economy could open up new things. That's not a 480 00:25:32,200 --> 00:25:35,680 Speaker 1: new concept, right? That is not a fucking new concept. 481 00:25:36,040 --> 00:25:40,920 Speaker 1: What's new about it is it's completely distributed, and it 482 00:25:41,000 --> 00:25:43,840 Speaker 1: goes back to the individual, not just the individual 483 00:25:43,920 --> 00:25:48,879 Speaker 1: to institution, and it's all consumer-based. Most sales representatives 484 00:25:48,920 --> 00:25:52,560 Speaker 1: will come in and they'll say, our platform reaches X 485 00:25:52,600 --> 00:25:56,840 Speaker 1: million people per month. The conversation, to what you're alluding 486 00:25:56,880 --> 00:25:58,560 Speaker 1: to, and correct me if this is wrong, is now 487 00:25:58,640 --> 00:26:05,720 Speaker 1: saying, our reputation amongst our consumers is X. People value creators A, 488 00:26:05,840 --> 00:26:10,600 Speaker 1: B, and C.
At Y. That's why this is premium. Yes, exactly, 489 00:26:10,760 --> 00:26:14,399 Speaker 1: and it allows emerging creators to come about. Right. So, 490 00:26:14,440 --> 00:26:17,720 Speaker 1: going back to why blockchain, of it being an immutable 491 00:26:17,800 --> 00:26:21,119 Speaker 1: ledger of information and an incentive model, it's like, people 492 00:26:21,119 --> 00:26:23,960 Speaker 1: who have ideas that are in deep pockets of this 493 00:26:24,119 --> 00:26:27,080 Speaker 1: earth, that do not know where to send that information 494 00:26:27,440 --> 00:26:30,359 Speaker 1: and do not have the connections, are somewhat scared to 495 00:26:30,400 --> 00:26:32,439 Speaker 1: put their IP out there. Right. The beauty of an 496 00:26:32,440 --> 00:26:35,480 Speaker 1: immutable ledger and blockchain is like, hey, look, before I 497 00:26:35,560 --> 00:26:37,800 Speaker 1: show this to the world, I'm gonna put it here 498 00:26:37,840 --> 00:26:39,320 Speaker 1: so then I could prove that it was mine and 499 00:26:39,320 --> 00:26:41,240 Speaker 1: that it was my idea, so that it cannot be 500 00:26:41,320 --> 00:26:44,479 Speaker 1: ripped off, and I'm not scared to be discovered, right? 501 00:26:44,560 --> 00:26:47,080 Speaker 1: And then you open up this opportunity. So this idea 502 00:26:47,119 --> 00:26:50,320 Speaker 1: of discoverability and authenticity and what's real and what's 503 00:26:50,320 --> 00:26:53,159 Speaker 1: not real are all elements that could be built on 504 00:26:53,200 --> 00:26:55,760 Speaker 1: top of the blockchain.
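[Editor's sketch] The proof-of-existence idea described here, hash your work and record that fingerprint before you show anyone, is simple enough to sketch in a few lines. This is a minimal illustration, not Poet's actual API; the `make_claim` record, the `author` field, and the plain dict standing in for a ledger entry are all hypothetical stand-ins.

```python
import hashlib
import time

def fingerprint(content: str) -> str:
    # Content-addressable ID: the SHA-256 digest of the work's bytes.
    return hashlib.sha256(content.encode("utf-8")).hexdigest()

def make_claim(content: str, author: str) -> dict:
    # A minimal proof-of-existence claim: who, what (by hash), when.
    # A real system would anchor this record on a blockchain; here it
    # is just a local dict standing in for that ledger entry.
    return {
        "author": author,
        "contentHash": fingerprint(content),
        "timestamp": int(time.time()),
    }

draft = "My unpublished idea..."
claim = make_claim(draft, author="Jane Creator")

# Anyone later holding the full text can check it against the claim;
# the claim itself never reveals the text.
assert fingerprint(draft) == claim["contentHash"]
```

The privacy the speaker describes falls out of the hash: the claim can sit in public without disclosing the idea, and only someone holding the exact original text can produce a matching fingerprint.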
And that's a great way to 505 00:26:55,760 --> 00:26:57,760 Speaker 1: think about it, because it doesn't necessarily need to be 506 00:26:57,800 --> 00:27:00,560 Speaker 1: a replacement, but if it's more efficient, if it opens 507 00:27:00,600 --> 00:27:02,680 Speaker 1: up new opportunities, then that should be the goal of 508 00:27:02,720 --> 00:27:06,080 Speaker 1: any emerging technology. Authorship, its ownership: like, how does a brand 509 00:27:06,920 --> 00:27:09,120 Speaker 1: start and say, hey, this is really interesting, I'm gonna 510 00:27:09,160 --> 00:27:11,760 Speaker 1: throw a couple of dollars towards this? What does that 511 00:27:11,840 --> 00:27:16,280 Speaker 1: look like? Great question. Short answer: the best part about 512 00:27:16,320 --> 00:27:20,440 Speaker 1: blockchain is that it is so new and so overexposed 513 00:27:20,720 --> 00:27:22,720 Speaker 1: that likely you do not need to spend too much 514 00:27:22,720 --> 00:27:26,040 Speaker 1: money at all experimenting with it, and I strongly encourage 515 00:27:26,040 --> 00:27:28,000 Speaker 1: a lot of companies to do so, because some of 516 00:27:28,000 --> 00:27:30,479 Speaker 1: the best technologists and thinkers are in this space right now, 517 00:27:30,520 --> 00:27:33,000 Speaker 1: and that is without a doubt happening. Do I need 518 00:27:33,080 --> 00:27:35,399 Speaker 1: engineers, do I need data architects, what do 519 00:27:35,440 --> 00:27:38,600 Speaker 1: I need? Do I need my staff? So a 520 00:27:38,640 --> 00:27:41,359 Speaker 1: lot of it is education, right? Like, what's actually amazing 521 00:27:41,440 --> 00:27:46,720 Speaker 1: is Medium has become this Bible, um, this, like, 522 00:27:47,520 --> 00:27:50,560 Speaker 1: Bible upon Bible of ways of thinking, right? 523 00:27:50,600 --> 00:27:55,680 Speaker 1: The methods behind blockchain: incentive systems, governance systems, creation systems.
524 00:27:55,840 --> 00:27:57,840 Speaker 1: So a lot of it is just educating and learning. 525 00:27:58,119 --> 00:28:01,480 Speaker 1: There's companies like AdLedger that are bringing together blockchain communities. 526 00:28:01,560 --> 00:28:04,880 Speaker 1: That's the MadHive crew, um, and others like IBM 527 00:28:04,880 --> 00:28:06,680 Speaker 1: and so forth that are really thinking about how best 528 00:28:06,680 --> 00:28:08,280 Speaker 1: to use it. I think the way to think about 529 00:28:08,280 --> 00:28:10,640 Speaker 1: it is twofold. One is have a curious mind. 530 00:28:10,720 --> 00:28:12,960 Speaker 1: And we always said this at, like, RebelMouse, and 531 00:28:13,000 --> 00:28:14,919 Speaker 1: I say it now, is like, find that one person 532 00:28:14,960 --> 00:28:17,480 Speaker 1: within your org that is hungry, that wants to do 533 00:28:17,560 --> 00:28:21,359 Speaker 1: something interesting, and let them go. And you don't necessarily 534 00:28:21,359 --> 00:28:23,320 Speaker 1: need to say we need a million-dollar budget or 535 00:28:23,359 --> 00:28:25,560 Speaker 1: we need three engineers to do this. A lot of 536 00:28:25,560 --> 00:28:27,160 Speaker 1: the companies that are here, a lot of the code 537 00:28:27,280 --> 00:28:29,600 Speaker 1: is open source, and they all have resources to work with. 538 00:28:29,800 --> 00:28:32,840 Speaker 1: I could speak, um, personally, right, when it comes to 539 00:28:32,880 --> 00:28:36,280 Speaker 1: Poet, that we are equipped, right, to work with publishers and 540 00:28:36,320 --> 00:28:38,960 Speaker 1: brands to onboard them and help educate them and help 541 00:28:38,960 --> 00:28:42,880 Speaker 1: them learn about this project. Right, I think what the 542 00:28:42,920 --> 00:28:45,400 Speaker 1: best companies in this space will do is build bridges, 543 00:28:45,400 --> 00:28:49,840 Speaker 1: not moats.
Right, we understand that it's new, um, unlike 544 00:28:49,880 --> 00:28:51,720 Speaker 1: what happened with ad tech, which is like, oh, you 545 00:28:51,760 --> 00:28:53,960 Speaker 1: don't get it, forget it, or you need this because 546 00:28:54,000 --> 00:28:56,600 Speaker 1: you don't get it. Whereas, like, let us explain exactly 547 00:28:56,640 --> 00:28:58,680 Speaker 1: why this is important and let us lead you over 548 00:28:58,760 --> 00:29:01,000 Speaker 1: here and then prove to you why it's interesting. Tell 549 00:29:01,080 --> 00:29:02,920 Speaker 1: me what I need. All you need is you. Okay, 550 00:29:02,960 --> 00:29:07,280 Speaker 1: so do I need to... All you need is you. 551 00:29:07,320 --> 00:29:09,160 Speaker 1: Like, you don't need to be an engineer. You just 552 00:29:09,200 --> 00:29:11,760 Speaker 1: have to really think about what makes most sense when 553 00:29:11,800 --> 00:29:15,640 Speaker 1: things become distributed, when they are not focused on one 554 00:29:15,920 --> 00:29:18,840 Speaker 1: central point of failure, when everything could be open, and 555 00:29:18,880 --> 00:29:21,760 Speaker 1: what makes most sense. That's contributor networks. That's a way 556 00:29:21,800 --> 00:29:24,960 Speaker 1: of exposing information. That is a way of delivering information. 557 00:29:25,200 --> 00:29:28,800 Speaker 1: That could be anything, and it's particular to you specifically. 558 00:29:28,960 --> 00:29:31,480 Speaker 1: The other thing is, and I worked on this side: 559 00:29:31,640 --> 00:29:33,960 Speaker 1: don't think of a business case off the bat. Like, 560 00:29:34,600 --> 00:29:37,680 Speaker 1: have fun, experiment. Think of it like something on the 561 00:29:37,720 --> 00:29:39,520 Speaker 1: side, because there's so much that could come out of it.
562 00:29:39,560 --> 00:29:41,720 Speaker 1: And it's the first inning, so if you could be 563 00:29:41,760 --> 00:29:43,440 Speaker 1: on the bleeding edge of thinking about these things, 564 00:29:43,480 --> 00:29:46,040 Speaker 1: you can actually not just build and learn, but you 565 00:29:46,040 --> 00:29:48,160 Speaker 1: can actually construct the way that it's going to work 566 00:29:48,160 --> 00:29:50,800 Speaker 1: in the future. So when you first started at Poet, 567 00:29:50,840 --> 00:29:52,520 Speaker 1: I called you and I said I want to put 568 00:29:52,560 --> 00:29:55,080 Speaker 1: AdLandia on Poet, and we talked about it. So 569 00:29:55,120 --> 00:29:56,880 Speaker 1: now I am going to bring it back up, because 570 00:29:56,880 --> 00:29:58,680 Speaker 1: we have not done it. And let's say that this 571 00:29:58,760 --> 00:30:01,800 Speaker 1: becomes a challenge at AdLandia to go on, get on Poet, 572 00:30:02,360 --> 00:30:05,480 Speaker 1: get distributed. We'll bring our listeners with us and tell 573 00:30:05,520 --> 00:30:07,520 Speaker 1: them, like, what happens and how we're doing and what 574 00:30:07,560 --> 00:30:10,640 Speaker 1: we're learning and all that. It's an official partnership. Done. 575 00:30:10,720 --> 00:30:13,320 Speaker 1: You heard it here first, Jared Dicker. So before you 576 00:30:13,360 --> 00:30:15,920 Speaker 1: go: kill, buy, DIY. What would you kill? 577 00:30:16,080 --> 00:30:18,760 Speaker 1: I think what I would kill is influencer networks. Okay, good. 578 00:30:18,800 --> 00:30:22,320 Speaker 1: What would you buy? Um, what would I buy? I 579 00:30:22,360 --> 00:30:24,440 Speaker 1: would actually buy...
Like, right now I'm thinking about buying 580 00:30:24,480 --> 00:30:26,960 Speaker 1: media companies, that Poet should be buying media companies and 581 00:30:27,000 --> 00:30:29,240 Speaker 1: helping structure them in this sort of way of, like, 582 00:30:30,080 --> 00:30:34,480 Speaker 1: focus on reputation, focus on creators. So I would buy, maybe, 583 00:30:34,520 --> 00:30:37,240 Speaker 1: let's say, the Village Voice. What would I do myself? 584 00:30:38,040 --> 00:30:40,400 Speaker 1: What everyone should be thinking about and doing themselves is 585 00:30:40,480 --> 00:30:43,719 Speaker 1: kind of how best to redefine what their models and 586 00:30:43,760 --> 00:30:45,920 Speaker 1: what their business and value is. I mean, I could say, 587 00:30:46,160 --> 00:30:48,280 Speaker 1: and I'm biased, but what the Washington Post has done 588 00:30:48,320 --> 00:30:51,360 Speaker 1: with Arc: they announced with the Dallas Morning News that they're powering them. 589 00:30:52,160 --> 00:30:53,840 Speaker 1: They'll soon be, I mean, from what it looks like, 590 00:30:53,880 --> 00:30:56,760 Speaker 1: they'll soon be powering, like, all of the local news 591 00:30:56,880 --> 00:31:01,040 Speaker 1: and, uh, newspaper services that we know today, and likely beyond. 592 00:31:01,600 --> 00:31:05,000 Speaker 1: And I think that that's an excellent, um, way of thinking, 593 00:31:05,240 --> 00:31:07,720 Speaker 1: and it's an excellent proof point that if you invest in 594 00:31:07,800 --> 00:31:11,000 Speaker 1: something and take certain risks, then new opportunities open up 595 00:31:11,000 --> 00:31:14,600 Speaker 1: that aren't necessarily limiting and that are interoperable beyond kind 596 00:31:14,640 --> 00:31:16,200 Speaker 1: of what you're doing.
And those are, like, the most 597 00:31:16,200 --> 00:31:21,120 Speaker 1: successful people, right? These entrepreneurs that, um, question everything 598 00:31:21,160 --> 00:31:24,440 Speaker 1: and just are honest with themselves about where they are 599 00:31:24,440 --> 00:31:26,520 Speaker 1: and where they should be. Well, if people want to 600 00:31:26,560 --> 00:31:28,560 Speaker 1: reach you to learn more about the blockchain, where can 601 00:31:28,600 --> 00:31:30,480 Speaker 1: they get in touch with you? Jared at po dot 602 00:31:30,520 --> 00:31:33,800 Speaker 1: et, or Twitter at Jared Dicker, or any Phish show, 603 00:31:33,960 --> 00:31:37,680 Speaker 1: any Phish show. Um, yes, like, yeah, if anyone wants 604 00:31:37,680 --> 00:31:39,840 Speaker 1: to engage or talk about Phish or the Dead or 605 00:31:39,960 --> 00:31:44,040 Speaker 1: Leon Russell or anything on Twitter, that's what I enjoy 606 00:31:44,080 --> 00:31:50,960 Speaker 1: talking about. Thank you, Jared. So big thanks to Jared Dicker, 607 00:31:51,040 --> 00:31:54,720 Speaker 1: who is always blowing our minds. Lots to learn, lots 608 00:31:54,720 --> 00:31:58,400 Speaker 1: to digest, lots to really kind of think about and apply. 609 00:31:58,600 --> 00:32:00,960 Speaker 1: One thing, like, I would say: invite Jared into 610 00:32:01,000 --> 00:32:04,720 Speaker 1: your company. Invite Jared, whether media company, brand, startup. Invite 611 00:32:04,760 --> 00:32:08,200 Speaker 1: Jared into your company to walk your teams through this 612 00:32:08,440 --> 00:32:12,280 Speaker 1: whole new, um, way of thinking and this technology.
I 613 00:32:12,320 --> 00:32:14,520 Speaker 1: think that his whole point about, like, we don't really 614 00:32:14,520 --> 00:32:17,320 Speaker 1: have competitors, because we're all kind of moving towards this, 615 00:32:17,760 --> 00:32:21,680 Speaker 1: you know, idea and technology of greater good and real 616 00:32:21,720 --> 00:32:27,040 Speaker 1: trust and transparency, etcetera. He wants people to understand this 617 00:32:27,160 --> 00:32:30,240 Speaker 1: and he wants to bring people on board, so call 618 00:32:30,360 --> 00:32:33,400 Speaker 1: him. So with that, big thanks to our producer Dana, 619 00:32:34,240 --> 00:32:37,600 Speaker 1: Jacob Weisberg, Andy Bowers, Matt Turk, and all of our 620 00:32:37,640 --> 00:32:39,840 Speaker 1: friends and family at Panoply. We'll be back in two 621 00:32:39,840 --> 00:32:46,840 Speaker 1: weeks. Bye, AdLandia. Full disclosure: our opinions are our own.