Speaker 1: Bloomberg Audio Studios, Podcasts, radio news.
Speaker 2: Hello, hello, hello, welcome to On Air Fest twenty twenty six. Thank you so much. I'm Kristen Meinzer. I'm your MC for the day. And in case you don't know who I am, I'm a podcaster, I'm a royal watcher, I'm a culture critic. I've been very busy over the last week with some royal news you may have heard of, so you may know me from those things. But today is not about me. It is about the fantastic talent that is going to be on the stage and at this festival. Let's get to our first live taping of the day, Bloomberg's Odd Lots podcast. The hosts of Odd Lots are Joe Weisenthal and Tracy Alloway. Please welcome them to the stage, along with their very special guest, Henry Blodget. Thank you so much. Welcome to the stage.
Speaker 1: Thank you so much.
Speaker 3: And hello and welcome to a live episode of the Odd Lots podcast.
Speaker 4: I'm Joe Weisenthal and I'm Tracy Alloway.
Speaker 3: Tracy, we're definitely not going to do this, but it definitely feels like almost
Speaker 1: every episode could be about AI these days. Right? Like, it's just touching...
Speaker 3: So all the things we talk about are slowly getting subsumed into this topic. We're not going to do it, but we probably could.
Speaker 4: It's not just that every episode could be on AI, it's that every episode could probably be made by AI at this point, right? And in fact, we've had a few people... last year, I remember there were a few people who made AI-generated episodes of the podcast, and they weren't that bad. And I have to think if they did it today, they would be even better.
Speaker 1: It would definitely be better. Anyway, not better than us, but they would be better than they were. I'm not sure they'd be better than they were a year ago anyway.
Speaker 3: So we have to talk about AI and what it is doing to the market, finance, economics, et cetera.
Speaker 3: And of course we have to talk about what it means for us in the media industry. And I'm really excited to say we do have the perfect guest.
Speaker 1: We're going to be
Speaker 3: speaking here with Henry Blodget, the CEO of Regenerator. He used to be my boss when I was at Business Insider, someone who has a background in both media and, of course, Wall Street and finance, so the perfect person to talk about all the different dimensions.
Speaker 4: Truly the perfect guest.
Speaker 3: All right, Henry, thank you so much for coming back on Odd Lots.
Speaker 5: Thank you for having me.
Speaker 3: So, the last time we talked was in May of last year, and the theme then was, everyone was talking about, here are these huge valuations for AI companies, how could they possibly be justified? And that was an interesting conversation. And it feels like the narrative has now flipped one hundred and eighty degrees, where it's like these companies are so powerful they're going to destroy every other company in their wake, and every software company, payments provider, et cetera, has been like plunging really since the start of the year. How have you updated your views? Where are you now versus where you were back in May?
Speaker 5: We are early in the development of the industry. This looks a lot like the Internet in the nineteen nineties. If you remember, basically, when you have a huge opportunity like this early on, you can have a huge range of reasonable predictions about what is going to happen. And here, one day it's catastrophe: none of us are ever going to have a job again, everything's going to crater. Other days, hey, it's just a fancy word processor, no worries, OpenAI is way overvalued. I will say that since we talked last, I believe that we were aghast at the OpenAI three hundred billion dollar valuation. I believe the latest one is eight hundred billion. So we are climbing.
Speaker 5: But there are other companies out there that are worth many times that. So if one does think that AI is going to take over the world, and OpenAI is the Google of the AI era, which is the thesis, which I think is very misguided, we can get into, then one can imagine that a company might be worth many multiples of some of the companies that are out there.
Speaker 4: So one of the crazy things that happened this week, speaking of AI and media, is there's a guy, we've had him on the show, James van Geelen, aka Citrini. He runs a hedge fund and also publishes an investment newsletter. He basically wrote a doom case scenario of the year twenty twenty eight, where everything had become AI, and because everything was AI, no one had any jobs, no one was actually buying anything in the economy, and we basically had an economic and financial crisis. When I read that piece and then saw the market reaction to it, so stocks fell pretty dramatically, and people were attributing it to basically a newsletter that was a thought piece, I thought that was kind of insane. And it speaks to what you just said, which is, when we're in this really uncertain time, you can go from sort of euphoria to doomerism very, very quickly.
Speaker 5: Exactly, because it's all predictions. We're all looking into the hazy future saying, how big is it going to be? These small changes in your assumptions make a huge difference in the current value. I will say, because that research report was both heralded and pilloried, this is exactly what you want analysts to do. You want them to look out and take a strong position and say, here's what it can look like. And I thought it was a fascinating report. I will point out that even though, as you just characterized it, this is economic armageddon, one of the first sentences of the report says ten percent unemployment, which means ninety percent employment.
Speaker 5: Most of us still have a job, and yes, there are changes in industries, but most of us are still working. So I was surprised, after hearing that this predicts catastrophe, when I picked it up. It's like, well, no, it predicts a big technology change that sometimes takes the economy a while to digest, but not armageddon.
Speaker 3: What do you think, though, it says about the market right now? I mean, it's kind of an intersection of a market story and a media story, because here's a Substack piece that really did seem to move the market
Speaker 1: quite a bit on Monday, right? Yeah.
Speaker 3: What does it say about the market environment? Because you're right, the software stocks have been getting clobbered all year, and are AI agents, I mean, we'll get into that, gonna somehow make it so we don't need to buy regular business software anymore? I don't totally understand that, but this has been the fear all year. What does it sort of say about the market slash media environment that a Substack piece like this could just go absolutely megaviral like that and move the market?
Speaker 5: Everybody is twitchy. Valuations were high to begin with. We're all reevaluating our views every day, saying, am I crazy thinking it could be this? Maybe I'm crazy, I have to panic a little bit. So we've seen stocks roll over. The multiples are compressing a little bit. Again, so far from armageddon, it's people saying, okay, there may be some change here. And to your point, the big bear case now is that, oh my goodness, Claude Code is so good, no one will ever buy software anymore. You'll just say, hey, make me the software.
Speaker 1: Yeah, I've done that.
Speaker 5: Yeah, you have. You're a big experimenter. Yeah, but I would say you were probably not spending billions of dollars on enterprise software before.
Speaker 3: I do. I don't know, thousands of people on my payroll who need to get paid on Friday, you know, exactly.
Speaker 5: You depend on this. And one of the nice things about having a company standing behind a product is you actually can hold them accountable for it. And maybe Claude Code doesn't make your enterprise software the UI you want, somebody doesn't get paid. So as I hear that, to me, that is hysteria. I don't understand it. Software companies are not going away. Companies that are not in the software business are not going to suddenly say, oh, let's have a junior person who understands AI build all our software for us. It is not going to happen. But overall, big changes coming, lots of disruption. Do we know for sure what that's going to do to the profits of these companies and the change in the economy? We do not. And that is called the discount rate getting higher. It feels riskier. And on that day, Monday, everybody was feeling suddenly risk averse. Today, I'll point out as we tape, everyone's saying, no problem, off to the races again.
Speaker 4: You've obviously lived and been very active and influential in a previous technological, I guess, cycle of disruption. What do you think people are getting wrong here, especially on Wall Street, when it comes to their approach on this particular one, AI versus dot-com?
Speaker 5: I think this goes way beyond Wall Street. I think there really is a big camp still, and the case gets made and then it gets tamped down, and then it comes back even more strongly, that there really will be no jobs for humans anymore. And what I would say to that is, I think the bigger risk with AI is actually that somebody invents an agentic system that actually gets out in the world and does a lot of damage. I think that is a bigger risk. If you look at technology job transitions over time, like the big one, which is agriculture to industrial.
Speaker 5: Two hundred years ago, ninety-three percent of us worked on farms. Now it's two percent. That's a big transition, and yet all through that, the number of jobs grew. And same thing with things like telephone operators, with steam engines. You look at any piece of it, there is disruption and pain, and I'm not minimizing that for the people whose jobs are changed or disrupted by it. But the economy overall, so far in history, has gone on to create a lot more jobs. And so when you see the lay-waste-to-jobs predictions, you don't also often get the other side, which is, how many new jobs are going to be created?
Speaker 4: Well, this is exactly what I want to ask you, because we hear that all the time during technological advancement cycles: new jobs are created. But I think the difficulty with AI is that it's hard to see what those new jobs are
Speaker 2: going to be.
Speaker 4: Yeah, like, what are we better at than a supercomputer that can code a program or write an article in less than a minute?
Speaker 1: Or mimic a voice and do a podcast.
Speaker 5: Okay, so where are we, three years in at this point? These job doom predictions started twenty minutes after ChatGPT came out. Where are we? Maybe a little bit at the margin we're seeing something, maybe, but maybe it's still that companies way overhired after the COVID boom and are now sort of right-sizing themselves. As you say, I can go into, I can go to Google, I think it is, drop in a topic and say, make this into a great Odd Lots podcast.
Speaker 1: Some people have said that Google...
Speaker 3: Multiple people think that Google's NotebookLM, which does that, creates a podcast out of a document...
Speaker 1: Multiple people have said they
Speaker 3: trained this on Joe and Tracy. Like, this sounds like... other people are claiming that too, but yes, they could have.
Speaker 5: I'm just saying, you guys continue to grow.
Speaker 5: All I hear more and more about is, oh, you gotta listen to Odd Lots. Growing and growing, growing, growing. And I don't think, we were talking earlier, that you've radically changed the way you make the show. We're still recording it here. You didn't just cue up LLMs. Who is going to rush to do the big interview with, Google? They're not. They're gonna come do it with you guys, because they love you, and your audience loves you. So I'm just saying, even in something that seemingly looks so ripe to be destroyed, you guys cost money. Why is Bloomberg spending it? It's a way, right, to defend it. I'm just saying, optimistic.
Speaker 1: So, I mean, I hear it both ways, and thank you for saying that.
Speaker 3: On the other hand, you know, people might not have the same affinity for someone who does claims adjusting at an insurance company. They don't really care if they have the human touch for that, or tech support. I want to ask a related question. It also relates to the piece that came out on Monday, and it relates to this question of whether OpenAI could be
Speaker 1: the Google of the next generation.
Speaker 3: One of the arguments that the piece stipulates, and I think it sort of fits with what I have observed, is that AI doesn't seem to have network
Speaker 1: effects in the same way.
Speaker 3: So it's like you could switch from, oh, ChatGPT to Claude with almost no issues, and it's just not an issue. Whereas after the first time I used Google in two thousand, I never used Yahoo again. That was the last time I ever
Speaker 1: used another search engine.
Speaker 3: This seems, this is one of the arguments in the piece, that there isn't gonna be the same stickiness and network effects in the AI era in the way that defined the last twenty-five years of the Internet. And so when you think about, like, OpenAI versus Google this year versus the dot-com era, does that resonate with you?
Speaker 5: I think that what folks who would say that OpenAI is the Google of the era would say is that more and more, you're gonna build more of your life into it. It's going to know everything about you, it's going to customize, so the switching costs are going to grow. But to this point, they have not. And I'll tell you the thing that really made me think they were in trouble, and I still think that, given the valuation and given the general consensus that they've already won: it was when Google Gemini came out and people said, oh, it's better. Yeah, they passed them. And all the bulls said, oh, just wait till the next version of OpenAI comes out. I will say, if you go back into the nineteen nineties, Amazon was very controversial. A lot of people thought it would go bankrupt. It's just a bookstore. Barnes and Noble is gonna put them out of business as soon as they have a website, or Walmart, or whatever. Walmart and Barnes and Noble never got anywhere near Amazon. Amazon was so far out ahead always, and still almost went bankrupt, by the way, but nobody was ever close. And in this case, less than two years after the product comes out, the incumbent has caught up. To me, that's when I said, like, whoa, okay, maybe this actually is going to be like the Internet, which is, the first five years, all of us, including me, were saying, like, oh, just buy the leaders, no problem, like they're the ones that are gonna survive. You know how many survived? Of that first five hundred companies that went public in the nineteen nineties, one went on to actually make investors a lot of money after the crash. That was Amazon. Two others that I know of: Cisco, after twenty-five years, finally got back to its stock price. eBay did okay for a while and then kind of sputtered out a little bit. That's out of hundreds and hundreds of companies. So I think it's very possible that, yes, AI is huge. It's a great idea.
Speaker 5: We all want it, and somebody is going to come along soon and do to OpenAI what Google did to Yahoo, which, by the way, Yahoo was the big search engine winner, and then Google destroyed it. So I just think it's way too early to say that OpenAI is the Google.
Speaker 4: We started by mentioning the OpenAI valuation, which everyone was aghast at last year. Now maybe not so much. But even if you assume a future where OpenAI or whoever comes in and disrupts every single industry, and there are thousands of employees and companies that are paying two hundred dollars a month or whatever to access the new LLM, whatever the new model is, OpenAI, at least for right now, is still losing money on every customer that's, like, a power user. So how does that actually translate into a working business model?
Speaker 5: It's a good question. I think what the bulls would say is, the cost of what they do is going to plummet, and so the lines are going to cross here pretty soon. If they own the whole market, suddenly they start to become incredibly profitable. We are not there yet. And I think when we talked last time, at a three hundred billion dollar valuation, which everybody was horrified by at that point, if you looked at the projections, the projections were, three years from now they'll do one hundred billion of revenue. And if you at that point were looking at it saying, okay, if I believe that, then it's three times revenue, and the economics will probably change, it starts to look a little more reasonable. Now we're at eight hundred billion, so it's a much bigger multiple, and you've got to actually have a much bigger revenue stream and profits. But that's the argument. I don't see it yet. I gotta say, the economics are incredibly difficult for them.
Speaker 1: They seem brutal. And there's another element too.
Speaker 3: So obviously, unlike during the Web two point zero era, where there was, like, cheap, plentiful compute, obviously the capital expenditure budgets
Speaker 1: of these companies are just eye-watering. They're spending like crazy.
Speaker 3: There's also this other element that I think is interesting, which is, at least for a few of these labs, like, the people who founded them are true believers. Like, they think they're on a mission to build AGI that's going to change everything. Build God, right, build a digital God. And it's like they're not going to stop at anything. They're not gonna be like, you know what, it's pretty good, let's tap the brakes, we're pretty happy with how this model works.
Speaker 1: You know, I don't, or at least I don't get that vibe.
Speaker 4: Also, they've couched it as this, like, existential race, like there's only gonna be one company that dominates in the end, and so the upper limit on spending to become that company is infinity.
Speaker 5: Yes. However, there is a limit to money. And when you look at when people are raising money, you watch where they're going for it. Sure, the VCs, that's one pool of money, and then the big strategics, and then the VCs who missed the first time and are feeling so embarrassed about it they'll pay up to get in. Then it is the international funds that are watching all this stuff. Yeah, exactly. So at some point you run out of money, and we are talking about a company that is raising hundreds of billions of dollars just to keep the engine running and just to keep pace. This is the other problem. Google, Facebook, Microsoft are generating tens of billions of dollars of free cash flow every year that they can use to buy chips. OpenAI has to raise it in the open market. So if those economics don't start to change soon, they're gonna run out of money.
Speaker 4: Joe and I were talking about this before we came on stage, but another thing that happened recently was, there is a karaoke machine making company that announced it was now an AI company, which, when I see headlines like that, reminds me a lot of the crypto boom, but also reminds me a lot of the dot-com boom. And we are starting to see more and more of those. Wasn't there a toilet company recently?
Speaker 3: The toilet one is legit. So, like, apparently... no, no, no, for real. It was not like, oh, we're an AI company. Apparently it's a Japanese toilet company, and they have some, like, porcelain or something like that, it's very important for, like, semiconductors or something. So everyone's like, oh, get out of the toilet business, you have this access to this material.
Speaker 1: But so that one was kind of legit. The karaoke one...
Speaker 4: Okay, okay, so, yeah, the toilet one maybe, the karaoke one... When you see headlines like that, what do you think? Do you think, like, this is a rational pivot into a booming market and everyone should be experimenting with AI at the moment? Or do you think, like, nah, this is getting crazy?
Speaker 5: I think we are in a euphoric bubble period where we just discovered this amazing new thing. Nobody knows what the future is, nobody knows what's gonna work, and we are effectively creating this enormous R and D lab, which is, hey, you can do something? Here's some money, go out and experiment, I hope you'll be one of the winners. Fifty to one hundred years ago, all of that stuff happened in labs, like Edison's lab or Bell Labs, inside these big corporations. You never saw it. Now everything is a startup. You see it. It's happening in front of you. But there's no question that most of the companies are gonna fail. Happens again and again and again. And I think what investors are saying when they see that is, okay, you know, that is a quick flip. It's, hey, I'll get it.
Speaker 5: Other people, a lot of other people, love AI. They'll buy it after I do, and it'll go up, and then I'll sell it.
Speaker 3: I want to ask you a question about capital markets and how they're different from the late nineties and the dot-com bubble, because one really big thing that's happening these days is, yes, these companies are private, but there is a lot of stock floating around, and there are these SPVs that will acquire a chunk of Anthropic and sell it on a secondary market, and people are trying to get in. They're much more liquid than a private company would have been in the late nineties.
Speaker 1: I had someone, someone messaged me.
Speaker 3: They're like, do you want to buy a little... I'm like, no, I don't do that. But they're like, would you like to buy a little bit of Anthropic at a three hundred and fifty billion dollar valuation? You must get a bunch of those sorts of messages from people. And I'm curious about your perspective on this world, because the other thing is, like, there are no financials. You have no idea, like, what their P and L is for these private companies. What's your take on some of this quasi-liquid private market stock?
Speaker 5: Yeah, and let us just say, then, when our phones are ringing... yeah, I know, that's the right thing.
Speaker 1: Money out there. It's like, if someone is reaching out to me, it can't be that much money.
Speaker 5: Big, big warning signs going on over there. So I think one of the things that's changed versus the dot-com boom in the nineteen nineties, and before that, is companies used to go public very early before, and you could invest in Amazon, for example, at a four hundred million dollar valuation. Sounds like a lot; it's worth trillions of dollars now. There's been incredible appreciation. Individuals can't do that anymore, and there's good reason for it. People lost a lot of money after the dot-com crash. We didn't like that. We want to protect everybody.
Speaker 5: So now companies go public a lot later, with the threat of lawsuits and all of the different regulation that you've got to deal with, so it's later. But the bummer about it is, a lot of investors who actually do have the risk tolerance can't access them. And so, yes, you've had a new business that has been built around, okay, let's get that private stock, somehow get it, I think. And the problem is, again, if you choose right, great company, it works. But there are more fees packed around it. It's much more, you know, that intermediary that called you has to get paid, so that's somebody else. It's more difficult to trade. So I think it is the market trying to solve the problem that it's much harder to make money in OpenAI at eight hundred billion, or Anthropic at three hundred and fifty billion, than it is in the seed round when it was a few hundred million. And so you've got a lot of people clamoring to get in. But it's the same problem. In effect, fewer companies go public. When they do, they are releasing a lot of information, whereas these guys are not.
Speaker 4: Should we do some media stuff? All right, let the media navel-gazing begin. Okay, so we started out by saying that, you know, we were talking about the Citrini piece that went viral, but it also got a backlash. You did your own AI newsroom experiment newsletter on Substack that also got a bit of a backlash. Were you surprised at that one?
Speaker 5: Yes, I was surprised. Basically, what I did, I left Business Insider, I started this new thing, Regenerator, and a podcast too. Trying your business...
Speaker 4: It's very difficult. Talk about... this is the most important aspect of podcasting: the podcast.
Speaker 5: Exactly, yes, a podcast called Solutions. Check it out? Yes, please. So I left and it was just me. So I said, okay, great, let's see what it can do. Because if you try that as a CEO of a company, people freak out, justifiably.
Speaker 5: Oh my goodness, they're going to discover something, my job's threatened. We are all worried about our jobs being threatened. Including, by the way, do we need articles anymore? Articles? I mean, boy, there are a lot of great articles written every day. In fact, this is the problem in media, there's way too much of it. Let's just get that out there. Anyway, so yes, I experimented with everything, and the first thing that I did was, oh, if I were actually starting a publication and hiring a team, who would I hire? Well, I'd hire five people. I'd have a managing editor, and I'd have a few reporters and so forth. So I asked ChatGPT, can you help me? ChatGPT said, sure, let's make them. So in two hours we made them. They all had different personalities, they had headshots, et cetera. I wrote this up. We had, like, a Slack going where we're all talking about stories, and it was pretty good, not spectacular but pretty good, and so that was cool, so I wrote about it. Yes, I also made the mistake of complimenting one of the headshots and was immediately lambasted for that, which actually helped me hone my own AI morality.
Speaker 4: Well, I have a question on this. If you had had a human editor for that piece, do you think you would have complimented your AI-generated managing editor on her appearance?
Speaker 5: The piece needed to be edited. This is the problem. You're out on your own, you don't have great editors protecting you anymore, you let your enthusiasm bubble over a little bit too much. You know who I have edit my stuff now sometimes, if I feel like I'm maybe a little bit too feisty or enthusiastic? ChatGPT or Claude, and they're very good, they spot it.
Speaker 3: Anyway, talk to us about that newsroom dynamic particularly. You know, every editor at a newsroom, every newsroom right
Speaker 1: now is, like, trying to figure out something, right? You can't, like, alienate
the entire newsroom and say, like, oh, you all have to be doing this. Yeah.
Speaker 1: Like, that doesn't work.
Speaker 3: The newsroom obviously hates it, and I would say probably for good reason. But on the other hand, everyone has to be, like, figuring stuff out. What should newsroom executives, whether it's editors-in-chief or whatever, how should they be thinking about solving this problem?
Speaker 5: So, I mean, going back, the other thing that I learned from publishing that was just how much anxiety there is, especially in the younger generation, about jobs that have been there for a long time that are just disappearing. And that was extremely clear to me, and I would have been even more... Yeah, I understand it. I mean, it is scary. And so what does a newsroom do? I think, let's look at what's going on in media. The problem, again, is there's way too much media. There are so many amazing articles, for example, produced every day. When I open the New York Times or the Wall Street Journal or Bloomberg, I want to read dozens of articles. I might get to one or two. And distribution has changed radically. Five years ago, Facebook, Google were driving enormous distribution to lots of different publishers. That is all now changing. We are going back to something that looks like the nineteen nineties, where actually you have to have a direct relationship with your subscriber. The industry is under a ton of pressure, both from advertising, but also we have maximized the amount of time as humans that we can spend consuming media. This is very different from twenty years ago, when we started Business Insider. You remember this. Like, in the nineteen nineties, media was very mature.
Speaker 1: He says twenty years ago. By the way, that's, like, thirty years. Oh, go figure.
Speaker 5: Okay, but let's go back to the nineteen nineties, even farther. Media was very mature. Magazines, totally mature, very hard to start one.
Speaker 5: You needed fifty million dollars and you needed to hire all these great people, and then maybe five years down the road you might, if you're lucky, become profitable. Very hard. Very hard to get into newspapers. Television, even worse. You're a slave for a long, long time, getting coffee. So very, very tough. Then desktops came along and got connected. So that was suddenly twelve hours a day that people are sitting in the office, they need to know what's going on. It's a new opportunity. Then phones came, and that's the thing that really drove Business Insider in the early days. We're standing around, we want to know what's happening, nobody else is serving that, so big, big opportunity. But over time, all the folks in the industry caught up. The New York Times is terrific, so is the Wall Street Journal, so is Business Insider, so is Bloomberg. These companies are really, really good at this now. And if you just look at, for example, political coverage, we don't need one hundred White House reporters. We just don't. And yet that's the thing that everybody's interested in, so of course you're going to have a reporter focus on that. And yet there's just too much capacity. And then the other thing that happened, and I'll finish up very quickly, is the Internet blew up the protective moat around all of the nineteen-nineties-era media. Print and TV used to be completely separate. Now they're the same. You used to not have a television company be able to own a newspaper. Now we've done away with that. Everybody can own everything, everybody can do everything, and it's all a click away. So it has become even more intensely competitive. And so it's never been an easy industry, but we are back to it being an incredibly, intensely competitive industry, and I do think there's just, in general, too much of it.
Speaker 4: So when we think about the impact of AI on media, it feels to me like people are talking about two paths here. And one is where, you know, the big chat platforms basically become the front page of the internet, right? And you type in whatever query you have today, how did the State of the Union address go last night, or something like that, and it spits out a bunch of information that's based on a bunch of articles that you, yourself, are not paying for. And then the other thing that I hear is, well, actually, in the age of Internet slop, AI slop, that distribution and quality are going to become even more important. So people are going to want to go to the New York Times, or hopefully to Bloomberg, or wherever, and get that take on last night's events. Where do you stand on that? Sort of... you know, it's a very binary outcome to me. Which way do you think we're going?
Speaker 5: I think this is great for the brands that make it. And again, nobody has a right to exist, it's going to be a fight. But for the brands that make it, it is terrific. I trust Bloomberg. I read a Bloomberg article, I know the reporter knows what they're doing and they know the subject. I know there is editing there. Occasionally there's a mistake, but it is an honest mistake that will be fixed quickly. Compare that to what I see on social media, especially now, where AI can create photos and videos and everything else. I don't know. I know the New York Times will immediately dive in if something is bubbling up, some video that's apparently scandalous that we've all got to look at. I know they'll do work on it. Maybe we'll sometime get to the point where they can be fooled. Not yet. And so I trust them. And you actually need a brand and a great organization to do that. And people say, well, yes, but everybody out there, you know, who knows what's true, and it's over. That has always been the case.
At its peak in 611 00:30:26,400 --> 00:30:30,840 Speaker 5: the print era, a million people read The New York Times. 612 00:30:31,640 --> 00:30:35,480 Speaker 5: That's it. Everybody else heard what was said from their friends, 613 00:30:35,880 --> 00:30:38,560 Speaker 5: or they heard something else, or they never thought about it. 614 00:30:38,640 --> 00:30:41,320 Speaker 5: And so the idea that, yeah, there's stuff out there 615 00:30:41,400 --> 00:30:44,720 Speaker 5: that some people believe? There's always been stuff out there 616 00:30:44,720 --> 00:30:47,400 Speaker 5: that some people believe. So I think it's great for brands, 617 00:30:47,400 --> 00:30:48,720 Speaker 5: but I do think it's gonna be a fight. It's 618 00:30:48,760 --> 00:30:50,680 Speaker 5: never going to be an easy industry. 619 00:30:50,840 --> 00:30:54,160 Speaker 3: Talk more about this question, though, of, like, okay, everyone, 620 00:30:54,200 --> 00:30:55,800 Speaker 3: you know, the New York Times, like anyone else, they have 621 00:30:55,840 --> 00:30:57,480 Speaker 3: to figure this out, right? They have to figure out, 622 00:30:57,480 --> 00:30:58,880 Speaker 3: like, how can they leverage it. 623 00:30:59,040 --> 00:30:59,960 Speaker 1: I'm doing scare quotes. 624 00:31:00,560 --> 00:31:02,400 Speaker 3: I hate that word, you know, but it's, like, the 625 00:31:02,440 --> 00:31:04,440 Speaker 3: word that everyone uses. Could AI be leveraged in some 626 00:31:04,480 --> 00:31:07,640 Speaker 3: way in the context of the traditional newsroom? 627 00:31:07,960 --> 00:31:09,440 Speaker 1: Yeah. 628 00:31:09,440 --> 00:31:11,280 Speaker 5: And so, going back to your question, which I realized I didn't 629 00:31:11,520 --> 00:31:14,080 Speaker 5: get to before. So what should news... 630 00:31:14,000 --> 00:31:15,040 Speaker 1: How do they handle that? 631 00:31:15,080 --> 00:31:17,880 Speaker 3: Like, because everyone hears, you have to, like, run your 632 00:31:18,000 --> 00:31:21,240 Speaker 3: text through this chatbot, or train the 633 00:31:21,280 --> 00:31:23,320 Speaker 3: AI that's going to replace whatever, and they feel, you know, 634 00:31:23,760 --> 00:31:26,720 Speaker 3: a very reasonable fear. How can anyone overcome that? 635 00:31:26,960 --> 00:31:30,280 Speaker 5: So I think there is the opportunity for there 636 00:31:30,320 --> 00:31:33,040 Speaker 5: to be a lot of productivity enhancements in what we 637 00:31:33,120 --> 00:31:36,280 Speaker 5: have today. We were talking earlier: you both use 638 00:31:36,600 --> 00:31:40,200 Speaker 5: ChatGPT for research. Me too. It's great. That is 639 00:31:40,360 --> 00:31:44,520 Speaker 5: one use of media: me 640 00:31:45,320 --> 00:31:47,920 Speaker 5: looking for information, going out. It used to be that 641 00:31:47,960 --> 00:31:50,120 Speaker 5: I would search for articles. Now I do it with 642 00:31:50,200 --> 00:31:54,480 Speaker 5: Claude or ChatGPT. That is a smaller piece than 643 00:31:54,920 --> 00:31:58,840 Speaker 5: what most media has lived on forever, which is: I 644 00:31:58,840 --> 00:32:02,720 Speaker 5: don't know what I'm interested in. Whoa, that's an interesting story. 645 00:32:03,040 --> 00:32:06,840 Speaker 5: And that's the serendipitous consumption of media. That's why newspaper 646 00:32:06,840 --> 00:32:10,400 Speaker 5: headlines have been like this forever. It's why magazine covers matter.
647 00:32:10,800 --> 00:32:12,840 Speaker 5: We don't know what we want until we see it, 648 00:32:13,000 --> 00:32:16,600 Speaker 5: and then we get it. And it's why in our 649 00:32:16,680 --> 00:32:20,440 Speaker 5: business there are many different talents you need to be 650 00:32:20,680 --> 00:32:25,520 Speaker 5: extraordinarily good. One is, sure, reporting, accuracy, expertise, 651 00:32:25,600 --> 00:32:27,600 Speaker 5: and so forth. You also have to know what people care 652 00:32:27,640 --> 00:32:30,520 Speaker 5: about and what a good story is and why it matters. 653 00:32:31,160 --> 00:32:34,120 Speaker 5: And I don't think that changes. And so where do 654 00:32:34,160 --> 00:32:38,920 Speaker 5: we look around the industry? Research, drafting stories? Why not? 655 00:32:39,320 --> 00:32:41,600 Speaker 5: And when people are aghast at that, let me just 656 00:32:41,640 --> 00:32:45,600 Speaker 5: say, back in the print era, for newspapers, especially a 657 00:32:45,680 --> 00:32:48,600 Speaker 5: daily newspaper, what would happen is the reporters would be 658 00:32:48,640 --> 00:32:51,480 Speaker 5: out in the field, they would get information, they would 659 00:32:51,760 --> 00:32:54,840 Speaker 5: call it in to the rewrite desk verbally. They didn't have 660 00:32:54,840 --> 00:32:57,080 Speaker 5: anything to write it with. They'd just say, here are the facts. 661 00:32:57,280 --> 00:32:59,760 Speaker 5: The rewriters would write it. And in the last twenty 662 00:32:59,840 --> 00:33:03,040 Speaker 5: or thirty years, we who have been trained in 663 00:33:03,120 --> 00:33:05,280 Speaker 5: writing, and we value it, and we go to Columbia 664 00:33:05,320 --> 00:33:11,600 Speaker 5: Journalism School, we have conflated reporting and writing as journalism. 665 00:33:11,800 --> 00:33:14,400 Speaker 5: And I do think there are opportunities now where, okay, 666 00:33:14,400 --> 00:33:17,440 Speaker 5: the writing lift may be lightened by this. And 667 00:33:18,000 --> 00:33:20,880 Speaker 5: I'm not saying, say, here's the article and just publish it. 668 00:33:21,160 --> 00:33:23,840 Speaker 5: I'm just saying, you know what, Perplexity is pretty good 669 00:33:24,280 --> 00:33:29,280 Speaker 5: at writing a well-informed, well-sourced story in six seconds, 670 00:33:29,880 --> 00:33:33,040 Speaker 5: and maybe that actually accelerates what you're doing, and then 671 00:33:33,080 --> 00:33:35,360 Speaker 5: maybe you can add your piece to the top or 672 00:33:35,360 --> 00:33:37,360 Speaker 5: what have you. So I do think there are uses for it. 673 00:33:37,120 --> 00:33:39,760 Speaker 4: So I actually agree with that argument. 674 00:33:39,840 --> 00:33:42,959 Speaker 4: I think, like, social skills and investigative skills are going 675 00:33:43,000 --> 00:33:45,800 Speaker 4: to become more important in the age of AI. However, 676 00:33:46,320 --> 00:33:48,320 Speaker 4: what you always hear is, well, you're 677 00:33:48,360 --> 00:33:50,560 Speaker 4: going to need to produce scoops. Right? You're going to have 678 00:33:50,600 --> 00:33:53,400 Speaker 4: to find the stuff that the models don't already know about, 679 00:33:53,440 --> 00:33:56,080 Speaker 4: and that's going to be the valuable aspect of journalism.
680 00:33:56,560 --> 00:33:58,680 Speaker 4: But the problem seems to be that those scoops get 681 00:33:58,680 --> 00:34:02,840 Speaker 4: commodified so freaking quickly that I'm not sure that actually 682 00:34:02,920 --> 00:34:05,760 Speaker 4: generates much value for the news organization. 683 00:34:06,320 --> 00:34:10,480 Speaker 5: It is very hard to protect news. You can't. But 684 00:34:11,600 --> 00:34:15,719 Speaker 5: the organizations that produce it all the time are gonna 685 00:34:15,760 --> 00:34:18,400 Speaker 5: have a huge advantage, because it's going to be worth 686 00:34:18,560 --> 00:34:21,360 Speaker 5: subscribing to them. And that's where the New 687 00:34:21,440 --> 00:34:23,839 Speaker 5: York Times, the Wall Street Journal, and Bloomberg, that's where it's 688 00:34:23,880 --> 00:34:27,120 Speaker 5: coming from. And Bloomberg's an interesting one, because you really 689 00:34:27,239 --> 00:34:33,280 Speaker 5: serve a somewhat small but highly committed base, where Bloomberg 690 00:34:33,360 --> 00:34:35,439 Speaker 5: reporters are breaking stuff all the time that allows people 691 00:34:35,480 --> 00:34:37,759 Speaker 5: to make or lose money. They're going to care, and 692 00:34:37,800 --> 00:34:41,160 Speaker 5: they're gonna have that terminal, and ChatGPT is not doing that. 693 00:34:41,400 --> 00:34:43,520 Speaker 5: So part of what I hear you say is, wait 694 00:34:43,560 --> 00:34:48,800 Speaker 5: a minute, it has to be differentiated. Yeah, it does. Actually, 695 00:34:48,840 --> 00:34:50,920 Speaker 5: there's a lot less value than there used to be 696 00:34:51,200 --> 00:34:55,120 Speaker 5: in somebody said something interesting, let's write it in an 697 00:34:55,160 --> 00:34:57,040 Speaker 5: intelligent way, do a little more work, and get it 698 00:34:57,080 --> 00:35:00,440 Speaker 5: to somebody else. That is not as useful anymore, for 699 00:35:00,480 --> 00:35:04,920 Speaker 5: the reason that you describe. But those companies, if they 700 00:35:04,920 --> 00:35:06,919 Speaker 5: want to compete with each other, and they do, 701 00:35:06,960 --> 00:35:09,279 Speaker 5: ChatGPT and Claude right now, they are going to have 702 00:35:09,320 --> 00:35:12,160 Speaker 5: to pay for that information. And that's where a Bloomberg 703 00:35:12,280 --> 00:35:15,160 Speaker 5: is in a great position. And I'll say another thing: it's not 704 00:35:15,239 --> 00:35:19,600 Speaker 5: just the big generalists. It is the niche providers who 705 00:35:19,600 --> 00:35:23,279 Speaker 5: are experts, like you guys, and do an amazing job 706 00:35:23,400 --> 00:35:26,359 Speaker 5: in a particular area that people really care about. Those 707 00:35:26,360 --> 00:35:28,080 Speaker 5: are going to survive. And I'll say 708 00:35:28,080 --> 00:35:30,960 Speaker 5: one more thing, which is, Jeff Bezos was an investor 709 00:35:31,640 --> 00:35:33,759 Speaker 5: in Business Insider. It was incredibly helpful to talk to him, 710 00:35:33,840 --> 00:35:36,680 Speaker 5: very smart. I realize now he's become kind of not 711 00:35:37,200 --> 00:35:39,880 Speaker 5: applauded in the journalism industry. But one of the 712 00:35:39,960 --> 00:35:42,560 Speaker 5: things that he liked to say is, everybody wants to 713 00:35:42,600 --> 00:35:45,879 Speaker 5: know what's changing and what it means. It's actually more 714 00:35:45,960 --> 00:35:48,719 Speaker 5: useful to step back and say, what is going to 715 00:35:48,800 --> 00:35:51,880 Speaker 5: stay the same.
What is going to stay the same 716 00:35:52,239 --> 00:35:55,720 Speaker 5: in media is that we're always 717 00:35:55,719 --> 00:35:58,440 Speaker 5: going to want to know what's happening and what it means, 718 00:35:58,680 --> 00:36:01,239 Speaker 5: and we're always going to want to be entertained, and 719 00:36:01,320 --> 00:36:06,719 Speaker 5: that's what media does. It's tough, but until, actually, there 720 00:36:06,760 --> 00:36:09,320 Speaker 5: are no more humans, we are going to want media 721 00:36:09,400 --> 00:36:10,040 Speaker 5: to deliver that. 722 00:36:10,120 --> 00:36:10,799 Speaker 1: I agree with that. 723 00:36:11,040 --> 00:36:12,759 Speaker 3: You know, there are some really good stories, by the way, 724 00:36:12,800 --> 00:36:15,239 Speaker 3: about, like, the New York Post and the Daily News, 725 00:36:15,280 --> 00:36:18,160 Speaker 3: like the crime reporters out in the field, 726 00:36:18,200 --> 00:36:19,799 Speaker 3: and then they would call the rewrite desk, and then 727 00:36:19,840 --> 00:36:22,440 Speaker 3: they would, like, translate it into, like, tabloid speak. 728 00:36:22,760 --> 00:36:24,719 Speaker 3: That's cool. That'd be cool. 729 00:36:24,800 --> 00:36:27,840 Speaker 4: I remember sending notes on a BlackBerry in, like, the 730 00:36:27,880 --> 00:36:30,120 Speaker 4: early two thousands, and then someone in the newsroom would 731 00:36:30,280 --> 00:36:32,160 Speaker 4: translate it into readable words. 732 00:36:32,600 --> 00:36:34,760 Speaker 3: With a few minutes left, I want to talk about, 733 00:36:34,920 --> 00:36:37,200 Speaker 3: you mentioned Jeff Bezos, I want to talk about the 734 00:36:37,239 --> 00:36:41,520 Speaker 3: business environment right now. When President Trump won, there were 735 00:36:41,560 --> 00:36:44,200 Speaker 3: a lot of headlines about, you know, these CEOs. They'd 736 00:36:44,200 --> 00:36:46,279 Speaker 3: be like, well, now we have free speech, and now we 737 00:36:46,320 --> 00:36:48,840 Speaker 3: can speak our minds, and, you know, none of this 738 00:36:48,960 --> 00:36:51,000 Speaker 3: woke stuff where we have to, like, be careful about 739 00:36:51,000 --> 00:36:53,799 Speaker 3: what we say. And, like, I get the impression that 740 00:36:54,280 --> 00:36:56,960 Speaker 3: it is literally the exact opposite, and that CEOs have 741 00:36:57,040 --> 00:37:00,160 Speaker 3: never been more scared, they've never been more timid, and 742 00:37:00,200 --> 00:37:02,440 Speaker 3: they, like, are obsequious to him, and they, like, do 743 00:37:02,520 --> 00:37:04,080 Speaker 3: these big pilgrimages and stuff. 744 00:37:04,360 --> 00:37:06,240 Speaker 1: And meanwhile, we know they hate the tariffs. 745 00:37:06,280 --> 00:37:08,680 Speaker 3: We know they hate all the stuff going on. What 746 00:37:08,719 --> 00:37:11,319 Speaker 3: do you, like, make of that? Because he's not even 747 00:37:11,360 --> 00:37:13,520 Speaker 3: that popular anymore. He's actually, like, arguably one of 748 00:37:13,560 --> 00:37:16,040 Speaker 3: the least popular presidents ever. And yet I don't get 749 00:37:16,080 --> 00:37:18,000 Speaker 3: the impression that there's been much of a change in 750 00:37:18,000 --> 00:37:21,200 Speaker 3: the sort of, like, world of CEOs and rich guys 751 00:37:21,200 --> 00:37:23,440 Speaker 3: and billionaires about speaking their minds.
752 00:37:23,800 --> 00:37:27,160 Speaker 5: Yes, the whole free speech thing was always, I want 753 00:37:27,239 --> 00:37:29,680 Speaker 5: more speech for what I believe in, so I can say 754 00:37:29,719 --> 00:37:32,080 Speaker 5: whatever I want, you can't say what you want. So 755 00:37:32,280 --> 00:37:36,920 Speaker 5: it was kind of ludicrous. I think that, in general, the 756 00:37:36,960 --> 00:37:40,759 Speaker 5: big companies and CEOs, especially after President Trump won the 757 00:37:40,840 --> 00:37:45,680 Speaker 5: last election, clearly said, okay, I believe in democracy, 758 00:37:46,160 --> 00:37:49,200 Speaker 5: I'm going to be pragmatic and take care of my 759 00:37:49,320 --> 00:37:54,400 Speaker 5: team and my company. And also, most public company CEOs, 760 00:37:55,400 --> 00:37:58,759 Speaker 5: they are easily removed, and they know that. And your 761 00:37:58,840 --> 00:38:01,359 Speaker 5: job is to actually take care of your company and shareholders. 762 00:38:01,400 --> 00:38:04,600 Speaker 5: It is not to be a free speech crusader and 763 00:38:04,719 --> 00:38:07,640 Speaker 5: stand up for your personal thing that you believe in. 764 00:38:08,080 --> 00:38:10,200 Speaker 5: And so I think there was a lot of pragmatism. 765 00:38:10,719 --> 00:38:13,360 Speaker 5: I think that may start to change here. So I 766 00:38:13,840 --> 00:38:16,360 Speaker 5: feel like things have happened in the last few months. 767 00:38:16,400 --> 00:38:19,080 Speaker 5: We may be starting to see a change, but I 768 00:38:19,080 --> 00:38:21,640 Speaker 5: think it is mainly pragmatism. And you can look at 769 00:38:21,680 --> 00:38:24,200 Speaker 5: that from the outside and say, that's outrageous, but I 770 00:38:24,239 --> 00:38:26,359 Speaker 5: think they were looking out for their companies and their 771 00:38:26,400 --> 00:38:27,680 Speaker 5: teams and their shareholders. 772 00:38:28,000 --> 00:38:30,440 Speaker 4: I want to go back to media navel-gazing, 773 00:38:30,480 --> 00:38:33,200 Speaker 4: because I've got more questions, and I actually hesitate to 774 00:38:33,320 --> 00:38:35,759 Speaker 4: ask this one, because I fear that your response is 775 00:38:35,800 --> 00:38:37,359 Speaker 4: going to get back to asking Joe and I to 776 00:38:37,440 --> 00:38:38,640 Speaker 4: justify our own salaries. 777 00:38:38,840 --> 00:38:40,600 Speaker 5: Ah, I'm telling you why you're not going to 778 00:38:40,600 --> 00:38:42,640 Speaker 5: be replaced by NotebookLM. 779 00:38:42,840 --> 00:38:46,040 Speaker 4: Okay, I'm going to ask you to justify a particular 780 00:38:46,040 --> 00:38:50,000 Speaker 4: paycheck, kind of. Okay, when you sold Business Insider to 781 00:38:50,080 --> 00:38:50,880 Speaker 4: Axel Springer, what year was that? 782 00:38:50,920 --> 00:38:53,600 Speaker 5: Twenty fifteen. 783 00:38:53,640 --> 00:38:57,280 Speaker 4: Twenty fifteen. Okay, if you were selling Business Insider to Axel Springer now, 784 00:38:58,080 --> 00:39:01,600 Speaker 4: how different would your pitch to them be in twenty 785 00:39:01,640 --> 00:39:04,120 Speaker 4: twenty six versus twenty fifteen? 786 00:39:05,160 --> 00:39:09,120 Speaker 5: I would say that I think one of Business Insider's advantages, 787 00:39:09,320 --> 00:39:11,799 Speaker 5: and Joe was a big part of this, was we 788 00:39:11,800 --> 00:39:16,520 Speaker 5: were not trying to defend the old way that journalism 789 00:39:16,600 --> 00:39:19,359 Speaker 5: was done in print and television.
We were trying to 790 00:39:19,440 --> 00:39:23,840 Speaker 5: bring rigorous journalism to a new medium where people wanted 791 00:39:23,880 --> 00:39:28,160 Speaker 5: different things, where distribution was very different. So we were 792 00:39:28,600 --> 00:39:33,799 Speaker 5: experimenters and innovators, and for every idea that worked, we 793 00:39:33,880 --> 00:39:36,959 Speaker 5: had twenty five ideas that were dumb and didn't work. 794 00:39:37,040 --> 00:39:40,120 Speaker 5: I take responsibility for all of them. That was the 795 00:39:40,160 --> 00:39:42,560 Speaker 5: only way to figure out what worked. And one of 796 00:39:42,600 --> 00:39:45,480 Speaker 5: the things we noticed relatively early is, hey, we can 797 00:39:45,480 --> 00:39:48,399 Speaker 5: figure out what people like, and we want to serve them. 798 00:39:48,560 --> 00:39:50,160 Speaker 5: And we did a good job at that, and I 799 00:39:50,200 --> 00:39:52,080 Speaker 5: think that was the thing that made 800 00:39:52,080 --> 00:39:56,280 Speaker 5: Business Insider successful. I think now, for all companies, 801 00:39:56,440 --> 00:39:59,399 Speaker 5: sticking your head in the sand and wishing AI would 802 00:39:59,400 --> 00:40:02,879 Speaker 5: go away is not a great strategy. So what would 803 00:40:02,960 --> 00:40:05,279 Speaker 5: we do? What can we do? And again, one of the 804 00:40:05,320 --> 00:40:08,440 Speaker 5: things that's changed, as Google and Facebook and social have 805 00:40:09,719 --> 00:40:12,480 Speaker 5: totally dried up as sources of distribution, is we are 806 00:40:12,560 --> 00:40:17,680 Speaker 5: back in this direct subscriber world, just like magazines and 807 00:40:17,719 --> 00:40:20,680 Speaker 5: newspapers way back in the day. So that is serving 808 00:40:20,880 --> 00:40:23,360 Speaker 5: your subscribers. This is what you guys do. You have 809 00:40:23,440 --> 00:40:26,360 Speaker 5: this incredibly passionate audience. They know you, they want to 810 00:40:26,360 --> 00:40:28,600 Speaker 5: hear from you every week, and so that's what media 811 00:40:28,640 --> 00:40:30,279 Speaker 5: companies need to do. So what I would say 812 00:40:30,480 --> 00:40:32,919 Speaker 5: is, we are as focused as we have ever been 813 00:40:33,160 --> 00:40:37,600 Speaker 5: on serving our readers, viewers, people who love us. 814 00:40:38,440 --> 00:40:42,960 Speaker 3: So you are now in very subscriber-oriented media. 815 00:40:43,080 --> 00:40:45,880 Speaker 1: You have a newsletter, you have a podcast, et cetera. 816 00:40:46,280 --> 00:40:49,879 Speaker 3: Other than realizing, like, actually, you know, editors are good 817 00:40:49,960 --> 00:40:52,279 Speaker 3: and so forth, what else have you learned, or what's 818 00:40:52,320 --> 00:40:56,880 Speaker 3: been notable about your own personal journey into subscriber-based media? 819 00:40:56,920 --> 00:40:58,200 Speaker 1: What has surprised you? 820 00:40:58,239 --> 00:41:00,279 Speaker 5: So I'll give you one. A lot 821 00:41:00,320 --> 00:41:02,440 Speaker 5: of what I've been doing the last year or two 822 00:41:02,719 --> 00:41:06,640 Speaker 5: is writing novels. This is something I wanted to do forever. 823 00:41:06,760 --> 00:41:09,520 Speaker 5: It's like, okay, now's the time. And I talked to 824 00:41:09,560 --> 00:41:11,400 Speaker 5: another friend of mine who used to be a screenwriter, 825 00:41:11,480 --> 00:41:14,279 Speaker 5: and I confessed that to him.
He's like, dude, that 826 00:41:14,480 --> 00:41:19,920 Speaker 5: is crazy. Just go to Claude. Tell Claude what you want, 827 00:41:20,520 --> 00:41:22,839 Speaker 5: and Claude will spit out three hundred and twenty five 828 00:41:22,880 --> 00:41:26,600 Speaker 5: pages, and it'll be pretty good. And I don't 829 00:41:26,680 --> 00:41:29,200 Speaker 5: doubt it, and it'll take twenty minutes. And so I 830 00:41:29,239 --> 00:41:31,960 Speaker 5: feel like the world's biggest idiot from a business perspective. 831 00:41:32,360 --> 00:41:34,839 Speaker 5: On the other hand, I did want to do it. 832 00:41:35,280 --> 00:41:38,359 Speaker 5: I'm glad I did it. It's really funny. So what's 833 00:41:38,400 --> 00:41:40,879 Speaker 5: coming out next? It's called The Upgrade. It's 834 00:41:40,920 --> 00:41:43,440 Speaker 5: about a tech billionaire who wants to use AI to 835 00:41:43,480 --> 00:41:44,320 Speaker 5: take over the world. 836 00:41:44,400 --> 00:41:45,560 Speaker 4: That doesn't sound like fiction. 837 00:41:45,680 --> 00:41:48,080 Speaker 5: It's nonfiction. But it'll be out, it's going to be 838 00:41:48,080 --> 00:41:51,400 Speaker 5: out in beta. It's a new form of books, a 839 00:41:51,440 --> 00:41:53,640 Speaker 5: screen test. Yes, it's been in alpha for a while. 840 00:41:53,640 --> 00:41:56,279 Speaker 5: A lot of people have read it. It's different. No, it's 841 00:41:56,320 --> 00:41:58,120 Speaker 5: a book book, but there's gonna be a beta period. But 842 00:41:58,440 --> 00:42:00,560 Speaker 5: I invite you to read it. Thank you. If you 843 00:42:00,600 --> 00:42:03,000 Speaker 5: would, send me what you think. And then when I 844 00:42:03,080 --> 00:42:06,080 Speaker 5: publish it for real, it may well have benefited from 845 00:42:06,120 --> 00:42:09,280 Speaker 5: your reading anyway. So I feel like a complete idiot 846 00:42:09,360 --> 00:42:11,600 Speaker 5: for doing it. But what I will say is, for 847 00:42:11,719 --> 00:42:16,440 Speaker 5: me as an analyst and writer and speaker, part of 848 00:42:16,440 --> 00:42:19,400 Speaker 5: the joy is the process. Yeah, and I learned so 849 00:42:19,560 --> 00:42:22,480 Speaker 5: much. And this is what I don't understand about how 850 00:42:22,520 --> 00:42:24,279 Speaker 5: research is going to be conducted. One of the things 851 00:42:24,320 --> 00:42:26,719 Speaker 5: that I did was, hey, write me a research report about 852 00:42:26,960 --> 00:42:29,520 Speaker 5: X. Six minutes later it comes back. It is better than 853 00:42:29,520 --> 00:42:31,360 Speaker 5: a research report that I would have written as a 854 00:42:31,880 --> 00:42:34,000 Speaker 5: young analyst when I didn't know anything, and it took 855 00:42:34,040 --> 00:42:36,920 Speaker 5: six minutes. So I said, okay, I'm not going to 856 00:42:36,960 --> 00:42:40,480 Speaker 5: write any research reports anymore. But when it took me 857 00:42:40,680 --> 00:42:43,279 Speaker 5: a month to do that as a twenty five year old, 858 00:42:43,800 --> 00:42:46,960 Speaker 5: I learned a lot through that process. And so now 859 00:42:47,000 --> 00:42:49,320 Speaker 5: I don't know how you learn, whether it'll be by reading 860 00:42:49,360 --> 00:42:51,160 Speaker 5: the report. Like, I don't know. That's the thing I'm really 861 00:42:51,160 --> 00:42:53,560 Speaker 5: struggling with. But I will say, for me, the 862 00:42:53,640 --> 00:42:56,359 Speaker 5: process has been interesting. I hope you like the book.
863 00:42:56,400 --> 00:42:57,960 Speaker 5: If you like it, maybe I'll do another one. That 864 00:42:57,960 --> 00:43:00,880 Speaker 5: would be great. But what I would say on 865 00:43:00,960 --> 00:43:03,359 Speaker 5: this whole question of what are humans going to do: 866 00:43:04,200 --> 00:43:08,240 Speaker 5: you're a chess player. It has been forty years since 867 00:43:08,280 --> 00:43:12,880 Speaker 5: our phones could clobber us in chess. Chess is bigger 868 00:43:12,920 --> 00:43:16,520 Speaker 5: than it has ever been, including in person. So my hope 869 00:43:16,600 --> 00:43:20,440 Speaker 5: is we have AI research and write stuff that we 870 00:43:20,480 --> 00:43:23,400 Speaker 5: don't really want to do, and we save our writing 871 00:43:23,600 --> 00:43:26,840 Speaker 5: and orating and everything else for the stuff that we 872 00:43:26,880 --> 00:43:27,880 Speaker 5: are really passionate about. 873 00:43:28,400 --> 00:43:29,160 Speaker 1: Henry Blodgett. 874 00:43:29,360 --> 00:43:31,480 Speaker 3: Thank you so much, and thanks for joining us here 875 00:43:31,480 --> 00:43:32,680 Speaker 3: at On Air Fest. 876 00:43:32,719 --> 00:43:33,439 Speaker 1: That was a lot of fun. 877 00:43:33,760 --> 00:43:50,040 Speaker 4: Thank you so much, Henry. This has been another episode 878 00:43:50,080 --> 00:43:52,600 Speaker 4: of the Odd Lots podcast. I'm Tracy Alloway. You can 879 00:43:52,640 --> 00:43:54,880 Speaker 4: follow me at Tracy Alloway. 880 00:43:54,560 --> 00:43:57,160 Speaker 3: And I'm Joe Wisenthal. You can follow me at The Stalwart. 881 00:43:57,280 --> 00:44:00,759 Speaker 3: Follow our producers Carmen Rodriguez at Carman Armann, Dashiell 882 00:44:00,760 --> 00:44:03,719 Speaker 3: Bennett at Dashbot, and Kail Brooks at Kail Brooks. And for 883 00:44:03,800 --> 00:44:06,680 Speaker 3: more Odd Lots content, go to Bloomberg dot com slash oddlots. 884 00:44:06,719 --> 00:44:09,200 Speaker 3: We have a daily newsletter and all of our episodes, and 885 00:44:09,239 --> 00:44:11,200 Speaker 3: you can chat about all of these things twenty four 886 00:44:11,200 --> 00:44:14,520 Speaker 3: seven in our Discord: discord dot gg slash oddlots. 887 00:44:14,520 --> 00:44:17,560 Speaker 4: And if you enjoy Odd Lots, then please leave 888 00:44:17,640 --> 00:44:21,040 Speaker 4: us a positive review on your favorite podcast platform. And remember, 889 00:44:21,120 --> 00:44:23,480 Speaker 4: if you are a Bloomberg subscriber, you can listen to 890 00:44:23,560 --> 00:44:26,520 Speaker 4: all of our episodes absolutely ad free. All you need 891 00:44:26,560 --> 00:44:29,360 Speaker 4: to do is find the Bloomberg channel on Apple Podcasts 892 00:44:29,360 --> 00:44:32,000 Speaker 4: and follow the instructions there. Thanks for listening.