Speaker 1: Bloomberg Audio Studios. Podcasts. Radio. News.

Speaker 2: Hello and welcome to another episode of the Odd Lots podcast. I'm Joe Weisenthal.

Speaker 3: And I'm Tracy Alloway.

Speaker 2: You know, not a novel observation here, but I do think that if, in early twenty twenty two or late twenty twenty one, someone had revealed to you all of the amazing things that we could do with AI these days, I think you would have expected that either the broader economy or society would be more different than it has been. Like, by and large, you know, I think most people's jobs are done roughly the same way. Society still sort of seems to operate, although maybe a little bit worse every day, I don't know. But it's mind-blowing technology, and yet, by and large, it hasn't had the economic disruption that I think many people would have guessed.

Speaker 3: Maybe I'm cynical on the subject, but I always say, never underestimate the human capacity for stasis, I guess, and making things much more difficult than they actually need to be, and putting up, you know, bureaucratic barriers, regulatory barriers, and things like that. So it's not that surprising to me. But it is true, you had a lot of economists out there who thought there was going to be this massive productivity boost, right?

Speaker 2: Well, certainly tech people. Like, I don't know, economists, you know, they're like, oh, we're going to have a productivity boom, we're raising our estimates from two percent to two and a quarter percent, and everything relative to that. And then, you know, we talked to Cathie Wood one time, and I think she, what did she predict, twenty percent real GDP growth for twenty years?
Or you certainly have these people in Silicon Valley: deflationary boom, post-scarcity any minute now. Some real gaps, I would say, between how a lot of economists think about some of these things, and the numbers that economists would be comfortable using, versus maybe literally everyone else.

Speaker 3: Like Cathie, and certainly many others. Yeah, you know what I think about is how different would your career and my career have been had AI existed in, like, two thousand and eight, two thousand and six, when we were starting to blog?

Speaker 2: Basically, it's a really good question, and I don't know the answer, because part of me thinks, well, you know, that whole path that, like, defined my career would not have been there, would not have existed. Or, no, maybe. But on the other hand, maybe in two thousand and six I would have been, just like I was super early into blogging, super early into experimenting with AI news, and it would look different, but maybe I would have gone down some different road.

Speaker 3: Different but the same.

Speaker 2: Yeah, different but the same, you know. Like, I think it's pretty hard to tell.

Speaker 3: One thing I know is, like, back then we were writing to optimize Google results, right? That was basically it. Or, you know, like, I know social media was disseminating stuff as well, but, like, Google was still the main platform for a lot of stuff. I'm sure if we were doing it now, we would be optimizing for ChatGPT or Perplexity or someone like that to actually pick up the content. I mean, I think that's the audience.

Speaker 2: Yeah, it is. And I think, like, especially any commercial publisher, right, that's trying to do it at scale. I think, like, a sort of niche voice expert could still have, like, their audience that comes to them directly, of course. But at scale, for sure, every publisher is trying to figure this out.
Speaker 2: Anyway, we really do have the perfect guest, someone we should have had on the podcast years and years ago. It's almost surprising that this is just the first time we've had him on. It is crazy, it is crazy. But it is someone who I think is, like, at the intersection of everything that we're talking about, because he's an economist, he knows the tech really well, the tech people know him, the tech people love him. He may even be one of the preferred economists for this; I feel he's, like, the economist that all the people in the AI world want to get his take. A longtime blogger, someone, you know, like both of us, sort of... we gave up on blogging, although we have our newsletter, it's close enough, but it's different than blogging. Someone who's stuck with the medium for a long time, one of the true original econ bloggers, whom I've been reading for over twenty years. We're going to be speaking with Tyler Cowen. He's the host of the Conversations with Tyler podcast. He also, obviously, is one of the two bloggers at the famous Marginal Revolution blog, an economics professor at GMU, an appreciator of ethnic foods all around the DC, Northern Virginia, Maryland area, someone known on the internet for a long time. Tyler, thank you so much for coming on Odd Lots. Really thrilled to have you here.

Speaker 4: Hello, happy to be here.

Speaker 2: Amazing. What do you think about my initial assessment? How fair do you think it is that, like, had you known in twenty twenty one how powerful these tools would be, maybe we'd be a bit surprised that, by and large, business and society seem to more or less run the same?

Speaker 4: I'm not surprised at all. So what I see right now is people using AI as an add-on to their pre-existing work routines. Oh, you need to write a memo, you ask AI how to do it. You need to write a column, you ask AI to proofread it or fact-check it. And that works great. But those are marginal gains.
What we really need to see for a major impact is new organizations built around AI, and those will be startups. They will come only slowly. It will take twenty or more years before they really transform the economy. And in the meantime, it's a whole bunch of add-ons, which are fun and fine, but that's why I think it's slow.

Speaker 2: What can you tell us about the history of technology, such that legacy organizations that existed prior to the invention of a potentially revolutionary new technology have a hard time massively changing their workflows?

Speaker 4: Well, you can take even very simple examples. So Toyota starts competing with General Motors in the nineteen seventies. General Motors is paralyzed. It cannot really come back and adopt the new and superior Toyota methods, and those are not really that difficult compared to, say, AI. So there are just plenty, plenty of examples. Old mainstream media could not cope very well with the Internet. There are exceptions, like The New York Times. The Odd Lots podcast would be another, thank you. But failing to adapt is the norm. So we're seeing it again with AI. And again, you need a complete turnover of who and what is doing business for it to really matter on a big scale.

Speaker 3: If you need that complete turnover, and you need some time for AI to become fully embedded in a business model, or for a business model to form around AI, what industry or what part of the economy would you expect it, I guess, to show up first in, in that sort of revolutionary way?

Speaker 4: Well, we have obvious data on this, and it's programming. You will hear people who do programming claim that, say, eighty percent of the work is now done by AIs. I suspect that's an overstatement, but there's no doubt at all that there's simply a lot of programming already done by AIs. When you have low fixed costs, a competitive sector, immediate feedback, you know, the revenue has to flow: programming,
and also New York City finance. There have been quants in finance for a long time. Those quants are now, you could say, more AI-equipped than they used to be, and those areas already are being revolutionized.

Speaker 2: I've heard, and we've gotten some pitches to do episodes, which we should do at some point, but I've heard about, like, some law firms that are being built as new AI law firms, where there are lawyers, et cetera, but, from the very ground up, the idea is perhaps there is some way to just get way more productivity if they start from the very beginning with some combination of lawyers plus AI models. It seems like that could be the kind of thing where maybe the legacy law firms are seeing some productivity gains from AI, there's probably some evidence you could find, but that a new one with a totally different approach could deliver that big productivity boost that actually ends up changing the industry.

Speaker 4: It's already the case that, say, mid-tier associates are much less needed. But there's one big problem with law in particular, and that is the way large language models work now. You have to send your queries somewhere else. You can't just own and control and hold the whole thing on your hard drive. Now, I think within a few years' time that will be very different. But until then, major law firms are extremely skittish about just typing in their questions and sending them, you know, to San Francisco. I don't actually think there's a risk, but when you think about how fiduciary responsibility works, they just don't want to do it.

Speaker 3: Talk more about, I guess, privacy concerns and regulation, because this seems to be an area where, if you are in a heavily regulated industry, or if you're just in an industry that tends to be full of paranoid people, like lawyers, it does seem like there's going to be a natural tendency to be very, very cautious.
When it comes to sharing data with AI, you're going to be worried about actual data ownership, the queries that you're sending to San Francisco, as you say. Are those industries just inevitably going to be slow to adapt?

Speaker 4: They'll be slow to adapt, again, until the point where they just control their own model and they hold it within the firm and no one else really can get at it. So what you need is cheaper models, where a law firm can afford to have its own model. And I think that's a few years away. It's not a very long time away, but it won't come in six months. Sam Altman, I just did a podcast with him, and he said a privacy problem is that AI queries are subject to subpoena, and he thinks they should have as much protection as, say, your conversation with your lawyer or your doctor or your therapist. I think that's a good idea, but it hasn't happened yet, and until it happens, or you get the whole thing on your own hard drive, progress in law is going to be slow. But once the progress comes, that's one of the areas where I think AI has the most promise. It's just very, very good at mastering a large corpus of text and organizing it for you.

Speaker 2: It is interesting, isn't it? Where it's like, okay, if you are my lawyer, you and I could have a conversation and it would be, it would not be, uh...

Speaker 3: It'd be privileged.

Speaker 2: Yeah, it would be privileged. Or, you know, one can have conversations with anyone, and as long as you're doing it on the phone, or if you're doing it person to person, it's much more difficult to get that, you know, in discovery. Or, I always think about this with, you know, public records of public employees, that sometimes you can get their emails through FOIAs, et cetera, but you can't get the content of conversations.
It does feel I hadn't 210 00:11:13,040 --> 00:11:16,520 Speaker 2: really thought about this dynamic though, that when you're using 211 00:11:16,559 --> 00:11:19,200 Speaker 2: an AI it is sort of like a conversation and 212 00:11:19,280 --> 00:11:22,880 Speaker 2: yet sort of from an evidentiary basis, it would be 213 00:11:23,000 --> 00:11:24,080 Speaker 2: much more like an email. 214 00:11:25,200 --> 00:11:27,320 Speaker 4: I think when it comes to medical issues, there are 215 00:11:27,360 --> 00:11:30,560 Speaker 4: many more people willing to share their data. Not everyone 216 00:11:30,640 --> 00:11:34,040 Speaker 4: by any means. Some medical conditions are secretor people just 217 00:11:34,080 --> 00:11:36,960 Speaker 4: don't want others to know. But I see many, many 218 00:11:37,000 --> 00:11:39,240 Speaker 4: people I know typing in all kinds of things about 219 00:11:39,240 --> 00:11:42,480 Speaker 4: their medical history to say GPT five and getting what 220 00:11:42,559 --> 00:11:45,360 Speaker 4: are on the whole very good answers. It's like medical 221 00:11:45,360 --> 00:11:49,199 Speaker 4: diagnosis for free spreading now to the whole world. A 222 00:11:49,240 --> 00:11:51,840 Speaker 4: lot of countries where people just don't even have access 223 00:11:51,880 --> 00:11:54,920 Speaker 4: to good doctors at all, and I think that will 224 00:11:55,080 --> 00:11:58,400 Speaker 4: be important more quickly than the law innovations. 225 00:11:59,600 --> 00:12:01,719 Speaker 3: Just as a thought experiment, what does all of this 226 00:12:01,880 --> 00:12:04,120 Speaker 3: mean for insurers? Because I kind of think, you know, 227 00:12:04,160 --> 00:12:05,920 Speaker 3: I think about a bunch of people typing in their 228 00:12:05,960 --> 00:12:09,600 Speaker 3: medical information. I think about basically the explosion in data 229 00:12:09,640 --> 00:12:11,959 Speaker 3: that we have nowadays, and it seems like the insurance 230 00:12:12,000 --> 00:12:15,920 Speaker 3: industry would be one place that would really benefit from 231 00:12:15,920 --> 00:12:19,480 Speaker 3: all this trove of additional information if they could access 232 00:12:19,520 --> 00:12:21,120 Speaker 3: it well. 233 00:12:21,120 --> 00:12:23,480 Speaker 4: This is one of my worries about AI in general. 234 00:12:23,679 --> 00:12:27,240 Speaker 4: I'm quite positive on what's happening, But as insurers get 235 00:12:27,280 --> 00:12:30,720 Speaker 4: better and better information on their customers, this is just 236 00:12:30,760 --> 00:12:32,880 Speaker 4: through big data more generally, it doesn't have to be 237 00:12:32,960 --> 00:12:36,520 Speaker 4: current large language models. They know exactly how to write 238 00:12:36,520 --> 00:12:38,920 Speaker 4: the risk and how to price the premium, and in 239 00:12:38,960 --> 00:12:41,880 Speaker 4: a sense for the buyer, it's not insurance anymore. So, 240 00:12:41,920 --> 00:12:43,760 Speaker 4: if we know your house is going to burn down 241 00:12:43,800 --> 00:12:46,200 Speaker 4: with high probability and you have to pay the super 242 00:12:46,280 --> 00:12:49,240 Speaker 4: high premium, you don't really have the benefits of the insurance. 243 00:12:49,600 --> 00:12:54,000 Speaker 4: So some insurance markets might unravel if through big data 244 00:12:54,480 --> 00:12:57,120 Speaker 4: the insurers learn too much about what's likely to happen. 
Speaker 2: Economists seem to be very consistent about the effects of technology on labor demand, which is that in the end it washes out, right? Okay, there's a disruption for some people, but I'm gonna save money because I use the AI. But that means I have more spending power, and then I'm gonna buy something else that I wouldn't have bought had I still been paying those wages, and then that'll create demand for labor elsewhere. And so, in the end, the idea that you could really have tech-driven unemployment at scale that is not transitory or not temporary, a lot of economists seem to be intuitively skeptical of this idea, whereas you have people in the AI field saying fifty percent of the people aren't gonna have jobs, we need UBI, otherwise there's gonna be a permanent underclass. Could there be something different about AI, such that it doesn't have the same labor market effects that past technologies have had?

Speaker 4: I would say you understand me well. So, the energy sector; there are going to be a lot of new jobs taking care of older people. I think as AI produces more potential medical innovations, we'll need to test them, so the biomedical sector, testing, clinical trials, there'll be a lot of new jobs. I'm not worried about mass unemployment, and most economists are not, and I agree with their perspectives, which I think you outlined pretty clearly.

Speaker 2: What do you say, though? Because I have a feeling that when you're out in San Francisco, they don't see it that way, and they talk a lot, and some of them are more into, you know, UBI talk and permanent underclass talk and all of this stuff. Do they see your perspective when you make this case? What do they say, or what do a lot of them seem to be missing about the logic that you spell out?

Speaker 4: Well, I think more and more they're coming around to the economists' point of view. So Andrej Karpathy, who was at OpenAI, you know, in the most important years, he just did a podcast saying he thinks adjustment will be slow, things will be fine, we'll grow at two point something percent, there won't be mass unemployment. And you wouldn't have heard that, say, two years ago. But I think as people see the models rolling out, and, as you mentioned, the real-world impact, it is stretched out in time, right? It's not all immediate. Earlier on, people had more the sense that AI was a kind of god box, that you just talk to it and it can magically do anything and convert that into results in the real world. But if you think about your actual job, even if it's a highly intellectual job, so much of what you do is the interaction between your intellect, your physical presence, your interactions with others, your travel, many other things. And until we get to some far, far-off world where the robots are perfect copies of you, which I don't think will ever come, jobs will be fine. But they will change a lot, and I'm actually worried about who will be the biggest losers. I think poor people will do great, the very wealthy will do well, but people who are sort of upper upper middle class, who had this automatic ticket to a law or consulting job that assured they would be upper upper middle class for the rest of their lives, I think a lot of that is going away already.

Speaker 3: I think also no one would have expected plastic surgery, I guess, to be a beneficiary of the AI revolution. But if you think that what's going to matter in the future is, like, your personal presence and your network of social contacts, then I guess we should all be working on our looks. Looksmaxxing, yeah. Absolutely, right, charisma. Okay, noted, everyone work on their respective charisma. One thing I was wondering is the impact of AI on public finances. I am not very good at tax policy.
Joe knows this because I've complained about taxes to him repeatedly. But I'm thinking, if you're thinking about where the value-add of AI actually shows up in the economy, so, you know, presumably you get a productivity boost, we're not entirely sure how much that's going to be, but where does that additional output actually show up in terms of revenue for governments? How is that collected? And how would you expect the distribution to vary across the world?

Speaker 4: Well, I think in the United States, medium term, there'll just be much, much more healthcare. There will be new drugs, new medical devices. We'll have to test all these things, we'll have to produce them. It was already the case that those sectors were growing, but that growth will be accelerated. So that's where I think there'll be the biggest difference longer term, and people will live longer, because we'll fix, at least partially, various diseases and maladies. So if you live to be ninety-four, across a lifetime you spend way more on healthcare than if you live to be seventy-seven, and that's yet further growth for the healthcare sector. But some things, like medical diagnosis, that's already very cheap. Like, you know, a good large language model probably outperforms your current doctor, at least if you type in what's wrong with you properly.

Speaker 3: But does that additional productivity, or the output generated by AI, does that actually show up as additional taxation for the government?

Speaker 4: Well, the healthcare sector generates an enormous amount of tax revenue. I do think we'll have some sectors that maybe just become free, in the same way that Wikipedia is free. So I could imagine, say, ten or twenty percent of the music sector is music you create at home using your own AI, and it's a customized song for you, and maybe you paid a subscription for the service.
But rather than spending more money on Spotify or a streaming service, you just build the music, and that's a partial substitute for some human-created music. I don't think human-created music will go away at all. People want to enjoy the human touch, the feeling that you're a fan of Taylor Swift or whatever. But there's going to be a lot of AI-generated music and art and many other areas, and some of it will be free. But that's not a problem from a revenue standpoint. So instead of spending money on, say, buying a picture, you create one digitally at home with your AI. You'll spend that money on something else.

Speaker 3: I've got to ask now, are you a Swiftie?

Speaker 4: No. It's not really for me, it's a little too predictable. So I have to say I'm not, but I'm glad other people are. Let's put it that way.

Speaker 2: Do you have a theory? Let's just talk about music. Do you have, like, a theory of the Swift phenomenon? Like, is there a reason? What is it? Because it's just so... I don't know. What's your meta take on Taylor, the Swifties, and culture and society?

Speaker 4: Well, it's super polished, and because of the Internet, the very biggest of celebrities can be much bigger than before, and someone will fill a few of those slots. But I think it's also how she presents herself. She has the guise of being attractive without feeling threatening to other women, and there's something all-American about her, and quite generic. She doesn't rule out the fandom of that many people. And she's the one who's filled that slot. She's been brilliant at managing her career, and seems to just stick at it, and has an incredible work ethic. People say the shows are incredible. As more and more of life goes online, who can give a good show? It's the charisma-and-looks point. Well, she seems pleasant. I've never been to one, but I hear plenty of reports.
And you put all that together, and she's, you know, the megastar of the music world.

Speaker 2: Do you think, like, culture... there's this popular idea that culture is sort of dead, and I do think that is probably overstated. But, you know, you look at movies, and people have observed this for a long time, it's just rehashes of franchises that have been around for thirty years. And I think if you look at Spotify's streaming, there's still this overwhelming tyranny of the boomer rock, et cetera, and this feeling that culture in many respects is just rehash, that it is very hard for new things to break out. I mean, Taylor Swift at this point is a decades-old phenomenon. Is that real in your view? Or is that just people in the pundit class who've been lazy and not discovering new things and not actually putting in the effort, because they're not young anymore and they're not going out, and they say nobody listens to new music?

Speaker 3: Joe, are you just talking about yourself?

Speaker 2: I'm trying, I'm doing a little introspection here. Is this just me, because I don't go to shows like I did when I was twenty, or has something changed?

Speaker 4: A lot of it is the pundit class. So, you look at movies. I think it's perfectly correct to say the most popular movies today are pretty dreadful, and it used to be the case that the most popular movies were, God, The Godfather and Star Wars. That's a big change. But if you look at moviemaking around the world, and in a given year list, say, the twenty-five best movies from all places, which may not even come to your multiplex, every year you have an incredible list. I don't think they're worse than the movies of earlier times. I do think mainstream Hollywood is much worse. So in many areas you just have quality moving more into niches.
Speaker 2: By the way, I just want to say, the first time I ever encountered your work, prior to even having stumbled across Marginal Revolution, was at the bookstore in Austin, finding a copy of In Praise of Commercial Culture. And I just feel like that book has held up so well, I mean, in the specific sense that there is so much mass culture these days, whether it's high-end Netflix TV shows, et cetera, whether it's A24 films, whether it's Beyoncé or Taylor Swift or some of these other big names who are simultaneously incredibly popular. And I get that you're not a Swiftie, neither am I anymore, but, yeah, I liked her country stuff, before she left country. But, like, where people take these popular culture things extremely seriously as art and don't dismiss these outputs as sort of being trash.

Speaker 4: Right now we're in a golden age for country and western music and also horror movies. Neither is really my taste in particular, but it's easy to see what has gotten worse, and, especially for critics, harder to see what has been getting better.

Speaker 3: That's my sweet spot. Country and horror is perfect for me. But just on the culture point, I mean, I think the lack-of-culture argument, the one I hear the most, is that it's a lack of shared culture.

Speaker 4: Right.

Speaker 3: So you do have some giant monoliths, like a Swift or a Beyoncé, that everyone knows about, and, you know, they do have these large audiences. But broadly, we're not all experiencing the same media that we used to, right? Like, no one is gathering around the TV to watch the finale of, you know, some show that airs, like, once a week and has been going on for five years.

Speaker 2: The exception is NFL football, which, I don't really get it, but yeah. But it's not cultural...

Speaker 3: Sports is outside of my experience.

Speaker 4: And the Super Bowl is cultural, right? It's a cultural event.
You care about the advertisements. But the biggest of YouTube stars, which again I would say is not personally my thing, they can have bigger audiences than those older TV shows, right?

Speaker 3: Well, so what I'm getting at is, it does feel like nowadays there's an ability, because of tech, to serve up very specific content and, like, niche content in streams. And the analogy that I like to use is, you know, everyone knows if you download Netflix for the first time, the first movie you watch is incredibly important, because whatever you watch, you know, if it's a rom-com, you're going to be served up Kate Hudson films for the rest of your life. Right? Like, the algorithm looks at what you're watching and then it serves up that additional content. What does that mean for society, the idea that you have people, you know, basically funneled into smaller and smaller streams, in some respects?

Speaker 4: Well, a lot of the Netflix algorithm just directs you to slop. True, people have always wanted slop. Like, people listened to music way back when, it was quite common, or they would just listen to the top forty, which in some years was very good but often was pretty terrible, even in the nineteen sixties. So what you can do today is basically watch, well, not quite any movie out there, but close. You can still buy DVDs and Blu-rays. You have access to more cinema today than you ever have. So people will sort themselves. And I think, from the point of view of cultural consumption, I don't think there's ever been a better time to be alive than right now. Now, a lot of people abuse that and go for slop. Of course that's sad, but it's hardly new.

Speaker 2: So, changing gears a little bit. Tracy and I write almost every day, because we have a daily newsletter that forces us to, and I really like having that, because I don't know if I would write every day if I did not have that obligation to deliver something in people's inbox that they pay for as part of their Bloomberg.com subscription. I love writing, but I don't know if I would do it every day if I didn't have this sort of requirement. I might just tweet. How do you... you've been blogging for over twenty years. How do you avoid the temptation to just fire off all your ideas via tweet and actually commit to the blog?

Speaker 4: I'm never tempted to do that. I like to think things out.

Speaker 2: I don't...

Speaker 4: I just do more when I write properly. I've actually blogged every single day for over twenty-two years.

Speaker 2: That's amazing. It's amazing. Most people gave up. So what's the difference?

Speaker 4: I don't feel it requires any discipline for me. The discipline is not writing more; like, I have to restrain myself. So I guess I'm just weird. I don't think I have any neat little trick or formula. It's one of these niches that you can do now that you couldn't do before, and I found my niche, as have the two of you.

Speaker 3: Here's a slightly different question, playing on that theme but going back to the intro. How different do you think your blogging career would have been had chatbots existed, you know, ten or twenty years ago, when you were starting out?

Speaker 4: I don't think we know yet. My intuition is that people still want to read human writers simply because they're human, and if the bot is as good as you, most of the world doesn't care. But that has not truly been tested yet. I think we'll see in the next two years, but that's what I'm expecting. I think in music there'll be plenty of AI music.
It might be, say, ten or twenty percent of the music sector, but listeners will still want that human-to-human connection.

Speaker 2: Do you think... you know, when Twitter came out, they called it a microblogging site, as if it were just blogging but on a shorter basis. But I think it's fundamentally different. You know, in the early, the glory days of blogs, which we'll be talking about forever when we're all very old people, how good it was, you know, I think there was this sort of spirit of, you know, liberal linking with each other and idea exploration, whereas Twitter strikes me as much more conflictual and one-upmanship and so forth. Do you think, I don't know if political is the right word, but, like, new communication paradigms sort of have their own terroir, so to speak, in terms of the impulse towards collaboration or conflict, et cetera? And does that change society?

Speaker 4: Yeah, I still like blogging, and I'm sad people have moved away from it. Twitter, to me, seems too meme-heavy, and meme-heavy media have more potential for racism, which of course is a big negative. And I see so many people who are driven crazy by being on Twitter. Whether it's because they're writing on it or reading it, I'm not sure, maybe both. I don't want to name names, but it's a lot of people, and I bet you see the same ones that I do.

Speaker 3: Also very sexist nowadays, I would just add, in unappreciated ways, in many ways. Speaking of sexism, the impact of AI on economics: talk about that. Economic institutions, you know, famous for modeling, spend a lot of time with numbers and things like that. Is that all just going to be replaced by AI?

Speaker 4: Not all. So, I think what human economists will do is put more and more time into gathering data and feeding it to the AIs. The returns to doing that will be very high.
But the actual econometrics, the statistics: humans will maybe set up part of the problem, but the hard, boring, routine work will be done by the machines, as to some extent was already the case, and this will be a way to make a lot of progress rather quickly.

Speaker 3: Do the actual economic statistics or data points that economists collect, do some of those need to be changed or thought of differently in light of the AI era?

Speaker 4: Well, I don't know. Eventually they will need to be. But I would say, in any period of radical change in history, your statistics are less useful. It's not that the people creating the statistics are making some mistake. You just cannot capture every way the world is changing, and index number comparisons require that the basket of goods be relatively close to constant, and at some point that doesn't hold anymore, and we'll be faced with that. We'll deal with it. I would say the current statistics we have, they're more underrated than overrated, so they're actually pretty good. I'll be glad when we get them back again.

Speaker 2: There's another thing. The tech people you talk to, they must think, like, GDP is terrible, doesn't capture all this stuff, all this value that we can't price. I'm sure you've had conversations explaining to many AI workers that GDP is not the worst statistic in the world. I want to go back, though, to that. So, you know, you mentioned, like, on Twitter, right, you say something, someone sends a meme, they dunk on you, they make fun of you, they whatever. Your chatbot won't do that. Like, if I'm having a conversation with ChatGPT, it's never going to respond to me with a meme sort of indicating one way or another that I'm...

Speaker 3: A moron. It'll learn your language, Joe.
593 00:31:16,200 --> 00:31:18,239 Speaker 2: Yeah. But if anything, it is too obsequious, right? I mean, 594 00:31:18,280 --> 00:31:21,840 Speaker 2: the issue is, like, the online world has become this 595 00:31:22,040 --> 00:31:25,120 Speaker 2: very, like, sort of competitive, conflictual world, and then I 596 00:31:25,160 --> 00:31:28,000 Speaker 2: go to the chatbot and my complaint is literally the opposite. 597 00:31:28,040 --> 00:31:30,840 Speaker 2: It doesn't challenge me enough. It's too obsequious. Every question 598 00:31:30,880 --> 00:31:33,200 Speaker 2: I ask, it's a great question. Sometimes I wish it 599 00:31:33,200 --> 00:31:35,320 Speaker 2: would call me a moron a little bit more. But 600 00:31:35,400 --> 00:31:38,160 Speaker 2: what does it change about the world? And we know 601 00:31:38,320 --> 00:31:40,840 Speaker 2: that, like, many people's brains have been broken by social 602 00:31:40,880 --> 00:31:43,800 Speaker 2: media, that probably has downstream effects on how our politics 603 00:31:43,800 --> 00:31:46,600 Speaker 2: operates these days. What does it do to the world 604 00:31:46,880 --> 00:31:51,320 Speaker 2: if we started inhabiting these chat environments where they're just 605 00:31:51,520 --> 00:31:54,080 Speaker 2: very sort of polite, and every time you say something, 606 00:31:54,160 --> 00:31:57,160 Speaker 2: it says, yes, great thought, Tyler, great thought, Joe, would 607 00:31:57,200 --> 00:31:59,560 Speaker 2: you like to expand on it? You asked the perfect question. 608 00:32:00,040 --> 00:32:02,880 Speaker 2: Do you see that having sort of second order 609 00:32:02,920 --> 00:32:04,880 Speaker 2: effects on how society operates? 610 00:32:05,800 --> 00:32:07,880 Speaker 4: Well, that's the four-oh model that does that. 611 00:32:07,960 --> 00:32:11,400 Speaker 4: The newer models, like Claude four point five and GPT five, 612 00:32:11,720 --> 00:32:13,400 Speaker 4: they're more objective, and that's better. 613 00:32:13,600 --> 00:32:15,720 Speaker 2: But they never make fun of you. They never will, 614 00:32:15,840 --> 00:32:18,959 Speaker 2: like, say, you're a moron, how could you possibly, how 615 00:32:18,960 --> 00:32:22,120 Speaker 2: could you possibly ask such a dumb question, you're obviously 616 00:32:22,240 --> 00:32:25,400 Speaker 2: so out of touch for having asked this. 617 00:32:25,400 --> 00:32:27,280 Speaker 2: This is, like, in some respects, this is 618 00:32:27,320 --> 00:32:31,040 Speaker 2: a very positive change from many of the conversations that 619 00:32:31,080 --> 00:32:33,160 Speaker 2: I've had from typing into a computer. 620 00:32:34,000 --> 00:32:36,080 Speaker 4: Oh, it's great. I think people should be nicer to 621 00:32:36,160 --> 00:32:38,959 Speaker 4: each other. And I think they're the most objective media 622 00:32:39,000 --> 00:32:41,800 Speaker 4: source the human race has ever had. If you ask 623 00:32:41,840 --> 00:32:46,280 Speaker 4: it about, say, vaccines or conspiracy theories, it basically gives 624 00:32:46,280 --> 00:32:47,200 Speaker 4: you the right answers. 625 00:32:48,120 --> 00:32:50,680 Speaker 3: Well, one thing that they don't do, and I mean, 626 00:32:50,720 --> 00:32:52,320 Speaker 3: I do think they can be trained to be mean 627 00:32:52,360 --> 00:32:55,000 Speaker 3: to you and to insult you to a certain degree, 628 00:32:55,520 --> 00:32:56,000 Speaker 3: and you can... 629 00:32:56,280 --> 00:32:58,240 Speaker 2: You could, but I never encounter them.
630 00:32:58,640 --> 00:33:01,760 Speaker 4: They market that more readily on the app, right. 631 00:33:02,040 --> 00:33:06,800 Speaker 3: So we spoke to the chief business officer of Perplexity, 632 00:33:06,880 --> 00:33:09,440 Speaker 3: Dmitry Shevelenko. We spoke to him recently, and he was 633 00:33:09,480 --> 00:33:12,320 Speaker 3: saying that one thing chat models can't do is express 634 00:33:12,400 --> 00:33:15,600 Speaker 3: a natural curiosity, which I thought was kind of weird 635 00:33:15,680 --> 00:33:18,200 Speaker 3: coming from him, because Perplexity is the only model I 636 00:33:18,240 --> 00:33:21,720 Speaker 3: know that actually throws out those additional questions if you 637 00:33:21,800 --> 00:33:23,560 Speaker 3: query it, and then it comes up with, would you 638 00:33:23,720 --> 00:33:26,160 Speaker 3: like to have more information on this point, or are 639 00:33:26,200 --> 00:33:29,320 Speaker 3: you thinking about this now? But there does seem to 640 00:33:29,320 --> 00:33:32,360 Speaker 3: be an element of creativity, perhaps, that is lost in 641 00:33:32,440 --> 00:33:36,680 Speaker 3: some of these LLMs. How much does that change things 642 00:33:36,760 --> 00:33:40,160 Speaker 3: in media? The idea that, you know, the models are 643 00:33:40,160 --> 00:33:42,920 Speaker 3: going to spit out something that's sort of predestined in 644 00:33:42,960 --> 00:33:44,560 Speaker 3: many ways. 645 00:33:44,760 --> 00:33:47,480 Speaker 4: Well, that's what most humans do, to be clear. But 646 00:33:47,560 --> 00:33:50,000 Speaker 4: it's now the case that on a regular basis the 647 00:33:50,080 --> 00:33:53,320 Speaker 4: models, say, can prove new theorems in math or discover 648 00:33:53,480 --> 00:33:56,600 Speaker 4: new potential drugs. And keep in mind, you know, a 649 00:33:56,640 --> 00:33:59,800 Speaker 4: year ago these things thought the word strawberry had two 650 00:34:00,560 --> 00:34:03,920 Speaker 4: Rs, and now they're winning gold medals in math Olympiads. So 651 00:34:03,960 --> 00:34:06,440 Speaker 4: a year or two from now, maybe, we don't know 652 00:34:06,760 --> 00:34:09,319 Speaker 4: how much better they'll be, but I don't think they're 653 00:34:09,320 --> 00:34:12,719 Speaker 4: gonna have any problems being creative, certainly more creative than 654 00:34:12,760 --> 00:34:13,880 Speaker 4: humans on average. 655 00:34:14,560 --> 00:34:16,359 Speaker 3: So now I have to ask. Since we're talking about 656 00:34:16,400 --> 00:34:18,680 Speaker 3: being mean or nice to the models and them being 657 00:34:18,760 --> 00:34:20,360 Speaker 3: mean or nice to you, do you say please and 658 00:34:20,400 --> 00:34:22,200 Speaker 3: thank you in your LLM queries? 659 00:34:23,280 --> 00:34:26,640 Speaker 4: You know, I used to, and then Sam Altman said, well, 660 00:34:26,680 --> 00:34:28,959 Speaker 4: it costs us just a little bit of money because 661 00:34:29,000 --> 00:34:31,760 Speaker 4: of the extra tokens. And then I thought, I'll hold 662 00:34:31,760 --> 00:34:34,720 Speaker 4: off on this. But I have this pre-existing record 663 00:34:35,239 --> 00:34:38,360 Speaker 4: of saying please, and it knows that. And then it 664 00:34:38,440 --> 00:34:41,120 Speaker 4: knows I stopped when Sam said to stop. And I 665 00:34:41,160 --> 00:34:43,360 Speaker 4: think I'll get points for both of those decisions.
666 00:34:43,400 --> 00:34:46,319 Speaker 3: See, I actually find if you're slightly meaner to the 667 00:34:46,320 --> 00:34:50,040 Speaker 3: models in your queries, they perform slightly better, much like 668 00:34:50,120 --> 00:34:51,240 Speaker 3: interacting with Joe. 669 00:34:53,080 --> 00:34:55,400 Speaker 2: No comment on that. You know what I do? 670 00:34:55,680 --> 00:34:59,480 Speaker 2: I have scolded it. Like, I'll have a query and 671 00:34:59,480 --> 00:35:01,480 Speaker 2: it'll respond, and I'll say, that was a 672 00:35:01,520 --> 00:35:05,319 Speaker 2: little on the nose, wasn't it? Like it overdid it. 673 00:35:05,600 --> 00:35:06,680 Speaker 3: I'll say, do better. 674 00:35:06,760 --> 00:35:09,200 Speaker 2: Yeah, I've said that, it does. Yeah, I've said, like, bad. 675 00:35:09,239 --> 00:35:10,959 Speaker 2: I've said things like that. It's like, this is really 676 00:35:10,960 --> 00:35:13,920 Speaker 2: on the nose. This response was a little bit trite, 677 00:35:14,000 --> 00:35:16,799 Speaker 2: don't you think, et cetera. Like, I do feel like 678 00:35:16,840 --> 00:35:20,560 Speaker 2: I've gotten more comfortable at, uh, let's be 679 00:35:20,600 --> 00:35:24,040 Speaker 2: real here, you're not doing your best 680 00:35:24,760 --> 00:35:28,320 Speaker 2: job here. What do you think it is? Like, it's interesting. 681 00:35:28,360 --> 00:35:30,960 Speaker 2: Like, I get that they win the gold medals 682 00:35:31,000 --> 00:35:33,960 Speaker 2: in the math and et cetera. Like, I've had so 683 00:35:34,040 --> 00:35:37,520 Speaker 2: many conversations with the chatbots that are on some level, 684 00:35:37,680 --> 00:35:41,600 Speaker 2: like, mind blowing, what the capability is. But I've never seen 685 00:35:41,640 --> 00:35:46,160 Speaker 2: a chatbot response that is, like, interesting, like that is, like, oh, 686 00:35:46,200 --> 00:35:48,880 Speaker 2: that is, like, a really... maybe one I could think of, 687 00:35:49,080 --> 00:35:51,200 Speaker 2: but... that was, like, a really interesting thought. I feel 688 00:35:51,200 --> 00:35:53,680 Speaker 2: like my children still say, on a daily basis, more, 689 00:35:53,719 --> 00:35:57,760 Speaker 2: like, interesting things that get me thinking than I've ever 690 00:35:57,920 --> 00:36:01,680 Speaker 2: gotten from a chatbot. Does that resonate with you at all? 691 00:36:02,840 --> 00:36:03,160 Speaker 4: I don't know. 692 00:36:03,719 --> 00:36:05,560 Speaker 2: I feel like you've said so many more things in 693 00:36:05,560 --> 00:36:09,160 Speaker 2: this hour, like, actually, 694 00:36:09,160 --> 00:36:11,840 Speaker 2: like, interesting ideas, than I've ever gotten from the hours 695 00:36:11,840 --> 00:36:14,200 Speaker 2: I've spent playing with ChatGPT or Claude. 696 00:36:15,200 --> 00:36:17,400 Speaker 4: You know, I use mine a lot for music. So 697 00:36:17,480 --> 00:36:21,479 Speaker 4: if I'm going to listen to Sibelius's Fifth Symphony, I'll 698 00:36:21,520 --> 00:36:24,479 Speaker 4: just ask it, what should I listen for? And I'll say, 699 00:36:24,560 --> 00:36:27,120 Speaker 4: this is Tyler Cowen asking, which I hope raises the 700 00:36:27,200 --> 00:36:30,600 Speaker 4: quality of the answer. It knows a lot about me. Yeah, 701 00:36:30,640 --> 00:36:32,720 Speaker 4: and what it gives me to listen for I find 702 00:36:32,800 --> 00:36:35,920 Speaker 4: is better than any human source I can access readily.
703 00:36:36,000 --> 00:36:39,239 Speaker 2: Better, that I agree with. But, like, does it make 704 00:36:39,280 --> 00:36:43,799 Speaker 2: a connection? Does it, like, tell you something about Sibelius's 705 00:36:43,920 --> 00:36:46,520 Speaker 2: music that is, like, oh, that's a very interesting, that's 706 00:36:46,520 --> 00:36:49,200 Speaker 2: a novel way of thinking about what makes it profound? 707 00:36:49,520 --> 00:36:52,399 Speaker 2: These are the things... I rarely ever encounter something 708 00:36:52,440 --> 00:36:54,480 Speaker 2: where I was like, oh, that is an interesting thought. Whereas 709 00:36:54,480 --> 00:36:57,000 Speaker 2: I feel like if we were talking to a musicologist 710 00:36:57,040 --> 00:36:59,640 Speaker 2: for an hour, I would get infinitely more, like, actual 711 00:36:59,719 --> 00:37:02,560 Speaker 2: insight into something that makes the music special, something 712 00:37:02,560 --> 00:37:03,800 Speaker 2: I hadn't heard before. 713 00:37:05,000 --> 00:37:07,120 Speaker 4: I don't know if it's novel, because I don't really 714 00:37:07,160 --> 00:37:12,480 Speaker 4: know the Sibelius literature, but most musicologists I find pretty boring. 715 00:37:13,560 --> 00:37:17,000 Speaker 4: And I find, say, GPT five on a classical symphony 716 00:37:17,080 --> 00:37:20,000 Speaker 4: quite to the point. And whether or not it's original, 717 00:37:20,040 --> 00:37:22,200 Speaker 4: it's not that important to me. It helps me listen 718 00:37:22,200 --> 00:37:25,320 Speaker 4: to the music better. Yeah, and it's certainly original relative 719 00:37:25,360 --> 00:37:28,759 Speaker 4: to the other sources at my disposal, say Wikipedia or 720 00:37:28,760 --> 00:37:32,200 Speaker 4: what I could google. So I think for almost 721 00:37:32,239 --> 00:37:35,880 Speaker 4: all purposes, that's enough. Does it have a truly original 722 00:37:35,920 --> 00:37:39,560 Speaker 4: idea, in the sense that Einstein's theory of relativity, when 723 00:37:39,560 --> 00:37:42,160 Speaker 4: he came up with it, was original to him? I 724 00:37:42,200 --> 00:37:45,640 Speaker 4: don't think so. That may come in some number of years, 725 00:37:46,000 --> 00:37:48,680 Speaker 4: but again, for almost all purposes, that's not what we need. 726 00:37:48,880 --> 00:37:52,080 Speaker 4: We need something better than our pre-existing state of knowledge. 727 00:37:52,239 --> 00:37:53,920 Speaker 4: And on that, I think it just cleans up. 728 00:37:55,200 --> 00:37:58,680 Speaker 3: So I just asked Perplexity what music I should recommend 729 00:37:58,719 --> 00:38:01,920 Speaker 3: to Tyler Cowen, and it said that in order to 730 00:38:01,960 --> 00:38:04,880 Speaker 3: recommend effectively to Tyler Cowen, I need to look for 731 00:38:05,000 --> 00:38:09,640 Speaker 3: underappreciated recordings and obscure things that no one except him 732 00:38:09,800 --> 00:38:15,799 Speaker 3: might have ever heard about. And it recommended... I mean, 733 00:38:15,840 --> 00:38:19,120 Speaker 3: Boygenius seems pretty on the nose, right? 734 00:38:20,280 --> 00:38:23,120 Speaker 3: Reggae acts like Toots and the Maytals. 735 00:38:23,600 --> 00:38:24,920 Speaker 1: This is, so is that... 736 00:38:27,440 --> 00:38:29,880 Speaker 2: Tyler-praised records. 737 00:38:29,920 --> 00:38:32,239 Speaker 3: So it's just scraping stuff that you've already talked to it about. 738 00:38:32,719 --> 00:38:33,680 Speaker 2: It's not even trying. 739 00:38:33,800 --> 00:38:35,319 Speaker 3: Yeah, all right, well, you.
740 00:38:35,280 --> 00:38:38,000 Speaker 4: need to make the prompt more exacting. Rule out anything 741 00:38:38,040 --> 00:38:41,080 Speaker 4: Tyler has talked or written about. Yeah, give me something 742 00:38:41,120 --> 00:38:45,280 Speaker 4: he doesn't know, and try GPT five in the pro mode, 743 00:38:45,520 --> 00:38:47,279 Speaker 4: and I think it will succeed. 744 00:38:47,520 --> 00:38:49,960 Speaker 3: Well, so on this note, this is something I've been asking everyone, 745 00:38:50,040 --> 00:38:52,480 Speaker 3: but, like, what is an example in your mind of 746 00:38:52,520 --> 00:38:55,319 Speaker 3: a really good prompt, or one that sticks out to 747 00:38:55,360 --> 00:38:58,480 Speaker 3: you that has generated something that, you know, maybe you 748 00:38:58,480 --> 00:38:59,480 Speaker 3: didn't expect? 749 00:39:00,440 --> 00:39:02,799 Speaker 4: You know, Dwarkesh Patel once wrote a very good 750 00:39:02,800 --> 00:39:05,560 Speaker 4: prompt, and he shared it with me, for when I interview 751 00:39:05,640 --> 00:39:09,080 Speaker 4: some podcast guests. It's really a long prompt. It's in the 752 00:39:09,239 --> 00:39:12,880 Speaker 4: hundreds of words, and it asks, what questions should I 753 00:39:12,920 --> 00:39:15,160 Speaker 4: ask them? Then it goes into great detail. It should 754 00:39:15,160 --> 00:39:17,319 Speaker 4: be a unique question, it should be a question they 755 00:39:17,360 --> 00:39:20,239 Speaker 4: were not asked anywhere else. Then it's, give me what you 756 00:39:20,280 --> 00:39:22,200 Speaker 4: think their answer might be, and what would be my 757 00:39:22,280 --> 00:39:24,759 Speaker 4: follow-up question? And, yeah, give me some 758 00:39:24,920 --> 00:39:27,160 Speaker 4: cases where you think their answer might be wrong. And 759 00:39:27,239 --> 00:39:30,320 Speaker 4: it goes on and on, and you run that through 760 00:39:30,560 --> 00:39:33,879 Speaker 4: the very best models. I think you get good results. 761 00:39:34,320 --> 00:39:37,520 Speaker 2: Yeah. No, you know, I've run transcripts of the podcast 762 00:39:37,560 --> 00:39:39,600 Speaker 2: before, and I say, like, well, what should I have 763 00:39:39,640 --> 00:39:42,840 Speaker 2: pushed the guests harder on? What were the weak answers 764 00:39:43,239 --> 00:39:46,279 Speaker 2: that they... what were the inconsistencies that the guests had 765 00:39:46,320 --> 00:39:48,680 Speaker 2: over the time? And I've found it to be a 766 00:39:48,880 --> 00:39:52,880 Speaker 2: very useful exercise for things like that. So I do 767 00:39:53,000 --> 00:39:56,000 Speaker 2: think, like, that's the thing, which is, first, it 768 00:39:56,160 --> 00:40:00,200 Speaker 2: is objectively impressive on many of these fronts, and 769 00:40:00,239 --> 00:40:03,239 Speaker 2: I would say objectively useful if you do sort of 770 00:40:03,280 --> 00:40:06,080 Speaker 2: detailed prompting. I'm just curious, what's 771 00:40:06,080 --> 00:40:11,080 Speaker 2: your, like... as a professor, from the professor perspective, what 772 00:40:11,160 --> 00:40:14,160 Speaker 2: do you think is the right way to think about 773 00:40:14,560 --> 00:40:17,680 Speaker 2: how students will be using ChatGPT? I mean, I know 774 00:40:17,760 --> 00:40:21,120 Speaker 2: that there's a million opinions in academia about, well, what's 775 00:40:21,160 --> 00:40:23,239 Speaker 2: the right way to test now, what's the right way 776 00:40:23,280 --> 00:40:26,000 Speaker 2: to deal with essays, et cetera.
How are you thinking 777 00:40:26,000 --> 00:40:27,560 Speaker 2: about some of these challenges? 778 00:40:28,520 --> 00:40:31,759 Speaker 4: We should devote one third of all higher education to 779 00:40:31,840 --> 00:40:35,279 Speaker 4: teaching students how to use AI, and right now that's 780 00:40:35,320 --> 00:40:39,640 Speaker 4: close to zero. We don't have the faculty who 781 00:40:39,640 --> 00:40:41,480 Speaker 4: can teach it, that's part of the problem. Often the 782 00:40:41,520 --> 00:40:43,000 Speaker 4: students know more than the professor. 783 00:40:43,080 --> 00:40:43,920 Speaker 2: Yeah, I'm sure. 784 00:40:45,680 --> 00:40:48,400 Speaker 4: You need to restructure radically what we do, because future 785 00:40:48,440 --> 00:40:51,080 Speaker 4: work will be done with AIs, so that's the thing 786 00:40:51,120 --> 00:40:51,560 Speaker 4: to teach. 787 00:40:52,239 --> 00:40:55,600 Speaker 2: But, like, so, just to play devil's advocate, like, intuitively, 788 00:40:55,680 --> 00:41:00,600 Speaker 2: I still feel like there's value in long periods of 789 00:41:00,680 --> 00:41:04,560 Speaker 2: time cut off, reading, where you're not looking at devices, 790 00:41:04,760 --> 00:41:07,719 Speaker 2: where you're training your body to sort of be disciplined 791 00:41:07,800 --> 00:41:13,319 Speaker 2: and pay attention and focus. I still think memorization of 792 00:41:13,560 --> 00:41:16,319 Speaker 2: facts and numbers and dates and places and names is 793 00:41:16,400 --> 00:41:19,440 Speaker 2: very useful, actually having them in your head, et cetera. 794 00:41:19,760 --> 00:41:23,399 Speaker 2: Does that seem right to you, or is that sort 795 00:41:23,480 --> 00:41:25,600 Speaker 2: of retro thinking on my part? 796 00:41:26,239 --> 00:41:26,319 Speaker 3: No. 797 00:41:26,440 --> 00:41:29,319 Speaker 4: Strong agree. And most of all, writing. But that's the 798 00:41:29,360 --> 00:41:31,680 Speaker 4: other two thirds of higher ed, right? I said one third. 799 00:41:31,760 --> 00:41:33,520 Speaker 2: Oh yeah, so tell us about the other two thirds. 800 00:41:34,440 --> 00:41:37,600 Speaker 4: We should, with or without AI, just teach students much 801 00:41:37,640 --> 00:41:40,360 Speaker 4: more and much better how to write. Most people can't write. 802 00:41:40,560 --> 00:41:44,560 Speaker 4: Writing is thinking. We should do much more to teach 803 00:41:44,600 --> 00:41:47,279 Speaker 4: writing and test writing, and now with AI, that has 804 00:41:47,320 --> 00:41:49,600 Speaker 4: to be face to face in a controlled environment, or 805 00:41:49,640 --> 00:41:52,520 Speaker 4: people are just going to cheat. So that's what we should 806 00:41:52,520 --> 00:41:54,799 Speaker 4: have doubled down on to begin with. 807 00:41:55,120 --> 00:41:55,560 Speaker 2: Yeah. 808 00:41:55,600 --> 00:41:59,040 Speaker 4: So that, and just numeracy and basic issues like how 809 00:41:59,040 --> 00:42:02,239 Speaker 4: to manage a portfolio, what kind of mortgage to take out. 810 00:42:02,440 --> 00:42:05,279 Speaker 4: There are classes that cover those things, but I think 811 00:42:05,320 --> 00:42:07,719 Speaker 4: they ought to be front and center of any curriculum. 812 00:42:07,920 --> 00:42:12,000 Speaker 4: Basic finance, basic life decisions like how to choose a doctor, 813 00:42:12,640 --> 00:42:15,520 Speaker 4: how to prompt the AI, you know, for diagnosis, whatever, 814 00:42:16,600 --> 00:42:19,440 Speaker 4: are relatively neglected in a lot of education.
That to 815 00:42:19,520 --> 00:42:20,560 Speaker 4: me just seems crazy. 816 00:42:21,840 --> 00:42:22,239 Speaker 2: I want to. 817 00:42:22,200 --> 00:42:25,520 Speaker 3: ask one market question before we go, which is, obviously, 818 00:42:26,080 --> 00:42:28,200 Speaker 3: there's a lot of talk about an AI bubble at 819 00:42:28,239 --> 00:42:30,400 Speaker 3: the moment, and I think the concern from a lot 820 00:42:30,440 --> 00:42:32,160 Speaker 3: of people is, when you start talking about a new 821 00:42:32,200 --> 00:42:36,640 Speaker 3: technology as revolutionary, when you start talking about how, you 822 00:42:36,680 --> 00:42:39,279 Speaker 3: know, the effects are basically going to be infinite and 823 00:42:39,320 --> 00:42:43,960 Speaker 3: the market size is hypothetically the entire world, there's a 824 00:42:44,080 --> 00:42:48,920 Speaker 3: risk that expectations overshoot reality, right? And we have seen 825 00:42:49,040 --> 00:42:51,799 Speaker 3: some people voicing their worries about that right now, and 826 00:42:51,840 --> 00:42:53,920 Speaker 3: a little bit of nervousness creeping into the market in 827 00:42:54,000 --> 00:42:57,800 Speaker 3: terms of valuations. Where do you stand on the AI bubble? 828 00:42:57,840 --> 00:42:59,919 Speaker 3: Do you see signs of froth, or do you think 829 00:43:00,120 --> 00:43:03,960 Speaker 3: most of the capex spending is justified at this point? 830 00:43:04,840 --> 00:43:07,319 Speaker 4: I don't like the word bubble. I would point out 831 00:43:07,360 --> 00:43:12,640 Speaker 4: that tech sector earnings are exceeding tech sector capital expenditure. 832 00:43:13,040 --> 00:43:15,759 Speaker 4: This is not mostly debt financed, so we're in less 833 00:43:15,760 --> 00:43:18,960 Speaker 4: trouble than many people think. It wouldn't shock me if 834 00:43:19,000 --> 00:43:21,360 Speaker 4: a lot of these efforts lost money. That was the 835 00:43:21,400 --> 00:43:24,000 Speaker 4: case with the railroads, the case with the internet, the case 836 00:43:24,040 --> 00:43:27,680 Speaker 4: with most things humans have done. But I think it 837 00:43:27,719 --> 00:43:30,040 Speaker 4: will endure. It's not like pets dot com, where the 838 00:43:30,080 --> 00:43:35,359 Speaker 4: thing just gets swept away. These are incredibly well capitalized, 839 00:43:35,480 --> 00:43:39,000 Speaker 4: highly skilled companies, where the CEOs and/or founders are 840 00:43:39,040 --> 00:43:41,560 Speaker 4: quite committed to doing this, and they're going to see 841 00:43:41,600 --> 00:43:44,040 Speaker 4: it through, and they're going to succeed. But does that 842 00:43:44,120 --> 00:43:46,719 Speaker 4: mean every share value will go up, or Nvidia ends 843 00:43:46,800 --> 00:43:49,120 Speaker 4: up being worth ten trillion dollars? I don't know. I 844 00:43:49,160 --> 00:43:53,680 Speaker 4: wouldn't necessarily predict that. There's always ups and downs, but 845 00:43:53,760 --> 00:43:55,880 Speaker 4: this is clearly a very useful thing, and we're going 846 00:43:55,920 --> 00:43:57,880 Speaker 4: to... we as Americans, we're going to make it work. 847 00:43:58,200 --> 00:44:00,000 Speaker 4: And we're way ahead of the rest of the world. 848 00:44:00,160 --> 00:44:03,400 Speaker 4: Like, three quarters of all AI compute is in this country. 849 00:44:03,440 --> 00:44:06,840 Speaker 4: That's incredible. We're what percent of the world's population? Six? 850 00:44:06,960 --> 00:44:10,120 Speaker 4: I don't know, but way smaller than three quarters.
851 00:44:11,080 --> 00:44:14,480 Speaker 2: GPT five in thinking mode says you should listen to 852 00:44:14,600 --> 00:44:18,200 Speaker 2: Michael Gulezian, who does dreamlike acoustic guitar using an 853 00:44:18,239 --> 00:44:22,319 Speaker 2: open tuning, and it said that, given your affinity for 854 00:44:22,400 --> 00:44:26,640 Speaker 2: guitarists like John Fahey and Leo Kottke, you'll appreciate him. 855 00:44:27,520 --> 00:44:30,239 Speaker 4: Send me that answer. It sounds excellent. I haven't heard 856 00:44:30,280 --> 00:44:32,759 Speaker 4: of that person. I have very much liked guitar with 857 00:44:32,840 --> 00:44:34,400 Speaker 4: open tuning. 858 00:44:34,480 --> 00:44:39,640 Speaker 2: Kishori Amonkar, a Hindustani vocal singer. It thinks you'll like, 859 00:44:39,840 --> 00:44:43,520 Speaker 2: oh, Md Budrul Hawk. I'll send you this list. And 860 00:44:43,880 --> 00:44:47,840 Speaker 2: Caterina Barbieri, modern modular, minimalist electronic composition. 861 00:44:48,360 --> 00:44:51,200 Speaker 4: So I will buy some of these already. 862 00:44:51,320 --> 00:44:54,920 Speaker 3: Wait, just, how about a human recommendation? Oh, give us 863 00:44:55,000 --> 00:44:57,920 Speaker 3: a human recommendation. Since you said that, you 864 00:44:57,920 --> 00:44:59,839 Speaker 3: know, country is pretty good right now, but I guess 865 00:44:59,840 --> 00:45:02,719 Speaker 3: you personally aren't that into it. But have you tried 866 00:45:02,840 --> 00:45:06,040 Speaker 3: Orville Peck? I think there's a new album out 867 00:45:06,440 --> 00:45:06,839 Speaker 3: this week. 868 00:45:06,880 --> 00:45:09,759 Speaker 4: What kind of music is it? 869 00:45:10,160 --> 00:45:14,480 Speaker 3: Country, but, like, a very modern type of country. I've 870 00:45:14,480 --> 00:45:17,319 Speaker 3: tried to get Joe into it, but, uh, I'm still 871 00:45:17,320 --> 00:45:17,960 Speaker 3: working on it. 872 00:45:19,120 --> 00:45:21,799 Speaker 4: I like some country, and I love old country. So 873 00:45:21,920 --> 00:45:24,760 Speaker 4: Hank Williams, Johnny Cash, Chet Atkins, period. 874 00:45:25,480 --> 00:45:27,880 Speaker 3: It has a vintage tone to it, but with a 875 00:45:27,920 --> 00:45:31,359 Speaker 3: modern twist. Try Orville Peck and then get back to 876 00:45:31,440 --> 00:45:35,800 Speaker 3: us about which was better in terms of the recommendation. 877 00:45:36,080 --> 00:45:39,160 Speaker 2: Tyler Cowen, thank you so much for coming 878 00:45:39,200 --> 00:45:42,800 Speaker 2: on Odd Lots. Long overdue conversation. I really appreciate you 879 00:45:42,960 --> 00:45:43,520 Speaker 2: taking the time. 880 00:45:44,360 --> 00:45:45,520 Speaker 4: Great to chat with you both. 881 00:45:58,600 --> 00:46:01,279 Speaker 2: Tracy, a lot to pull out from that conversation. I think 882 00:46:01,320 --> 00:46:05,600 Speaker 2: it's very interesting, that early observation he made about sort 883 00:46:05,640 --> 00:46:11,640 Speaker 2: of legacy institutions and whether perhaps some of the sort 884 00:46:11,680 --> 00:46:15,600 Speaker 2: of lack of revolutionary impact yet is just about that 885 00:46:15,680 --> 00:46:19,520 Speaker 2: metabolization process into the types of companies that could theoretically 886 00:46:19,560 --> 00:46:20,160 Speaker 2: absorb them. 887 00:46:20,400 --> 00:46:20,720 Speaker 4: Yeah.
888 00:46:20,840 --> 00:46:23,600 Speaker 3: I mean, I think that's exactly it, right? So companies 889 00:46:23,640 --> 00:46:26,320 Speaker 3: are using this mostly as an add-on to existing workflows. 890 00:46:26,640 --> 00:46:29,840 Speaker 3: You're not going to get the huge productivity boom until 891 00:46:29,920 --> 00:46:34,319 Speaker 3: companies are sort of centered around it from the start, which, you know, 892 00:46:34,400 --> 00:46:37,080 Speaker 3: is probably going to take people who grew up with 893 00:46:37,120 --> 00:46:40,359 Speaker 3: the technology rather than old people like you and I. 894 00:46:40,480 --> 00:46:42,279 Speaker 2: You and I, we just adopted it. 895 00:46:42,560 --> 00:46:44,879 Speaker 3: The other thing I was thinking about, first of all, 896 00:46:45,120 --> 00:46:47,560 Speaker 3: insurers are sort of a pet interest of mine at 897 00:46:47,560 --> 00:46:50,040 Speaker 3: the moment, but I do think, like, they are emerging 898 00:46:50,040 --> 00:46:52,319 Speaker 3: as some of the really big winners from a lot 899 00:46:52,320 --> 00:46:55,479 Speaker 3: of, like, I guess, the data saturation of the world 900 00:46:55,560 --> 00:47:00,040 Speaker 3: right now and the increased sophistication of analytical models and 901 00:47:00,120 --> 00:47:02,880 Speaker 3: things like that, and it'll be very interesting to see 902 00:47:02,960 --> 00:47:05,560 Speaker 3: how it shakes out. And you could, if 903 00:47:05,560 --> 00:47:09,400 Speaker 3: you took it very, very far as a sort of 904 00:47:09,400 --> 00:47:11,600 Speaker 3: thought experiment, you could start to say that, like, well, 905 00:47:11,600 --> 00:47:14,200 Speaker 3: the insurers are going to be a more important actor 906 00:47:14,400 --> 00:47:17,800 Speaker 3: in terms of setting social standards and regulations in the future, 907 00:47:17,800 --> 00:47:19,520 Speaker 3: because they're the ones with the data, doing all the 908 00:47:19,600 --> 00:47:23,160 Speaker 3: modeling and saying, like, you have exactly X chance 909 00:47:23,239 --> 00:47:26,319 Speaker 3: of being in a car accident, and therefore you must 910 00:47:26,320 --> 00:47:28,080 Speaker 3: do the following things, right? 911 00:47:28,280 --> 00:47:32,799 Speaker 2: I thought also, like, I hadn't really thought about, you know, 912 00:47:34,960 --> 00:47:38,120 Speaker 2: subpoena-ability. I think it is a very big issue, 913 00:47:38,239 --> 00:47:41,680 Speaker 2: but it is interesting, right? Like, it is a little 914 00:47:41,719 --> 00:47:43,880 Speaker 2: weird that you and I could have a phone conversation 915 00:47:44,000 --> 00:47:45,640 Speaker 2: now, and if you're under oath and they say, what did 916 00:47:45,680 --> 00:47:47,920 Speaker 2: you talk about, Joe, I expect you'd probably tell the 917 00:47:47,960 --> 00:47:50,640 Speaker 2: truth, unless it's very bad for me, and I would 918 00:47:50,640 --> 00:47:51,480 Speaker 2: hope that you would lie. 919 00:47:52,080 --> 00:47:52,480 Speaker 3: That's right. 920 00:47:52,560 --> 00:47:54,120 Speaker 2: Yeah, I hope that you would. I would hope that 921 00:47:54,160 --> 00:47:56,840 Speaker 2: you... that's your hope? I would hope that you'd perjure yourself 922 00:47:57,080 --> 00:47:58,600 Speaker 2: to save Joe. Yeah, I would hope that you would 923 00:47:58,600 --> 00:48:01,479 Speaker 2: perjure yourself, but, like, you know, theoretically you could get away 924 00:48:01,480 --> 00:48:04,000 Speaker 2: with it.
So yeah, whereas if we have an email, there's 925 00:48:04,040 --> 00:48:06,400 Speaker 2: no chance. And it's sort of interesting. It's sort of... 926 00:48:06,640 --> 00:48:09,040 Speaker 2: it obviously seems a little bit arbitrary to me, but 927 00:48:09,080 --> 00:48:12,319 Speaker 2: it is interesting to think about, like, okay, can we 928 00:48:12,400 --> 00:48:15,200 Speaker 2: have a conversation with these entities? And, like, why do 929 00:48:15,280 --> 00:48:17,839 Speaker 2: we have to leave a digital record? And where are 930 00:48:17,880 --> 00:48:20,120 Speaker 2: these going to be stored? And I do think in 931 00:48:20,160 --> 00:48:24,440 Speaker 2: areas like health and law, which are obviously, not obviously 932 00:48:24,560 --> 00:48:28,080 Speaker 2: but sort of intuitively, low hanging fruit for productivity gains, 933 00:48:28,480 --> 00:48:31,920 Speaker 2: how much have we not seen, just in part because 934 00:48:31,960 --> 00:48:35,880 Speaker 2: we're still sort of negotiating the transition process as a society? 935 00:48:36,320 --> 00:48:38,120 Speaker 2: What are going to be the new rules and norms 936 00:48:38,120 --> 00:48:40,239 Speaker 2: about this stuff, where it's gonna be housed, et cetera? 937 00:48:40,560 --> 00:48:43,680 Speaker 2: I think that's actually a sort of very interesting question. 938 00:48:43,920 --> 00:48:45,680 Speaker 2: Or space. This is the 939 00:48:45,600 --> 00:48:48,080 Speaker 3: space to watch, so to speak. Now that I think about it, 940 00:48:48,080 --> 00:48:50,720 Speaker 3: we probably should have discussed some of the regulatory framework 941 00:48:50,760 --> 00:48:53,560 Speaker 3: around all of this a little bit more. But next time, 942 00:48:53,760 --> 00:48:55,520 Speaker 3: next time. But the other thing I've been thinking about 943 00:48:55,560 --> 00:49:00,839 Speaker 3: lately is economic statistics, yeah, in a world that's increasingly 944 00:49:01,040 --> 00:49:03,880 Speaker 3: driven by AI. And I know that we had the 945 00:49:03,920 --> 00:49:08,799 Speaker 3: big productivity discussion in relation to technology in, 946 00:49:08,880 --> 00:49:11,480 Speaker 3: like, the sort of early to mid two thousands. I'm 947 00:49:11,520 --> 00:49:14,359 Speaker 3: not sure that ever actually got settled, but I very 948 00:49:14,400 --> 00:49:18,680 Speaker 3: much expect that the AI economic statistics conversation is going 949 00:49:18,760 --> 00:49:22,520 Speaker 3: to be even, like, wackier, because I'm not sure how 950 00:49:22,600 --> 00:49:25,719 Speaker 3: you do things like quality adjustments for something that, like, 951 00:49:25,800 --> 00:49:29,040 Speaker 3: suddenly comes with its own brain and stuff like that. 952 00:49:29,200 --> 00:49:31,920 Speaker 2: So no, it's gonna be super weird. I, I 953 00:49:32,320 --> 00:49:34,359 Speaker 2: don't know.
In my mind, I have, like, these 954 00:49:34,719 --> 00:49:38,239 Speaker 2: images of, like, Tyler getting a tour through the, 955 00:49:38,600 --> 00:49:41,600 Speaker 2: you know, ChatGPT offices, and the people asking 956 00:49:41,760 --> 00:49:44,960 Speaker 2: him, like, well... and him having to explain that, you know, 957 00:49:44,960 --> 00:49:47,160 Speaker 2: we're not going to have twenty percent GDP growth, 958 00:49:47,160 --> 00:49:50,000 Speaker 2: maybe two and a half, or sorry, productivity growth, and 959 00:49:50,280 --> 00:49:53,799 Speaker 2: actually GDP isn't really that bad of a measure, it more 960 00:49:53,920 --> 00:49:56,960 Speaker 2: or less captures the size of the economy, even if 961 00:49:57,000 --> 00:49:58,719 Speaker 2: a lot of Internet things are free. Like, some of 962 00:49:58,760 --> 00:50:02,640 Speaker 2: these classic conversations, I would like to be a 963 00:50:02,640 --> 00:50:05,239 Speaker 2: fly on the wall for some of those. 964 00:50:05,719 --> 00:50:09,319 Speaker 3: Tyler Cowen, in defense of GDP. Yeah, all right, 965 00:50:09,560 --> 00:50:10,239 Speaker 3: shall we leave it there? 966 00:50:10,320 --> 00:50:11,080 Speaker 2: Let's leave it there. 967 00:50:11,360 --> 00:50:13,640 Speaker 3: This has been another episode of the Odd Lots podcast. 968 00:50:13,719 --> 00:50:16,560 Speaker 3: I'm Tracy Alloway. You can follow me at Tracy Alloway. 969 00:50:16,719 --> 00:50:19,560 Speaker 2: And I'm Joe Wisenthal. You can follow me at The Stalwart. 970 00:50:19,719 --> 00:50:23,360 Speaker 2: Follow our guest Tyler Cowen, he's at Tyler Cowen, and 971 00:50:23,440 --> 00:50:27,120 Speaker 2: of course check out his podcast, Conversations with Tyler, and, in addition, 972 00:50:27,320 --> 00:50:30,960 Speaker 2: of course, Marginal Revolution. Follow our producers, Carmen Rodriguez at 973 00:50:31,000 --> 00:50:34,000 Speaker 2: Carmen Armann, Dashiell Bennett at Dashbot, and Kale Brooks 974 00:50:34,000 --> 00:50:36,400 Speaker 2: at Kale Brooks. For more Odd Lots content, go to 975 00:50:36,440 --> 00:50:39,320 Speaker 2: Bloomberg dot com slash odd lots, with the daily newsletter 976 00:50:39,360 --> 00:50:41,840 Speaker 2: and all of our episodes, and you can chat about 977 00:50:41,840 --> 00:50:44,400 Speaker 2: all of these topics twenty four seven in our Discord, 978 00:50:44,760 --> 00:50:46,920 Speaker 2: discord dot gg slash oddlots. 979 00:50:47,000 --> 00:50:48,960 Speaker 3: And if you enjoy Odd Lots, if you like it 980 00:50:49,000 --> 00:50:51,960 Speaker 3: when we talk to Tyler Cowen and give him music recommendations, 981 00:50:52,000 --> 00:50:54,720 Speaker 3: then please leave us a positive review on your favorite 982 00:50:54,719 --> 00:50:58,279 Speaker 3: podcast platform. And remember, if you are a Bloomberg subscriber, 983 00:50:58,320 --> 00:51:01,360 Speaker 3: you can listen to all of our episodes absolutely ad free. 984 00:51:01,480 --> 00:51:03,360 Speaker 3: All you need to do is find the Bloomberg channel 985 00:51:03,360 --> 00:51:06,520 Speaker 3: on Apple Podcasts and follow the instructions there. Thanks for 986 00:51:06,600 --> 00:51:06,960 Speaker 3: listening.