Speaker 1: Pushkin.
Speaker 2: Hey, it's Jacob. I recently was a guest co-host on another Pushkin podcast called Cautionary Tales. The host is Tim Harford, and it was a lot of fun, and I thought you might enjoy hearing it while What's Your Problem is on a break. So we'll be back with What's Your Problem in a few weeks. In the meantime, I hope you like this episode of Cautionary Tales.
Speaker 1: Let's play around. Wardour Studios. It sounds very elfin.
Speaker 3: I always think of David Bowie when I think of Wardour. One of his early songs talks about bright lights, Soho, Wardour Street.
Speaker 1: Bowie himself has an elfin aspect.
Speaker 3: He did his breakthrough gig at Aylesbury, my hometown, which is basically about the only thing remarkable about Aylesbury. Well, that and...
Speaker 1: Home of Tim Harford, as the plaque says.
Speaker 3: Yeah, they've got a statue for Bowie, not yet for me. I don't know why. Sure, shall we go? Okay, let's go. Okay, I'm ready. Hello and welcome back to another episode of Cautionary Questions, our first of twenty twenty four. I am, of course, Tim Harford. You are our loyal listeners. You've been sending in your burning questions on money, technology, economics, and problem solving, and thank you so much to everyone who's done so. And today is the day I do my best to answer them. And thankfully I won't be alone in this endeavor. Here to help me out, both with the questions and I think with some of the answers, is the brilliant, brilliant Jacob Goldstein, the host of the Pushkin podcast What's Your Problem and author of the book Money: The True Story of a Made-Up Thing. Jacob was my inaugural Cautionary Questions co-host. Jacob, it's wonderful to have you back.
Speaker 1: Tim, it's an honor. Hi.
Speaker 3: So we should just get on with the questions, because we always have so many and so much to say. So, what have you got in your big bag of listener questions for me?
Speaker 1: Let's start with a question from Alex in Melbourne, Australia. Alex writes: I work with artificial intelligence image generation software almost daily now, and I'm quickly seeing how so much of my workforce and processes can either be sped up or entirely replaced by AI, and this makes me nervous. And then he asks about universal basic income, this idea of a government giving all its citizens money. And he says: it seems for the first time that computers and software will actually replace jobs in a deeply concerning way, which is both exciting and terrifying. What are your thoughts on UBI, universal basic income, as a solution to an AI crisis, and the widespread philosophical and economic implications of this?
Speaker 3: I love this question. I mean, it's so big, and I think the first thing to say is we don't really have any idea. If what Alex is thinking about comes true, and if most people just have no economic value (they have value as human beings, they have value as members of society, but there's nothing that they could actually sell their labor to do), then that's completely uncharted territory. We've never been anywhere like that before, so everything we do is kind of speculative.
Speaker 1: We've feared it for a long time, right? We've had two hundred years of being afraid of technological unemployment. And my prior on this is to be somewhat skeptical. Like, clearly there can be some large number of people who lose their jobs. We should be concerned about that, and we should think about how to mitigate that. But the idea of more or less everybody losing their jobs I'm skeptical of, for the simple reason that it hasn't happened in two hundred years of incredible technological progress. And right now, after decades of extreme technological progress, unemployment in the US is below four percent, and the share of working-age people who are working is near all-time highs. And so somehow we keep coming up with new things to do for money, no matter how many things computers can do. And so my first thought is, I don't think we're going to have everybody losing their jobs to AI. I definitely could be wrong, but that's what I think.
Speaker 3: No, I think that's a good working assumption. If you think back a few centuries, basically almost all the labor that people did... They might wash their clothes occasionally; well, that's been outsourced to the washing machine. They'd spend a lot of time moving water around, just drinking water, cooking water, throwing out human excrement. That's all now handled by automated systems.
Speaker 1: Digging. People did a lot of digging, right, pulling a plow.
Speaker 3: Almost everything we used to do is now done by machines, but somehow we still all have jobs. Let's at least accept the premise that maybe this time it might be different because the robots are doing everything. There's still material prosperity. There's food out there, and all the services we could possibly want. We just have to find some system whereby the humans who have no economic value get to enjoy all this cool stuff that's being produced by the machines.
Speaker 1: Yeah, and you know, technological prosperity has given us, in the developed world, UBI for old people. Right, in the US, as in every developed country as far as I know, once you get to some age, if you are a citizen and you have worked, the government gives you money every month until you die. And so you could imagine a kind of creeping extension of that. Certainly, right now they're not talking about lowering the retirement age in the US; they're talking about raising it, as in many countries.
Speaker 3: Yeah, raising it in the UK as well. But it's still a long way away from life expectancies. My recollection is that when Bismarck introduced the first pension, which was in Germany in the late nineteenth century, I think the pension started to be paid at the age of sixty-seven and the life expectancy was sixty-three.
Speaker 1: You get negative four years in expectation.
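As a tiny worked version of that joke, using the figures quoted above (the modern pair of numbers is an assumption added purely for contrast, not from the episode):

```python
# Rough expected years of pension = life expectancy - pension age.
# Note this uses life expectancy at birth; as the conversation goes on to say,
# people who actually survive to pension age go on to collect for much longer.
def expected_pension_years(life_expectancy: float, pension_age: float) -> float:
    return life_expectancy - pension_age

print(expected_pension_years(life_expectancy=63, pension_age=67))  # Bismarck era: -4
print(expected_pension_years(life_expectancy=81, pension_age=66))  # assumed modern figures: 15
```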
Speaker 3: People who survive long enough to claim any pension at all are already exceptional. So that would be like having a pension today that started at age ninety: some people will get it, but most won't. But actually, a lot of people today could easily collect thirty years of pension, certainly twenty years. So they're living a large proportion of their adult life receiving money from the state, and they're also receiving money from their own savings, their own investments, which might be a replacement for UBI. Maybe we all just have shares in the robots instead; we have shares in Google or whatever, and that's how we get paid.
Speaker 1: Yes. Whether that is mediated by the government or not looks somewhat different, right? Either the government is taxing the owners of Nvidia and Google stock and distributing the money, or everybody owns Google and Nvidia stock. There is the piece of this which is about the non-financial parts of work. I mean, there's a political economy piece, the sort of bridge, how do we get from here to there if that happens, and that's complicated and maybe ugly.
Speaker 3: If the robots can do all of the stuff, if the computers can do it, there's no economic reason why humans couldn't just receive whatever: an allowance, they're given ten robots, or they're given ten thousand dollars a month to spend on whatever they like. There's no economic reason why that couldn't happen. But I think what you're getting at is, what does it do to us if we're in that situation?
Speaker 1: Yeah, yeah. Like, I don't want to not have a job. I recognize that I am fortunate to have a job that I enjoy, that I derive a part of my identity from. I recognize that for many, in fact probably most people, a job is not that; it's some unpleasant thing they do because they need the money, and if they got more money, they would quit their job. Well, you probably know the empirical evidence on this better than I do. Like, people have looked at lottery winners, right? And my sense is it's not great for you to win the lottery and quit your job. It doesn't actually make you happier.
Speaker 3: Actually, the evidence on lottery winners is a bit mixed, and I think, slightly, we hear all the disaster stories of lottery winners.
Speaker 1: Actually, it's great to win the lottery? That's amazing. Is that true?
Speaker 3: I think it's fine. Yes, it's just fine to win the lottery. It's not really a problem to win the lottery, in my vague recollection; I wrote about this a couple of years ago. But we may not all want a job, but we all want something to do. We all want to feel useful, we all want a sense of some kind of purpose. We all want, I think, that experience of mastery, that experience of knowing that you can do something that not everybody can do. Those things don't have to come from a job, but for a lot of people they do come from a job.
Speaker 1: Yeah. For me, in my narrow provincial experience of life, it's frankly hard to imagine getting those things without a job. Yeah.
Speaker 3: It comes back to my initial reaction to Alex's terrific question, which is that this is such unknown territory that we can only really speculate. But I think you and I, Jacob, are in agreement that the fundamental issue here is not economic. It's really to do with our souls. How would we react if our desire for mastery, our desire for meaning, our desire to feel useful, if that had to be satisfied without having a job? What would we do, and could we cope? And I don't know.
Speaker 1: I mean, Harford, if the robots come and take our jobs, let's just, you and me, make a podcast for free. I'm in. I'll commit right now, in the robot utopia apocalypse, to making a weekly podcast with you for free.
Speaker 3: Deal.
Speaker 1: Sorry, we have no more time. You might give in to your robot overlords. Give us another question. Hello, from Race Ubinov: I would like to ask, with the onset of AI, what is the next cautionary tale you anticipate talking about in the years to come?
Speaker 3: Hello, Ubinov. I have just finished the first draft of a script about the coming of AI, and without giving too much away, the cautionary note is about what happens when automation gets so good that we lose our own skills. We hand over control to the machine, and then how do we respond when the machine says, actually, this one's too hard, could you take over again? Suddenly we're back at the wheel and we're out of practice.
Speaker 1: Harford, it's not another plane crash, is it? Don't you have a moratorium on doing plane crashes at this point? Because it sounds like that.
Speaker 3: I'm going to say nothing more. Fair enough, let's go for a break.
Speaker 1: All right. Yeah, we're back.
Speaker 3: I'm talking to Jacob Goldstein, the host of What's Your Problem, and we are doing a Cautionary Questions Q and A episode. Jacob has the questions; Jacob's also helping with the answers. I'm really surplus to requirements here, but I'm doing my best. Jacob, what have you got for me?
Speaker 1: Tim, our next question comes from Adam, who writes: Hi, I have a question about investing in cryptocurrency. Oh dear, oh dear. Come on, give it to me. Okay, it's an interesting one. Traditionally, when buying shares in a company, you would be supporting that company to grow, create a new product, or enter a new retail space. This would hopefully create jobs and new products. When people invest money into cryptocurrency, or any currency for that matter, isn't that money just sitting around not doing anything until you withdraw it? Would investing in crypto be bad for the economy compared with investing in businesses?
Speaker 3: This is a deep question. I really like it. As the author of the wonderful book Money: The True Story of a Made-Up Thing, I am sure you have thoughts, Jacob. Let me have a first crack and you can tell me everything I've missed.
Speaker 1: Yeah. Yeah.
Speaker 3: So let's say you buy bitcoin or whatever, and then the money's just sitting there, because bitcoin is not building anything. You're not buying an investment in, say, a road-building company, and you're not buying an investment in Google, which is developing new technology. One of two possible things happens. The most likely is that whoever you bought the bitcoin from now has your money. Well, now it's their money, and then they're going to do something with that money. So maybe they then buy shares in Google, or they then set up a business, or they will then go on to do something with that money, or they'll lend it to somebody else and that person will do something with it. Eventually the money will find its way into some productive investment. The fact that you bought bitcoin off somebody does not mean that they won't then do something useful with the money. But then let's say for some reason they just go, you know what, I'm just going to sit on this money. It's just dollars. I'm going to stick them under the mattress. I guess a bitcoin guy might do that. That would reduce inflationary pressures on the economy, which is something we want anyway. And if inflationary pressures are reduced too much, the Federal Reserve could then go, you know what, guys, we should probably print some money or otherwise stimulate the economy. Actually, the worst thing that could possibly happen is that the money gets invested in buying more computers that could be put to better use, but in fact just end up doing bitcoin mining. I guess what I'm saying is, as long as they don't spend it on bitcoin mining computers, it's totally fine. Well, what am I missing?
Speaker 1: I went around the same track as you went around. And you know, different cryptocurrencies have different methods of regulation, right? So Bitcoin by design is this very energy-intensive computing process in order to mine bitcoin, and that is, in my opinion, a bad outcome socially. I agree, because you could imagine a world where maintaining the bitcoin network was socially desirable; in the world as it has evolved, I don't find it particularly socially desirable. But at this point, I don't think most of the money going into bitcoin is going to miners, right? There's a large stock of coin that exists in the world. I mean, an interesting thing about money is it almost never sleeps. It keeps going, and so you can always ask what happens next. You know, with the stock market, sometimes people talk about money on the sidelines, when they're like, well, the stock market can go up some more because there's a lot of cash on the sidelines. And there's this famous billionaire investor, Cliff Asness. He's an interesting guy; he got a PhD from the University of Chicago, studied with Gene Fama, the famous economist. And Cliff Asness gets driven up the wall when people say cash on the sidelines, for the same reason you just said. Like, every time somebody buys a share of stock, someone else is selling it, right? So the money is changing hands, the stock is changing hands. But there are no sidelines with money.
Speaker 3: Yeah, and even if someone puts just cash in a checking account, well, then now the bank has money, and you can bet the bank is going to want to do something with that money, something productive. And even if they don't put it in the bank, as I say, it's under the mattress, well, even then, the Federal Reserve could always print money.
Speaker 1: Yeah, the stock of money is not fixed, right? I think there's an assumption in the question that goes back to the pre-modern era, the era before the nineteen-thirties, when money was based on gold or silver and it was finite. Right, there was a fixed amount of gold and silver in the world. In those days, indeed, if you sat on money, you were effectively reducing the supply of money in the world. But to your point, the supply of money in the world is as big as the central bank wants it to be, and so it's not a meaningful constraint on economic growth. So the ultimate answer to the question is, if you are just sitting on money, it's not really going to have an effect on the broader economy.
Speaker 3: Well, I think we've come to agreement. Hopefully that makes sense to Adam. Next question.
Speaker 1: Our next question comes from Doctor Yvonne Couch, who is Associate Professor of Neuroimmunology, Alzheimer's Research UK Fellow, Associate Research Fellow at Saint Hilda's College, and Stipendiary Lecturer at Somerville College, University of Oxford.
Speaker 3: Yeah, University of Oxford. I know it well; one of my sisters went to St Hilda's College and one of my sisters went to Somerville College, so this feels very much in the family. So what has Dr Couch got to say?
Speaker 1: She writes: My question was vaguely economics-based, although probably not very. Is the way we currently fund science feasible going forward? Do we just have too many scientists and not enough resources? All the best, Yvonne. Oh, love it.
Speaker 3: Let me throw out a few thoughts, then. I think that probably there is a problem with the way we fund science, but there's no one way that we fund science. You have university research, you have various grants, various sources of funding, philanthropy and so on. You also have private-sector research, which is often incentivized by the patent system. And then you have big block grants that are handed out by agencies such as the National Institutes of Health. So there are lots of different ways that science gets funded. A couple of things worry me. First of all, there's a big incentive to make really incremental improvements rather than to take risks.
There's a great economics paper studying scientists who are funded by the National Institutes of Health, which is a wonderful institution, very important, and scientists who on paper seem to be the same, they're on the same career track, they've got very similar publication records, but who are instead funded by a private foundation called the Howard Hughes Medical Institute. Listeners who want to hear more about Howard Hughes and the Howard Hughes biography can go to the back catalog of Cautionary Tales. So the Howard Hughes Medical Institute basically takes risks. It wants people to do something new. It's happy with a high risk of failure as long as there's some chance of a real breakthrough success. And this particular paper, studying the results that come from these two funding systems, basically finds that the grant funders get what they pay for. So the National Institutes of Health get a high success rate, but it's often quite incremental progress. And the Howard Hughes Medical Institute has lots and lots of failures, but when it succeeds, it really succeeds, and these are hugely important papers. And I just feel that we're not deliberate enough about saying, well, how much of our funding ecosystem should be aiming, kind of venture capital style, for really big wins, and how much should be incremental. I think those are not the sorts of questions that get asked. So that's one of the things that worries me. What's your take, Jacob?
Speaker 1: One of the things that I have read is that over time the average age of grant recipients from the NIH has gone up and up and up. So you can kind of think of this whole universe as getting older and more risk-averse and more kind of bureaucratic. There is an interesting set of counter-pressures, I think, rising up partly out of Silicon Valley. I don't know if you've come across the work of Patrick Collison and his brother. They're from Ireland. They founded Stripe. They're very rich. Stripe is a big company that does basically payment stuff online. They have a really interesting set of kind of philanthropic endeavors around the idea of progress. You know, they're trying to create a kind of field of progress studies. It's very meta: what are the conditions that best foster technological and scientific progress?
Speaker 3: But this question from Doctor Couch is very meta. Yes, it's very much progress studies, isn't it?
Speaker 2: Yes.
Speaker 1: There is this institute in the Bay Area called the Arc Institute that hires leading scientists. The basic idea is: give talented people freedom and money, and encourage them to take big swings. And they are interested, you know, not just in outcomes but in new tools. Again, continuing on the nerdy thing, if you take a tool like CRISPR: CRISPR is an intermediate tool that allows people to cut up a genome, basically. We just had the first transplant from an animal into a human a few weeks ago because of CRISPR. We have the first treatments for sickle cell disease because of CRISPR. So I do think there is a wave of people trying to rethink scientific funding. There is a bottom-line aspect to this question from Doctor Couch that I don't feel like I know enough to answer, and I'm curious if you do. I mean, the question is: is the way we currently fund science feasible going forward? There is a yes-no version of the answer. Do you think you know? Is it?
Speaker 3: You don't know the answer, but I guess I don't know either. Another thing that concerns me on this, which is potentially really existential, is the question of how we fund new antibiotics. I think this points to a real weakness in the ecosystem of research funding. So if you think about the basic way we develop drugs, the fundamental incentive is the patent. So a drug company spends a lot of money, tries to develop drugs. Some of them work, some of them fail, a lot of them fail, but in the end you have a drug, and the deal is: we give you this artificial monopoly called a patent, and it will only last for a certain amount of time, which is kind of a problem because it takes so long to develop the drugs; maybe the patent's nearly expired. You can charge an incredible amount for these drugs for a while, and then your patents will run out, and then loads of other people will make the same drug and it'll come down in price. For example, Viagra: you know, you could sell this for a huge amount of money, and now Viagra is off patent and anyone can make a generic Viagra. So that's the incentive that we've given to private companies, that you will have this temporary monopoly. So now think about antibiotics. The problem with antibiotics: we have lots of antibiotics that work really well, except the bacteria have figured them out. And why have they figured them out? Because we've used them. The bacteria developed resistance. So what we really need is new antibiotics that we don't use. And now think about how the patent system deals with that. So you basically say: if you develop a new antibiotic, we'll give you a temporary monopoly, but we really need you to just not sell any, don't sell any of this thing except in cases of dire emergency.
Speaker 1: It's like break glass in case of emergency. I mean, I feel like a bounty... every econ nerd storyteller loves a good bounty, right? Wasn't this done with a malaria vaccine, which is actually coming along quite well? You have some universe of people say: we will pay a billion dollars to anyone who comes up with a new antibiotic that meets this set of criteria, that treats this set of bugs that are resistant to these existing antibiotics. And we'll give you the money, and you give it to us, and we'll put it on the break-glass-in-case-of-emergency shelf.
Speaker 3: You've used the word bounty, but the term that is normally used is an advance market commitment. These were proposed most famously by Michael Kremer, who's a Nobel Prize winner in economics. And basically, the way this prize, this bounty, tends to get paid is as a kind of extra payment on top of each dose you sell. Ah, so we'll give you extra for every kid that gets vaccinated.
Speaker 1: So we're back to the same problem in that universe.
Speaker 3: Back to the same problem. And you go, well, why does it have to be like that? It doesn't have to be like that. But the reason that they tend to be structured like that is that you need to demonstrate some kind of market demand. Somebody needs to be willing to buy your product, and if they are, we'll give you an extra payment for every product you sell. But that wouldn't work for antibiotics. There's so many different ways in which science funding could be said to be broken, but I think Doctor Couch is right to raise the issue. We need to do a lot more of this kind of meta thinking about progress studies.
Speaker 1: Tim, that was a lot of answer. Let's take a break. Cautionary Tales will be back in a moment.
Speaker 1: And we're back. I'm Jacob Goldstein, here with Tim Harford on Cautionary Tales.
Speaker 3: Hello, Jacob. More questions? What have you got for me?
Speaker 1: All right, Tim, I got another one for you. It comes from Graham in Florida. Graham writes: I'm fascinated by errors made by falsely identifying correlation as causation. What examples of this error stand out to you?
Speaker 3: I love the question, but I'm going to slightly sidestep it, because I'm worried by this question. I think that it's generally more complicated than simply some fool saw a correlation and thought it was causation.
Speaker 1: You don't want to talk about sunspots and crop yields.
Speaker 3: Well, let's talk about storks and babies briefly, okay?
Speaker 1: A classic. A classic of the genre.
Speaker 3: Yeah. The most successful popular statistics book ever, How
How 459 00:25:10,956 --> 00:25:14,156 Speaker 3: to Lie with Statistics by Daryl Huff includes this example 460 00:25:14,156 --> 00:25:17,276 Speaker 3: of stalks and babies and shows that there's a correlation 461 00:25:17,436 --> 00:25:20,076 Speaker 3: between the number of storks and the number of babies. 462 00:25:20,156 --> 00:25:22,716 Speaker 3: And there are various ways to demonstrate this correlation. One 463 00:25:22,756 --> 00:25:25,356 Speaker 3: way to do it is you just look at national populations. 464 00:25:25,596 --> 00:25:27,836 Speaker 3: You go, hey, countries with lots of storks also have 465 00:25:27,876 --> 00:25:30,196 Speaker 3: lots of babies, and there's a very very strong correlation. 466 00:25:30,676 --> 00:25:33,196 Speaker 3: But of course the reason is big places like the 467 00:25:33,276 --> 00:25:35,716 Speaker 3: United States have a lot of room for storks and 468 00:25:35,756 --> 00:25:38,396 Speaker 3: a lot of room for babies, and small places like 469 00:25:38,396 --> 00:25:40,796 Speaker 3: in a Luxembourg or the Vatican City don't have a 470 00:25:40,796 --> 00:25:42,036 Speaker 3: lot of room for babies and don't have a lot 471 00:25:42,036 --> 00:25:43,756 Speaker 3: of room for storks. So then you go, oh, that's 472 00:25:43,796 --> 00:25:47,196 Speaker 3: a great example of this mistake. The sting in the 473 00:25:47,196 --> 00:25:50,196 Speaker 3: tale of that story, as I describe in my book 474 00:25:50,236 --> 00:25:53,076 Speaker 3: The Data Detective, is that Daryl Huff, the guy who 475 00:25:53,356 --> 00:25:56,996 Speaker 3: created that story, then went on to tell the same 476 00:25:57,036 --> 00:26:03,196 Speaker 3: story in congressional testimony, saying that there was no compelling 477 00:26:03,196 --> 00:26:06,756 Speaker 3: evidence that smoking cigarettes would give you lung cancer, and 478 00:26:06,796 --> 00:26:08,556 Speaker 3: it was just like stalks and babies, and he was 479 00:26:08,556 --> 00:26:11,676 Speaker 3: actually high by the tobacco lobby. It was just correlation. 480 00:26:11,756 --> 00:26:15,716 Speaker 3: It was just correlational. They seemed about the same. Sure, yeah, 481 00:26:15,716 --> 00:26:17,716 Speaker 3: there are people who smoke, and that people who get cancer, 482 00:26:17,756 --> 00:26:21,116 Speaker 3: but there's no cause of evidence. One theory was cigarettes 483 00:26:21,156 --> 00:26:24,076 Speaker 3: are so soothing, and if you have some early symptoms 484 00:26:24,076 --> 00:26:26,716 Speaker 3: of lung cancer, maybe you soothe that with the soothing 485 00:26:26,956 --> 00:26:29,676 Speaker 3: vapors of cigarettes. I mean, it's completely ridiculous, but this 486 00:26:29,716 --> 00:26:32,316 Speaker 3: sort of rhetoric was deployed. And so one of the 487 00:26:32,396 --> 00:26:35,196 Speaker 3: things I'm very concerned about in my work on statistics 488 00:26:35,396 --> 00:26:38,916 Speaker 3: is that it's great to be skeptical about statistics and 489 00:26:38,956 --> 00:26:41,836 Speaker 3: to point out lots of examples of statistics being misused. 490 00:26:41,876 --> 00:26:44,516 Speaker 3: But if that's all you do, you just get to 491 00:26:44,596 --> 00:26:49,756 Speaker 3: a canihilistic situation where you're just constantly rejecting statistical evidence 492 00:26:49,796 --> 00:26:52,436 Speaker 3: because it's just another of those damn lines and statistics. 
493 00:26:52,556 --> 00:26:55,276 Speaker 3: When you look at the real world, I think this 494 00:26:55,356 --> 00:26:59,116 Speaker 3: gets to be incredibly interesting. So a real hot topic 495 00:26:59,156 --> 00:27:04,116 Speaker 3: at the moment is our smartphones destroying a generation. Basically, 496 00:27:04,316 --> 00:27:06,396 Speaker 3: are our kids having their mental health wrecked. 497 00:27:06,636 --> 00:27:08,716 Speaker 1: There's a new book that basically makes that up. 498 00:27:08,796 --> 00:27:12,476 Speaker 3: Yeah Yeah, by a prominent ecademic yea by Jonathan Hyde, 499 00:27:12,556 --> 00:27:14,356 Speaker 3: and there have been others by Gene Twegian. Lots of 500 00:27:14,356 --> 00:27:18,236 Speaker 3: people have said this, and the evidence for it is 501 00:27:18,876 --> 00:27:22,956 Speaker 3: mostly correlational, not completely. There are some experiments, but they're 502 00:27:22,956 --> 00:27:25,796 Speaker 3: not that convincing. None of them are perfect, but there 503 00:27:25,796 --> 00:27:28,396 Speaker 3: are lots of different ways of measuring it. And the 504 00:27:28,516 --> 00:27:32,676 Speaker 3: really interesting evidence basically says look, there appears to be 505 00:27:32,676 --> 00:27:35,876 Speaker 3: a mental health crisis. The kids seem to get really distressed, 506 00:27:35,876 --> 00:27:40,956 Speaker 3: particularly the girls, when children have access to social media 507 00:27:41,236 --> 00:27:44,436 Speaker 3: on their phones, and that happens around sometime between twenty 508 00:27:44,556 --> 00:27:47,516 Speaker 3: ten and twenty fourteen. And at the same time, suddenly 509 00:27:47,516 --> 00:27:51,636 Speaker 3: you've got this outbreak of suicidal ideation and self harm, 510 00:27:52,156 --> 00:27:55,156 Speaker 3: poor mental health and so on. And that's correlational evidence. 511 00:27:55,476 --> 00:27:58,476 Speaker 3: I don't completely believe it, but I don't completely not 512 00:27:58,516 --> 00:28:00,596 Speaker 3: believe it either. I think that's what makes it interesting 513 00:28:00,636 --> 00:28:03,356 Speaker 3: and makes it important to engage with. You have to 514 00:28:03,356 --> 00:28:05,916 Speaker 3: start going, well, what's the alternative explanation? Is there something 515 00:28:05,916 --> 00:28:09,076 Speaker 3: else that has happened sometime about sort of ten, twelve, 516 00:28:09,276 --> 00:28:12,796 Speaker 3: fourteen years ago that might explain this mental health distress? 517 00:28:12,836 --> 00:28:14,916 Speaker 3: So the timing of the Great Financial Crisis probably not 518 00:28:14,996 --> 00:28:18,476 Speaker 3: quite right. Donald Trump, maybe Donald Trump is upsetting the kids. 519 00:28:18,916 --> 00:28:21,836 Speaker 3: Timing doesn't work, so partly it does. The pattern of 520 00:28:21,876 --> 00:28:26,676 Speaker 3: the correlation make enough sense to explain this causal story, which, 521 00:28:26,716 --> 00:28:30,236 Speaker 3: funny enough, is basically exactly what the scientists finding a 522 00:28:30,236 --> 00:28:33,036 Speaker 3: connection between cigarettes and lung cancer were doing. They only 523 00:28:33,076 --> 00:28:35,476 Speaker 3: had correlational evidence for a long time. They weren't running 524 00:28:35,516 --> 00:28:38,316 Speaker 3: randomized trials saying you know, half of you smoke and 525 00:28:38,356 --> 00:28:40,276 Speaker 3: half of you don't smoke. I mean, they couldn't do that. 
526 00:28:40,556 --> 00:28:43,436 Speaker 3: They had to look at correlational evidence, and sometimes that's 527 00:28:43,516 --> 00:28:44,156 Speaker 3: what we've got. 528 00:28:44,596 --> 00:28:48,196 Speaker 1: This question got me thinking about the rise in social 529 00:28:48,276 --> 00:28:52,436 Speaker 1: science of what they call natural experiments, right, trying to 530 00:28:52,556 --> 00:28:56,876 Speaker 1: find instances in the real world where you have something 531 00:28:57,196 --> 00:28:59,476 Speaker 1: that obviously is not as good as a randomized trial, 532 00:28:59,476 --> 00:29:01,716 Speaker 1: because you just can't get that with large numbers of 533 00:29:01,716 --> 00:29:04,796 Speaker 1: people in the world, but that gives you some element 534 00:29:05,036 --> 00:29:09,756 Speaker 1: of randomization or pseudorandomization, something that allows you to try 535 00:29:09,796 --> 00:29:13,436 Speaker 1: and make the leap from correlation to causation. And you know, 536 00:29:13,556 --> 00:29:15,436 Speaker 1: this as far as I know, goes back to the 537 00:29:15,596 --> 00:29:18,316 Speaker 1: Vietnam War, right, where in the US there was a 538 00:29:18,396 --> 00:29:22,876 Speaker 1: draft lottery, and there were social scientists after the war 539 00:29:23,036 --> 00:29:25,516 Speaker 1: who looked and said, oh, look, here are people who 540 00:29:25,516 --> 00:29:28,996 Speaker 1: are at an aggregate level very similar on many dimensions 541 00:29:28,996 --> 00:29:33,516 Speaker 1: socioeconomic dimensions. We can look at people who randomly got 542 00:29:33,596 --> 00:29:37,196 Speaker 1: drafted versus those who randomly didn't, and who appear quite 543 00:29:37,236 --> 00:29:40,196 Speaker 1: similar in the aggregate, and see how their outcomes differ. 544 00:29:40,356 --> 00:29:43,396 Speaker 3: And that's a really good example of a natural experiment. 545 00:29:43,436 --> 00:29:45,516 Speaker 3: Because it's actually very close to a really. 546 00:29:45,396 --> 00:29:48,116 Speaker 1: Yeah, when you get a lottery in the real world. Now, 547 00:29:48,276 --> 00:29:49,876 Speaker 1: social scientists flocked to it. 548 00:29:49,996 --> 00:29:50,156 Speaker 3: Right. 549 00:29:50,236 --> 00:29:55,196 Speaker 1: Similarly, there was one in Oregon some years ago with Medicaid, 550 00:29:55,196 --> 00:29:58,756 Speaker 1: which is the healthcare program for low income people in 551 00:29:58,756 --> 00:30:02,116 Speaker 1: the US, and Oregon got some new Medicaid funds and 552 00:30:02,156 --> 00:30:06,316 Speaker 1: they randomly allocated them to a group of people over time, right, 553 00:30:06,436 --> 00:30:08,796 Speaker 1: so that social scientists could say, oh, look, here are 554 00:30:08,836 --> 00:30:11,316 Speaker 1: people all who are basically identical. Some of them got 555 00:30:11,356 --> 00:30:13,276 Speaker 1: this health insurance and some of them didn't. And in 556 00:30:13,316 --> 00:30:16,556 Speaker 1: that case the findings were quite interesting. Medicaid didn't appear 557 00:30:16,596 --> 00:30:19,636 Speaker 1: to be as helpful as I would have thought, as 558 00:30:19,676 --> 00:30:22,516 Speaker 1: the researchers themselves would have thought. According to the one 559 00:30:22,556 --> 00:30:26,716 Speaker 1: I interviewed, lowered mental health problems. 
People worried less about money, 560 00:30:26,716 --> 00:30:30,436 Speaker 1: but like their basic health outcomes didn't improve, which nobody 561 00:30:30,476 --> 00:30:33,596 Speaker 1: would have guessed. Right. And the evidence is quite robust. 562 00:30:34,116 --> 00:30:36,796 Speaker 1: When you don't have those lotteries. As you say, the 563 00:30:36,836 --> 00:30:39,556 Speaker 1: world is just hard to understand. I mean, even in 564 00:30:39,596 --> 00:30:43,596 Speaker 1: some instances where you do have randomized trials. People are complicated. 565 00:30:43,636 --> 00:30:47,556 Speaker 1: The body is complicated, society is complicated, and so you know, 566 00:30:47,716 --> 00:30:51,276 Speaker 1: correlation in a certain way I think is underrated, Like, yes, 567 00:30:51,356 --> 00:30:54,276 Speaker 1: obviously it doesn't equal causation, but it's a place to 568 00:30:54,316 --> 00:30:57,116 Speaker 1: start looking, right, it's a place to start asking questions. 569 00:30:57,156 --> 00:31:00,436 Speaker 3: I think that's absolutely right, and we need more randomized experiments. 570 00:31:00,956 --> 00:31:03,756 Speaker 3: There are more opportunities to run them than people seem 571 00:31:03,796 --> 00:31:05,396 Speaker 3: to think. For example, one of the things that John 572 00:31:05,476 --> 00:31:07,596 Speaker 3: Hyde in his book is arguing for is, you know, 573 00:31:07,716 --> 00:31:11,116 Speaker 3: shouldn't have smart funds in school Schools should be phone 574 00:31:11,156 --> 00:31:14,956 Speaker 3: free zones. I could completely imagine a state saying we're 575 00:31:14,956 --> 00:31:17,796 Speaker 3: going to introduce a rule whereby in all of the 576 00:31:17,876 --> 00:31:21,916 Speaker 3: schools in the state, no smartphones strictly forbidden. You have 577 00:31:21,956 --> 00:31:23,436 Speaker 3: to put them in a locker when you show up 578 00:31:23,476 --> 00:31:25,076 Speaker 3: and then unlock them at the end of the day. 579 00:31:25,396 --> 00:31:27,116 Speaker 3: You could introduce that rule and just go, well, we 580 00:31:27,396 --> 00:31:31,756 Speaker 3: introduce it for a semester at random in fifty percent 581 00:31:31,796 --> 00:31:34,076 Speaker 3: of the schools, and then we'll introduce it in the 582 00:31:34,116 --> 00:31:36,876 Speaker 3: other fifty percent of the schools in the next semester, 583 00:31:36,916 --> 00:31:38,916 Speaker 3: and we'll just randomize that because we want to know 584 00:31:38,956 --> 00:31:41,836 Speaker 3: whether there's any point in this experiment or not. That's 585 00:31:41,956 --> 00:31:44,916 Speaker 3: not very difficult to do. It would create so much 586 00:31:44,956 --> 00:31:49,636 Speaker 3: information about children's performance in the classroom, their mental health, 587 00:31:49,956 --> 00:31:51,876 Speaker 3: that could inform policy across the world. 588 00:31:52,196 --> 00:31:54,996 Speaker 1: That would be the dream. You know, there is now 589 00:31:55,036 --> 00:31:59,676 Speaker 1: a robust set of methods essentially where social scientists could 590 00:31:59,716 --> 00:32:02,516 Speaker 1: look at that state and compare it to neighboring states 591 00:32:02,636 --> 00:32:05,476 Speaker 1: and see the difference in differences, as they say, see 592 00:32:05,516 --> 00:32:08,716 Speaker 1: the change over time. And that's an instance where it 593 00:32:08,716 --> 00:32:12,156 Speaker 1: wouldn't as elegant as randomizing within a state. 
But I 594 00:32:12,156 --> 00:32:14,396 Speaker 1: feel like you could start to get pretty good data 595 00:32:14,476 --> 00:32:17,796 Speaker 1: even if you did one state compared to other states. 596 00:32:18,116 --> 00:32:20,876 Speaker 3: Yeah, if we've got no experiments, we should run them. 597 00:32:21,036 --> 00:32:23,196 Speaker 3: If we don't run them, there are still ways of 598 00:32:23,236 --> 00:32:27,196 Speaker 3: making correlation talk. Should we have another question? 599 00:32:28,236 --> 00:32:30,636 Speaker 1: Okay, Tim, we got one more. We had to have 600 00:32:30,836 --> 00:32:35,316 Speaker 1: one game-related question for you, right? Yes, you're welcome. This 601 00:32:35,356 --> 00:32:40,076 Speaker 1: one comes from Yost, who writes, Hi, Tim, diving into 602 00:32:40,076 --> 00:32:43,916 Speaker 1: the board-game-shaped flank you've now exposed for discussion. 603 00:32:44,116 --> 00:32:47,916 Speaker 1: That is a very gamer-ish way in. What board games 604 00:32:48,036 --> 00:32:51,556 Speaker 1: or board game mechanisms that you enjoy are particularly insightful 605 00:32:51,636 --> 00:32:55,596 Speaker 1: on some aspect of real-world economics, or offer a unique 606 00:32:55,636 --> 00:32:59,836 Speaker 1: angle to look at a maybe niche problem? And what 607 00:33:00,116 --> 00:33:03,516 Speaker 1: non-D and D games are you particularly enjoying right now? 608 00:33:04,236 --> 00:33:08,236 Speaker 1: Love the show, however many episodes you produce. That is 609 00:33:08,276 --> 00:33:11,996 Speaker 1: a listener. Yeah, yeah, so kind. 610 00:33:12,036 --> 00:33:13,996 Speaker 3: There are a couple of mechanisms that I do see 611 00:33:14,476 --> 00:33:17,676 Speaker 3: used quite often that shed important light on economics. So 612 00:33:17,756 --> 00:33:21,916 Speaker 3: one is trading. A lot of games involve trading. Now Monopoly, 613 00:33:22,036 --> 00:33:25,476 Speaker 3: that classic, is sadly not a very good game. Oh, 614 00:33:25,476 --> 00:33:27,716 Speaker 3: there's a Cautionary Tale about the history of Monopoly if 615 00:33:27,716 --> 00:33:31,796 Speaker 3: people want to listen. Monopoly in theory involves trading, but 616 00:33:31,876 --> 00:33:34,636 Speaker 3: in practice, not a lot of trading happens. It turns 617 00:33:34,676 --> 00:33:39,116 Speaker 3: out that you need to give people a reason to trade. 618 00:33:39,316 --> 00:33:42,276 Speaker 3: The great modern game Settlers, now about thirty years old... 619 00:33:42,396 --> 00:33:44,836 Speaker 1: By the way, Settlers, I assume that's Settlers of Catan. 620 00:33:45,076 --> 00:33:47,876 Speaker 1: As an outsider, I'm not on a first-name basis with the game. 621 00:33:47,836 --> 00:33:50,916 Speaker 3: Settlers of Catan is the game. It's super. It's 622 00:33:50,916 --> 00:33:53,676 Speaker 3: the game that Monopoly wishes it was. It's like if 623 00:33:53,716 --> 00:33:57,276 Speaker 3: Monopoly had been designed with a modern eye to be 624 00:33:57,396 --> 00:34:02,236 Speaker 3: super exciting. People need different resources, and the supply 625 00:34:02,316 --> 00:34:04,996 Speaker 3: of resources fluctuates, so sometimes you've got loads of wood, 626 00:34:05,076 --> 00:34:07,596 Speaker 3: sometimes there's no wood. So there's an active incentive to 627 00:34:07,636 --> 00:34:09,876 Speaker 3: trade all the time. By the way, if you don't trade, 628 00:34:09,876 --> 00:34:11,836 Speaker 3: every now and then the robber baron comes and takes stuff.
629 00:34:11,956 --> 00:34:14,876 Speaker 3: So it's an interesting insight that trading doesn't just happen 630 00:34:15,356 --> 00:34:18,076 Speaker 3: because it's allowed. There needs to be some difference in 631 00:34:18,196 --> 00:34:20,276 Speaker 3: value and perhaps some incentive to get on with it. 632 00:34:21,156 --> 00:34:25,076 Speaker 3: So my personal favorite game is Agricola, a wonderful 633 00:34:25,116 --> 00:34:27,996 Speaker 3: game about developing a farm, which doesn't sound promising, but 634 00:34:28,036 --> 00:34:30,556 Speaker 3: it's really really good. But one of the clever things 635 00:34:30,596 --> 00:34:33,596 Speaker 3: about Agricola is it uses an auction in an 636 00:34:33,596 --> 00:34:37,356 Speaker 3: interesting way. It's sometimes called a descending clock auction 637 00:34:37,556 --> 00:34:41,556 Speaker 3: or a Dutch auction. The prize gets more and more tempting, 638 00:34:42,236 --> 00:34:44,476 Speaker 3: so in Agricola there's just more and more good 639 00:34:44,476 --> 00:34:47,876 Speaker 3: stuff on the board. In a traditional descending clock auction, 640 00:34:47,956 --> 00:34:49,796 Speaker 3: basically the price is going down and down and down 641 00:34:49,796 --> 00:34:51,916 Speaker 3: and down, and everyone is just sitting there and then 642 00:34:51,956 --> 00:34:54,396 Speaker 3: it's a question of who grabs it first. The longer 643 00:34:54,436 --> 00:34:57,596 Speaker 3: you leave it, the better it gets, but then only 644 00:34:57,636 --> 00:34:59,796 Speaker 3: one person can have it. It's very very elegant. They 645 00:34:59,796 --> 00:35:03,676 Speaker 3: sell flowers that way at Aalsmeer in the Netherlands and it's just 646 00:35:03,716 --> 00:35:07,356 Speaker 3: incredibly quick, much much quicker than selling, say, art. You 647 00:35:07,356 --> 00:35:09,956 Speaker 3: know, if you're selling a Van Gogh, the price rises and rises 648 00:35:09,956 --> 00:35:10,316 Speaker 3: and rises. 649 00:35:10,436 --> 00:35:12,876 Speaker 1: So auction design is a whole thing, right? Yeah. When 650 00:35:12,916 --> 00:35:16,636 Speaker 1: would you choose the traditional prices-going-up, eBay or 651 00:35:16,676 --> 00:35:19,396 Speaker 1: art-style auction in the real world, and when would you 652 00:35:19,476 --> 00:35:22,156 Speaker 1: choose a Dutch, prices-going-down auction? What do you 653 00:35:22,236 --> 00:35:24,396 Speaker 1: get in a relative sense out of each one of those? 654 00:35:24,396 --> 00:35:28,156 Speaker 3: There's a couple of differences. One is that in certain 655 00:35:28,236 --> 00:35:32,316 Speaker 3: types of ascending auction you get to see people dropping 656 00:35:32,316 --> 00:35:34,316 Speaker 3: out as the price rises. It depends on the way 657 00:35:34,316 --> 00:35:37,036 Speaker 3: the auction is designed. But you could imagine an ascending 658 00:35:37,076 --> 00:35:39,316 Speaker 3: auction where you just see people going no, I'm out, 659 00:35:39,356 --> 00:35:40,396 Speaker 3: I'm out, I'm out. 660 00:35:40,716 --> 00:35:43,676 Speaker 1: Sort of the classic kind of art auction in a movie. 661 00:35:43,836 --> 00:35:46,436 Speaker 3: Yeah, and so you're learning information as the price rises. 662 00:35:46,476 --> 00:35:48,156 Speaker 3: How many people are still interested? Are there still ten 663 00:35:48,196 --> 00:35:51,676 Speaker 3: people interested? Is it just two people? So that generates information.
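The descending clock, or Dutch, auction Tim describes can also be sketched in a few lines. Everything in the sketch below is hypothetical (the bidder names, valuations, prices, and the `dutch_auction` function are invented for illustration, not part of the episode or of any real auction system): the price ticks down from a high starting point, and the first bidder willing to pay the current price presses the button and wins the lot.

```python
# Descending clock ("Dutch") auction on toy data; valuations and prices are illustrative.

valuations = {"Alice": 62, "Bob": 75, "Carol": 70}  # hypothetical private values for one lot

def dutch_auction(start_price, decrement, reserve):
    """Price ticks down; the first bidder willing to pay the current price wins."""
    price = start_price
    while price >= reserve:
        for bidder, value in valuations.items():
            if value >= price:          # this bidder presses the button
                return bidder, price
        price -= decrement              # nobody bought; the clock ticks down
    return None, None                   # lot unsold below the reserve

winner, price = dutch_auction(start_price=100, decrement=1, reserve=40)
print(winner, price)  # Bob 75 - the highest-value bidder wins in one quick pass
```

In this simple sketch each bidder presses as soon as the price reaches their private value, so the highest-value bidder wins almost at once; real bidders would hold out a little longer to pay less, which is exactly the guessing game the descending format creates.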
664 00:35:51,796 --> 00:35:56,276 Speaker 3: By generating that information, an ascending auction might raise the final price overall. 665 00:35:56,836 --> 00:36:00,196 Speaker 3: The advantage of the descending auction is it's just so quick. 666 00:36:00,436 --> 00:36:03,996 Speaker 3: You can run an auction in ten seconds. It's one 667 00:36:04,036 --> 00:36:06,436 Speaker 3: hundred thousand tulips. Here are the tulips. We're going to 668 00:36:06,516 --> 00:36:11,836 Speaker 3: sell them for whatever, ten thousand euros, nine 669 00:36:11,836 --> 00:36:14,516 Speaker 3: thousand, and it's literally a clock. The clock just 670 00:36:14,996 --> 00:36:17,756 Speaker 3: runs around showing what the price is and then someone 671 00:36:17,796 --> 00:36:21,516 Speaker 3: presses the button. Sold. Okay, now bring in the roses. So 672 00:36:21,556 --> 00:36:22,356 Speaker 3: it's very fast. 673 00:36:22,556 --> 00:36:24,436 Speaker 1: You would want to have a good idea of the 674 00:36:24,516 --> 00:36:26,396 Speaker 1: market clearing price if you were going to run that 675 00:36:26,556 --> 00:36:28,396 Speaker 1: kind of auction, so that you didn't sell it for 676 00:36:28,476 --> 00:36:31,236 Speaker 1: too cheap. Right, it's fast, but you run the risk 677 00:36:31,276 --> 00:36:32,796 Speaker 1: of selling it for too cheap. But if it's just 678 00:36:32,876 --> 00:36:36,156 Speaker 1: commodity tulips, then you basically know the market clearing price anyway. 679 00:36:36,236 --> 00:36:38,556 Speaker 3: Yeah, it's almost like a stock market for tulips. That, 680 00:36:38,716 --> 00:36:40,596 Speaker 3: I think, is why it works so well. Whereas if 681 00:36:40,676 --> 00:36:42,876 Speaker 3: it's a unique work of art, you need to give 682 00:36:42,956 --> 00:36:46,276 Speaker 3: maximum information, maximum comfort to people, maybe a little bit 683 00:36:46,316 --> 00:36:48,596 Speaker 3: of theater as well. We're not in a hurry to 684 00:36:48,636 --> 00:36:50,956 Speaker 3: run this auction. The cost of the auction itself is 685 00:36:51,076 --> 00:36:54,396 Speaker 3: trivial compared to the value of what's being sold. 686 00:36:54,996 --> 00:36:57,316 Speaker 1: Now, going back to Agricola and its descending auction, 687 00:36:57,516 --> 00:37:00,796 Speaker 1: it feels exciting, and it also feels like you're trying 688 00:37:00,876 --> 00:37:03,316 Speaker 1: to guess what is in your opponent's head, right, because 689 00:37:03,356 --> 00:37:06,756 Speaker 1: they're not revealing it as they would in an ascending price auction; 690 00:37:06,916 --> 00:37:08,876 Speaker 1: you have to think, oh, what are they willing 691 00:37:08,676 --> 00:37:10,956 Speaker 3: to pay for? Absolutely, and it works slightly differently, but 692 00:37:11,036 --> 00:37:14,516 Speaker 3: functionally what's going on is that every turn you 693 00:37:14,716 --> 00:37:17,476 Speaker 3: take you get to grab some resource on the board, 694 00:37:17,676 --> 00:37:19,676 Speaker 3: and then once you've grabbed the resource on the board, 695 00:37:19,996 --> 00:37:22,276 Speaker 3: no one else is allowed to grab it until the 696 00:37:22,356 --> 00:37:24,476 Speaker 3: next round. And there are several moves you can make 697 00:37:24,516 --> 00:37:26,156 Speaker 3: in each turn, so you might be able to get 698 00:37:26,196 --> 00:37:28,956 Speaker 3: two or three things each turn. And so you go 699 00:37:29,036 --> 00:37:30,996 Speaker 3: backwards and forwards. It's your move and it's my move.
700 00:37:31,076 --> 00:37:33,436 Speaker 3: And each time we want to grab something, and you 701 00:37:33,516 --> 00:37:34,796 Speaker 3: might go, well, do I want to grab it now? 702 00:37:34,876 --> 00:37:37,116 Speaker 3: Or do I want to wait until next turn, when 703 00:37:37,116 --> 00:37:38,836 Speaker 3: it might be better? And you're trying to figure out, like, 704 00:37:38,916 --> 00:37:41,796 Speaker 3: what is the thing that the other person is desperate 705 00:37:41,876 --> 00:37:45,716 Speaker 3: to have? Can I just let this pile get bigger, 706 00:37:46,196 --> 00:37:47,996 Speaker 3: because they can't afford to take it, because they've got 707 00:37:48,036 --> 00:37:50,276 Speaker 3: to use their move on something else? It's a very 708 00:37:50,436 --> 00:37:53,436 Speaker 3: very clever game. I do enjoy it. You also 709 00:37:53,516 --> 00:37:56,956 Speaker 3: asked what non-D and D games I am enjoying. 710 00:37:57,436 --> 00:38:00,916 Speaker 3: I am enjoying a modified version of Blades in the Dark, 711 00:38:01,156 --> 00:38:04,676 Speaker 3: which is a classic role-playing game, very fast-moving; you 712 00:38:04,756 --> 00:38:07,596 Speaker 3: get to do Ocean's Eleven or other kinds of heists, 713 00:38:07,996 --> 00:38:12,556 Speaker 3: and it's very modifiable. We're speaking on Monday. Just yesterday, 714 00:38:12,596 --> 00:38:14,996 Speaker 3: on Sunday, I had a whole bunch of old friends around 715 00:38:15,156 --> 00:38:18,036 Speaker 3: to my house and we just played this game all 716 00:38:18,116 --> 00:38:20,756 Speaker 3: day and we had an absolutely terrific time. 717 00:38:21,076 --> 00:38:23,636 Speaker 1: I feel like you're living the dream, Harford. 718 00:38:23,796 --> 00:38:24,676 Speaker 3: Makes me happy, living my best life. 719 00:38:25,676 --> 00:38:27,676 Speaker 3: Jacob, this has been such fun. Thank you so much 720 00:38:27,716 --> 00:38:28,156 Speaker 3: for doing this. 721 00:38:28,556 --> 00:38:31,116 Speaker 1: Ah, it was a delight. I'll come back anytime. 722 00:38:31,676 --> 00:38:35,036 Speaker 3: We would love to have you back. Thank you, everybody 723 00:38:35,036 --> 00:38:36,756 Speaker 3: who sent in a question. Sorry we weren't able to 724 00:38:36,796 --> 00:38:38,436 Speaker 3: answer all the questions, but we are going to be 725 00:38:38,516 --> 00:38:42,276 Speaker 3: back with another Cautionary Questions episode later this year. Please 726 00:38:42,356 --> 00:38:45,796 Speaker 3: do keep your queries coming, send them in to tales at 727 00:38:45,876 --> 00:38:49,036 Speaker 3: pushkin dot fm. That's t a l e s at 728 00:38:49,156 --> 00:38:53,236 Speaker 3: pushkin dot fm, and I'll be back with another Cautionary 729 00:38:53,316 --> 00:38:56,916 Speaker 3: Tale in two weeks' time. Jacob Goldstein has a wonderful 730 00:38:56,956 --> 00:39:00,316 Speaker 3: podcast called What's Your Problem? Thank you, Jacob. Thanks, Tim. 731 00:39:08,436 --> 00:39:11,956 Speaker 3: Cautionary Tales is written by me, Tim Harford, with Andrew Wright. 732 00:39:12,436 --> 00:39:15,716 Speaker 3: It's produced by Alice Fiennes with support from Marilyn Rust. 733 00:39:16,236 --> 00:39:18,756 Speaker 3: The sound design and original music are the work of 734 00:39:18,836 --> 00:39:23,716 Speaker 3: Pascal Wyse. Sarah Nics edited the scripts. It features the 735 00:39:23,796 --> 00:39:27,996 Speaker 3: voice talents of Ben Crowe, Melanie Guttridge, Stella Harford, Gemma 736 00:39:28,036 --> 00:39:31,876 Speaker 3: Saunders and Rufus Wright.
The show also wouldn't have been 737 00:39:31,956 --> 00:39:36,316 Speaker 3: possible without the work of Jacob Weisberg, Ryan Dilley, Greta Cohn, 738 00:39:36,836 --> 00:39:42,116 Speaker 3: Lital Molad, John Schnars, Eric Sandler, Carrie Brody, and Christina Sullivan. 739 00:39:42,956 --> 00:39:47,556 Speaker 3: Cautionary Tales is a production of Pushkin Industries. It's recorded 740 00:39:47,596 --> 00:39:51,316 Speaker 3: at Wardour Studios in London by Tom Berry. If you 741 00:39:51,556 --> 00:39:55,716 Speaker 3: like the show, please remember to share, rate and review, 742 00:39:56,276 --> 00:39:58,396 Speaker 3: tell your friends, and if you want to hear the 743 00:39:58,476 --> 00:40:01,916 Speaker 3: show ad-free, sign up for Pushkin Plus on the 744 00:40:01,996 --> 00:40:06,036 Speaker 3: show page in Apple Podcasts or at pushkin dot fm 745 00:40:06,436 --> 00:40:07,436 Speaker 3: slash plus.