Speaker 1: Pushkin. Let's play around... Wardour Studios. It sounds very elfin.

Speaker 2: I always think of David Bowie when I think of Wardour. In one of his early songs he talks about bright lights, Soho, Wardour Street.

Speaker 1: Bowie himself has an elfin aspect.

Speaker 2: He did his breakthrough gig at Aylesbury, my hometown, which is basically about the only remarkable thing about Aylesbury.

Speaker 1: Well, and also home of Tim Harford.

Speaker 3: As the plaque says.

Speaker 2: They've got a statue for Bowie, not yet for me. I don't know why.

Speaker 3: Here we go. Okay, let's go. Okay, I'm ready.

Speaker 2: Hello, and welcome back to another episode of Cautionary Questions, our first of twenty twenty four. I am, of course, Tim Harford. You are our loyal listeners. You've been sending in your burning questions on money, technology, economics and problem solving, and thank you so much to everyone who's done so, and today is the day I do my best to answer them. And thankfully I won't be alone in this endeavor. Here to help me out with both the questions and, I think, with some of the answers is the brilliant, brilliant Jacob Goldstein, the host of the Pushkin podcast What's Your Problem and author of the book Money: The True Story of a Made-Up Thing. Jacob was my inaugural Cautionary Questions co-host. Jacob, it's wonderful to have you back.

Speaker 1: Tim, it's an honor.

Speaker 2: Hi. So we should just get on with the questions, because we always have so many and so much to say. So what have you got in your big bag of listener questions for me?

Speaker 1: Let's start with a question from Alex in Melbourne, Australia. Alex writes: I work with artificial intelligence image generation software almost daily now, and I'm quickly seeing how so much of my workflows and processes can either be sped up or entirely replaced by AI, and this makes me nervous. And then he asks about universal basic income, right, this idea of a government giving all its citizens money. And he says it seems for the first time that computers and software will actually replace jobs in a deeply concerning way, which is both exciting and terrifying. What are your thoughts on UBI, universal basic income, as a solution to an AI crisis, and the widespread philosophical and economic implications of this?

Speaker 2: I love this question. I mean, it's so big, and I think the first thing to say is we don't really have any idea. If what Alex is thinking about comes true, and if most people just have no economic value (they have value as human beings, they have value as members of society, but there's nothing they could actually sell their labor to do), then that's completely uncharted territory. We've never been anywhere like that before, so everything we do is kind of speculative.

Speaker 1: We've feared it for a long time, right? We've had two hundred years of being afraid of technological unemployment. And my prior on this is to be somewhat skeptical. Right? Like, clearly there can be some large number of people who lose their jobs, and we should be concerned about that, and we should think about how to mitigate that. But the idea of more or less everybody losing their jobs I'm skeptical of, for the simple reason that it hasn't happened in two hundred years of incredible technological progress. And right now, after decades of extreme technological progress, in the US, unemployment is below four percent, the share of working-age people who are working is near all-time highs, and so somehow we keep coming up with new things to do for money, no matter how many things computers can do. And so my first thought is, I don't think we're going to have everybody losing their jobs to AI. I definitely could be wrong, but that's what I think.

Speaker 2: No, I think that's a good working assumption. If you think back a few centuries, basically almost all the labor that people did... they might wash their clothes occasionally. Well, that's been outsourced to the washing machine. They'd spend a lot of time moving water around: drinking water, cooking water, throwing out human excrement. That's all now handled by automated systems.

Speaker 1: Digging. People did a lot of digging, right, pulling a plow.

Speaker 2: Almost everything we used to do is now done by machines, but somehow we still all have jobs. Let's at least accept the premise that maybe this time it might be different, because the robots are doing everything. There's still material prosperity; there's food out there, and all the services we could possibly want. We just have to find some system whereby the humans who have no economic value get to enjoy all this cool stuff that's being produced by the machines.

Speaker 1: Yeah, and you know, technological prosperity has given us, in the developed world, UBI for old people. Right? In the US, as in every developed country as far as I know, once you get to some age, if you are a citizen and you have worked, the government gives you money every month until you die, right? And so you could imagine a kind of creeping extension of that. Certainly right now they're not talking about lowering the retirement age in the US; they're talking about raising it, as in many countries.

Speaker 2: Yeah, raising it in the UK as well. But it's still a long way away from life expectancy. My recollection is that when Bismarck introduced the first pension, which was in Germany in the late nineteenth century, I think the pension started to be paid at age sixty-seven, and life expectancy was sixty-three.

Speaker 1: You get negative four years in expectation.

Speaker 2: People who survive long enough to claim any pension at all are already exceptional. So that would be like having a pension today that started at age ninety.

Speaker 4: Like some people will get it, but most won't.
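A quick way to see what's going on here: life expectancy at birth averages over everyone, including those who die young, while pension costs depend only on the survivors. A minimal sketch of that distinction, with a deliberately made-up survival curve (toy numbers, not Bismarck-era Germany):

```python
# curve[a] = share of a birth cohort still alive at age a.
# A linear toy decline, purely for illustration.
curve = [max(1.0 - 0.012 * age, 0.0) for age in range(81)]

def pension_years(curve, pension_age):
    """Expected pension years: per person born, and per person reaching the age."""
    total = sum(curve[a] for a in range(pension_age, len(curve)))
    return total, total / curve[pension_age]

per_born, per_survivor = pension_years(curve, 67)
print(f"toy life expectancy at birth: {sum(curve):.1f} years")
print(f"pension years per person born: {per_born:.1f}")        # tiny: ~1.7
print(f"pension years per survivor at 67: {per_survivor:.1f}")  # ~8.4
```

Per person born, the scheme pays almost nothing; but the "exceptional" few who reach pension age still collect several years each, which is how "negative four years in expectation" and real pensioners coexist.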
Speaker 2: But actually a lot of people could easily collect thirty years of pension, certainly twenty years. So they're living a large proportion of their adult life receiving money from the state, and they're also receiving money from their own savings, their own investments, which might be a replacement for UBI. Maybe we all just have shares in the robots instead, shares in Google or whatever, and that's how we get paid.

Speaker 1: Yes. Whether that is mediated by the government or not looks somewhat different, right? Either the government is taxing the owners of Nvidia and Google stock and distributing the money, or everybody owns Google and Nvidia stock. There is the piece of this which is about the non-financial parts of work, right? I mean, there's a political economy piece, the sort of bridge: how do we get from here to there, if that happens? And that's complicated and maybe ugly.

Speaker 2: If the robots can do all of the stuff, if the computers can do it, there's no economic reason why humans couldn't just receive some allowance: they're given ten robots, or they're given ten thousand dollars a month to spend on whatever they like. There's no economic reason why that couldn't happen. But I think what you're getting at is: what does it do to us if we're in that situation?

Speaker 1: Yeah, yeah. Like, I don't want to not have a job. I recognize that I am fortunate to have a job that I enjoy, that I derive a part of my identity from. I recognize that for many, in fact probably most people, a job is not that; it's some unpleasant thing they do because they need the money, and if they got more money, they would quit their job. Well, you probably know the kind of empirical evidence on this better than I do. Like, people have looked at lottery winners, right, and my sense is it's not great for you to win the lottery and quit your job. It doesn't actually make you happier.

Speaker 2: Actually, the evidence on lottery winners is a bit mixed, and I think slightly skewed: we hear all the disaster stories of lottery winners.

Speaker 1: Actually, it's great to win the lottery? That's amazing. Is that true?

Speaker 4: I think it's fine. Yes, it's just fine to win the lottery.

Speaker 2: It's not really a problem to win the lottery, in my vague recollection; I wrote about this a couple of years ago. But we may not all want a job, but we all want something to do. We all want to feel useful. We all want a sense of some kind of purpose. We all want, I think, that experience of mastery, that experience of knowing that you can do something that not everybody can do. Those things don't have to come from a job, but for a lot of people they do come from a job.

Speaker 1: Yeah. For me, in my narrow, provincial experience of life, it's frankly hard to imagine getting those things without a job.

Speaker 2: Yeah. It comes back to my initial reaction to Alex's terrific question, which is that this is such unknown territory that we can only really speculate. But I think you and I, Jacob, are in agreement that the fundamental issue here is not economic. It's really to do with our souls. How would we react if our desire for mastery, our desire for meaning, our desire to feel useful, if that had to be satisfied without having a job? What would we do? And could we cope? And I don't know.

Speaker 1: I mean, Harford, if the robots come and take our jobs, let's just, you and me, make a podcast for free. I'm in. I'll commit right now, in the robot utopia apocalypse, to making a weekly podcast with you for free.

Speaker 4: Deal.

Speaker 3: Sorry, we have no more time. You must give in to your robot overlords.

Speaker 1: Give us another question. "Hello, this is Abanov. I would like to ask, with the onset of AI, what is the next cautionary tale you anticipate talking about in the years to come?"

Speaker 2: Hello, Abanov. I have just finished the first draft of a script about the coming of AI, and without giving too much away, the cautionary note is about what happens when automation gets so good that we lose our own skills. We hand over control to the machine, and then how do we respond when the machine says: actually, this one's too hard, could you take over? Suddenly we're back at the wheel and we're out of practice.

Speaker 1: Harford, it's not another plane crash, is it? Don't you have a moratorium on plane crashes at this point? Because it sounds like that.

Speaker 2: I'm going to say nothing more. Fair enough, let's go for a break.

Speaker 4: Right. Yeah, we're back.

Speaker 2: I'm talking to Jacob Goldstein, the host of What's Your Problem, and we are doing a Cautionary Questions Q&A episode. Jacob has the questions; Jacob's also helping me with the answers, and I'm really surplus to requirements here, but I'm doing my best. Jacob, what have you got for me?

Speaker 1: Tim, our next question comes from Adam, who writes: Hi, I have a question about investing in cryptocurrency.

Speaker 4: Oh dear, oh dear. Come on, give it to us. Okay.

Speaker 1: It's an interesting one. Traditionally, when buying shares in a company, you would be supporting that company to grow, create a new product, or enter a new retail space. This would hopefully create jobs and new products. When people invest money into cryptocurrency, or any currency for that matter, isn't that money just sitting around not doing anything until you withdraw it? Would investing in crypto be bad for the economy compared with investing in businesses?

Speaker 4: This is a deep question. I really like it.

Speaker 2: As the author of the wonderful book Money: The True Story of a Made-Up Thing, I am sure you have thoughts, Jacob. Let me have a first crack and you can tell me everything I've missed. Yeah, so let's say you buy bitcoin or whatever, and then the money's just sitting there, because bitcoin is not building anything. You're not buying an investment in, say, a road-building company; you're not buying an investment in Google, which is developing new technology. One of two possible things happens. Most likely, whoever you bought the bitcoin from now has your money. Now it's their money, and then they're going to do something with that money. So maybe they then buy shares in Google, or they then set up a business, or they will then go on to do something with that money, or they'll lend it to somebody else and that person will do something with it. Eventually the money will find its way into some productive investment. The fact that you bought bitcoin off somebody does not mean that they won't then do something useful with the money. But then let's say for some reason they just go: you know what, I'm just going to sit on this money. It's just dollars; I'm going to stick them under the mattress. I guess a bitcoin guy might do that. That would reduce inflationary pressures on the economy, which is something we want anyway. And if inflationary pressures are reduced too much, the Federal Reserve could then go: you know what, guys, we should probably print some money or otherwise stimulate the economy. Actually, the worst thing that could possibly happen is that the money gets invested in buying more computers that could be put to better use but in fact just end up doing bitcoin mining. I guess what I'm saying is, as long as they don't spend it on bitcoin mining computers, it's totally fine. Well, what am I missing?
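Tim's answer boils down to double-entry bookkeeping: the dollars that "go into" bitcoin don't sit inside the coin, they move to the seller's side of the ledger. A minimal toy sketch of that point (the names and amounts are hypothetical, not anyone's real accounts):

```python
# A toy ledger: dollars spent on bitcoin aren't parked "inside" the coin;
# they move to the seller, who can then invest them productively.
balances = {"adam": 10_000, "seller": 0, "index_fund": 0}

def transfer(ledger, payer, payee, amount):
    assert ledger[payer] >= amount, "insufficient funds"
    ledger[payer] -= amount
    ledger[payee] += amount

total_before = sum(balances.values())
transfer(balances, "adam", "seller", 5_000)        # Adam buys bitcoin
transfer(balances, "seller", "index_fund", 5_000)  # seller invests the proceeds
assert sum(balances.values()) == total_before      # no dollars left "sitting" in the coin

print(balances)  # {'adam': 5000, 'seller': 0, 'index_fund': 5000}
```

However the bitcoin price moves afterward, the dollar total is conserved at every step; what changes is only who holds the dollars and what they do with them next.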
Speaker 1: I went around the same track as you went around. And you know, different cryptocurrencies have different methods of regulation, right? So bitcoin by design is this very energy-intensive computing process in order to mine bitcoin, and that is, in my opinion, a bad outcome socially.

Speaker 2: Yes, I agree.

Speaker 1: Because you could imagine a world where maintaining the bitcoin network was socially desirable. In the world as it has evolved, I don't find it particularly socially desirable. But at this point, I don't think most of the money going into bitcoin is going to miners, right? There's a large stock of bitcoin that exists in the world. I mean, an interesting thing about money is it almost never sleeps. It keeps going, and so you can always ask what happens next. You know, with the stock market, sometimes people talk about money on the sidelines, when they're like: well, the stock market can go up some more because there's a lot of cash on the sidelines. And there's this famous billionaire investor, Cliff Asness. He's an interesting guy; he got a PhD from the University of Chicago, studied with Gene Fama, the famous economist. And Cliff Asness gets driven up the wall when people say "cash on the sidelines," for the same reason you just said, right? Like, every time somebody buys a share of stock, someone else is selling it, right? So the money is changing hands, the stock is changing hands, but there are no sidelines with money.

Speaker 2: Yeah. And even if someone just puts cash in a checking account, well, then the bank has money, and you can bet the bank is going to want to do something with that money, something productive. And even if they don't put it in the bank, even if, as I say, it's under a mattress, well, even then the Federal Reserve could always print money.

Speaker 1: Yeah, the stock of money is not fixed, right? I think there's an assumption in the question that goes back to the pre-modern era, the era before the nineteen-thirties, when money was based on gold or silver and it was finite. Right? There was a fixed amount of gold and silver in the world in those days. Indeed, if you sat on money, you were effectively reducing the supply of money in the world. But to your point, the supply of money in the world is as big as the central bank wants it to be, and so it's not a meaningful constraint on economic growth. So the ultimate answer to the question is: if you are just sitting on money, it's not really going to have an effect on the broader economy.

Speaker 2: Well, I think we've come to agreement. Hopefully that makes sense to Adam. Next question.

Speaker 1: Our next question comes from doctor Yvonne Couch, who is Associate Professor of Neuroimmunology, Alzheimer's Research UK Fellow, Associate Research Fellow at Saint Hilda's College, and stipendiary lecturer at Somerville College, University of Oxford.

Speaker 2: Yeah, University of Oxford, I know. Well, one of my sisters went to Saint Hilda's College and one of my sisters went to Somerville College, so this feels very much in the family. So what has doctor Couch got to say?

Speaker 1: She writes: My question was vaguely economics-based, although probably not very. Is the way we currently fund science feasible going forward? Do we just have too many scientists and not enough resources? All the best, Yvonne.

Speaker 2: Oh, love it. Let me throw out a few thoughts, then. I think that probably there is a problem with the way we fund science. But there's no one way that we fund science. You have university research; you have various grants, various sources of funding, philanthropy and so on. You also have private sector research, which is often incentivized by the patent system. And then you have big block grants that are handed out by agencies such as the National Institutes of Health. So there are lots of different ways that science gets funded. A couple of things worry me. First of all, there's a big incentive to make really incremental improvements rather than to take risks. There's a great economics paper studying scientists who are funded by the National Institutes of Health, which is a wonderful institution, very important, and scientists who on paper seem to be the same (they're on the same career tracks, have very similar publication records) but who are instead funded by a private foundation called the Howard Hughes Medical Institute. Listeners who want to hear more about Howard Hughes can go to the back catalog of Cautionary Tales. So the Howard Hughes Medical Institute basically takes risks. It wants people to do something new. It's happy with a high risk of failure as long as there's some chance of a real breakthrough success. And this particular paper, studying the results that come from these two funding systems, basically finds that the grant funders get what they pay for. So the National Institutes of Health gets a high success rate, but it's often quite incremental progress. And the Howard Hughes Medical Institute has lots and lots of failures, but when it succeeds, it really succeeds, and these are hugely important papers. And I just feel that we're not deliberate enough about saying: well, how much of our funding ecosystem should be aiming, venture capital style, for really big wins, and how much should be incremental? I think those are not the sorts of questions that get asked. So that's one of the things that worries me. But what's your take, Jacob?

Speaker 1: One of the things that I have read is that over time, the average age of grant recipients from the NIH has gone up and up and up, so you kind of think of this whole universe as getting older and more risk-averse and more bureaucratic.
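The "funders get what they pay for" result is easy to feel in a toy simulation: two grant portfolios with the same expected payoff, one made of safe incremental projects and one of long shots. A minimal sketch; the success probabilities and payoffs here are invented for illustration, not taken from the paper Tim mentions:

```python
import random

random.seed(0)

def portfolio(n_projects, p_success, payoff):
    """Total payoff of one funding round: each project independently succeeds or fails."""
    return sum(payoff for _ in range(n_projects) if random.random() < p_success)

def stats(xs):
    m = sum(xs) / len(xs)
    sd = (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5
    return m, sd

# Same expected value per round (100 * 0.80 * 1 == 100 * 0.05 * 16 == 80),
# very different distribution of outcomes.
safe = [portfolio(100, 0.80, 1) for _ in range(10_000)]       # frequent incremental wins
moonshot = [portfolio(100, 0.05, 16) for _ in range(10_000)]  # rare breakthroughs

for name, xs in [("safe", safe), ("moonshot", moonshot)]:
    m, sd = stats(xs)
    print(f"{name}: mean {m:.1f}, sd {sd:.1f}")  # ~80 either way; sd ~4 vs ~35
```

Both systems "work" on average; they differ in variance, which is exactly the dimension a funder's tolerance for failure selects on.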
Speaker 1: There is an interesting set of counter-pressures, I think, rising up partly out of Silicon Valley. I don't know if you've come across the work of Patrick Collison and his brother. They're from Ireland; they founded Stripe. They're very rich. Stripe is a big company that does, basically, payment stuff online. They have a really interesting set of kind of philanthropic endeavors around the idea of progress. You know, they're trying to create a kind of field of progress studies. That is, it's very meta: what are the conditions that best foster technological and scientific progress?

Speaker 2: But this question from doctor Couch is very much that. Yes, it's very much progress studies, isn't it?

Speaker 1: Yes. There's this institute in the Bay Area called the Arc Institute that hires leading scientists. The basic idea is: give talented people freedom and money, right, and encourage them to take big swings. And they are interested, you know, not just in outcomes but in new tools. Right, again, continuing on the nerdy thing: if you take a tool like CRISPR. CRISPR is an intermediate tool that allows people to cut up a genome, basically. We just had the first transplant from an animal into a human a few weeks ago because of CRISPR. We have the first treatments for sickle cell disease because of CRISPR. So I do think there is a wave of people trying to rethink scientific funding. There is a bottom-line aspect to this question from doctor Couch that I don't feel like I know enough to answer, and I'm curious if you do. I mean, the question is: is the way we currently fund science feasible going forward? There is a yes-no version of the answer. Do you think you know? Is it?

Speaker 2: I don't know the answer either. Another thing that concerns me on this, which is potentially really existential, is the question of how we fund new antibiotics.

Speaker 1: Huh.

Speaker 2: I think this points to a real weakness in the ecosystem of research funding. So if you think about the basic way we develop drugs, the fundamental incentive is the patent. So a drug company spends a lot of money, tries to develop drugs. Some of them work, some of them fail; a lot of them fail. But in the end you have a drug, and the deal is: we give you this artificial monopoly called a patent, and it will only last for a certain amount of time, which is kind of a problem, because it takes so long to develop the drugs, maybe the patent's nearly expired. You can charge an incredible amount for these drugs for a while, and then your patent will run out, and then loads of other people will make the same drug and it'll come down in price. For example, Viagra: you know, you could sell this for a huge amount of money, and now Viagra is off patent and anyone can make generic Viagra. So that's the incentive that we've given to private companies: you will have this temporary monopoly. So now think about antibiotics. The problem with antibiotics: we have lots of antibiotics that work really well, except the bacteria have figured them out. And why have they figured them out? Because we've used them; the bacteria developed resistance. So what we really need is new antibiotics that we don't use. And now think about how the patent system deals with that. So you basically say: if you develop a new antibiotic, we'll give you a temporary monopoly, but we really need you to just not sell any, don't sell any of this thing, except...

Speaker 1: In cases of dire emergency. It's like break glass in case of emergency. I mean, I feel like a bounty... every econ-nerd storyteller loves a good bounty, right? Wasn't this done with the malaria vaccine, which is actually coming along quite well? You have some universe of people say: we will pay a billion dollars to anyone who comes up with a new antibiotic that meets this set of criteria, that treats this set of bugs that are resistant to these existing antibiotics. And we'll give you the money, and you give it to us, and we'll put it on the break-glass-in-case-of-emergency shelf.

Speaker 2: You've used the word bounty, but the term that is normally used is an advance market commitment. These were proposed most famously by Michael Kremer, who's a Nobel Prize winner in economics. And basically the way this prize, this bounty, tends to get paid is as a kind of extra payment on top of each dose you sell.

Speaker 1: Huh.

Speaker 2: So we'll give you extra for every kid that gets vaccinated.

Speaker 1: So we're back to the same problem in that universe.

Speaker 2: Back to the same problem. And then we go: well, why does it have to be like that? It doesn't have to be like that. But the reason that they tend to be structured like that is because you need to demonstrate some kind of market demand. Somebody needs to be willing to buy your product; if they are, we'll give you an extra payment for every product you sell. But that wouldn't work for antibiotics. There are so many different ways in which science funding could be said to be broken, but I think doctor Couch is right to raise the issue. We need to do a lot more of this kind of meta thinking about progress studies.

Speaker 1: Tim, that was a lot of answer. Let's take a break. Cautionary Tales will be back in a moment.

Speaker 1: And we're back. I'm Jacob Goldstein, here with Tim Harford on Cautionary Tales.

Speaker 2: Hello, Jacob. More questions. What have you got for me?

Speaker 1: All right, Tim, I got another one for you. Comes from Graham in Florida. Graham writes: I'm fascinated by errors made by falsely identifying correlation as causation. What examples of this error stand out to you?

Speaker 2: I love the question, but I'm going to slightly sidestep it, because I'm worried by this question. I think that it's generally more complicated than simply: oh, some fool saw a correlation and thought it was causation.

Speaker 1: You don't want to talk about sunspots and crop yields? Well, let's talk about storks and babies briefly. Okay? A classic, a classic of the genre.

Speaker 2: Yeah. The most successful book ever published about statistics, How to Lie with Statistics by Darrell Huff, includes this example of storks and babies, and shows that there's a correlation between the number of storks and the number of babies. And there are various ways to demonstrate this correlation. One way to do it is you just look at national populations. You go: hey, countries with lots of storks also have lots of babies, and there's a very, very strong correlation. But of course the reason is big places like the United States have a lot of room for storks and a lot of room for babies, and small places like Luxembourg or the Vatican don't have a lot of room for babies and don't have a lot of room for storks. So then you go: oh, that's a great example of this mistake. The sting in the tail of that story, as I describe in my book The Data Detective, is that Darrell Huff, the guy who created that story, then went on to tell the same story in congressional testimony, saying that there was no compelling evidence that smoking cigarettes would give you lung cancer, and it was just like storks and babies. And he was actually hired by the tobacco lobby.

Speaker 1: It was just correlation.

Speaker 2: It was just correlational; they seemed about the same. Sure, yeah, there are people who smoke, and there are people who get cancer, but there's no causal evidence. One theory was cigarettes are so soothing, and if you have some early symptoms of lung cancer, maybe you soothed that with the soothing vapors of cigarettes. I mean, it's completely ridiculous, but this sort of rhetoric was deployed. And so one of the things I'm very concerned about in my work on statistics is that it's great to be skeptical about statistics and to point out lots of examples of statistics being misused. But if that's all you do, you just get to a nihilistic situation where you're constantly rejecting statistical evidence because it's just another of those damned lies and statistics. When you look at the real world, I think this gets to be incredibly interesting. So a real hot topic at the moment is: are smartphones destroying a generation? Basically, are our kids having their mental health wrecked?

Speaker 1: There's a new book that basically makes that case, yeah, by a prominent academic, Jonathan Haidt, and there have been others by Jean Twenge, and lots of people have said this. And the evidence for it is mostly correlational. Not completely.

Speaker 2: There are some experiments, but they're not that convincing. None of them are perfect, but there are lots of different ways of measuring it. And the really interesting evidence basically says: look, there appears to be a mental health crisis. The kids seem to get really distressed, particularly the girls, when children have access to social media on their phones, and that happens sometime between twenty ten and twenty fourteen, and at the same time, suddenly you've got this outbreak of suicidal ideation, self-harm, poor mental health, and so on. And that's correlational evidence. I don't completely believe it, but I don't completely not believe it either. I think that's what makes it interesting and makes it important to engage with. You have to start going: well, what's the alternative explanation? Is there something else that happened sometime about ten, twelve, fourteen years ago that might explain this mental health distress? So, the timing of the Great Financial Crisis: probably not quite right. Donald Trump? Maybe Donald Trump is upsetting the kids. The timing doesn't work. So partly it's: does the pattern of the correlation make enough sense to support this causal story? Which, funnily enough, is basically exactly what the scientists finding a connection between cigarettes and lung cancer were doing. They only had correlational evidence for a long time. They weren't running randomized trials saying, you know, half of you smoke and half of you don't smoke. I mean, they couldn't do that. They had to look at correlational evidence, and sometimes that's what we've got.

Speaker 1: This question got me thinking about the rise in social science of what they call natural experiments, right? Trying to find instances in the real world where you have something that obviously is not as good as a randomized trial, because you just can't get that with large numbers of people in the world, but that gives you some element of randomization or pseudo-randomization, something that allows you to try and make the leap from correlation to causation. And you know, this, as far as I know, goes back to the Vietnam War, right, where in the US there was a draft lottery, and there were social scientists after the war who looked and said: oh, look, here are people who are, at an aggregate level, very similar on many dimensions, socioeconomic dimensions. We can look at people who randomly got drafted versus those who randomly didn't, and who appear quite similar in the aggregate, and see how their outcomes differ.

Speaker 2: And that's a really good example of a natural experiment, because it's actually very close to a real experiment.

Speaker 1: Yeah. When you get a lottery in the real world now, social scientists flock to it, right?
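The draft-lottery logic is easy to demonstrate with simulated data: bake a true effect into the outcome, add a confounder that also drives who gets "treated," and compare the naive observational estimate with the lottery estimate. A minimal sketch; the effect sizes are arbitrary, chosen only to make the bias visible:

```python
import random

random.seed(1)

TRUE_EFFECT = 2.0  # the causal effect we bake in and then try to recover

def outcome(treated, hardship):
    # Hardship hurts the outcome directly AND (in the observational world)
    # makes treatment more likely: that's the confound.
    return TRUE_EFFECT * treated - 3.0 * hardship + random.gauss(0, 1)

def diff_in_means(rows):
    treated = [y for t, y in rows if t]
    control = [y for t, y in rows if not t]
    return sum(treated) / len(treated) - sum(control) / len(control)

observational, lottery = [], []
for _ in range(100_000):
    hardship = random.random()
    # Observational world: the worse off you are, the likelier you're treated.
    t_obs = random.random() < hardship
    observational.append((t_obs, outcome(t_obs, hardship)))
    # Lottery world: treatment is a coin flip, independent of hardship.
    t_lot = random.random() < 0.5
    lottery.append((t_lot, outcome(t_lot, hardship)))

print(f"naive observational estimate: {diff_in_means(observational):.2f}")  # ~1.0, biased
print(f"lottery estimate:             {diff_in_means(lottery):.2f}")        # ~2.0, the truth
```

The raw correlation understates the effect by half here, not because anyone made a foolish inference, but because treatment and hardship travel together; the coin flip is what breaks that link.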
549 00:29:27,410 --> 00:29:32,370 Speaker 1: Similarly, there was one in Oregon some years ago with Medicaid, 550 00:29:32,370 --> 00:29:35,930 Speaker 1: which is the healthcare program for low income people in 551 00:29:35,970 --> 00:29:39,330 Speaker 1: the US, and Oregon got some new Medicaid funds and 552 00:29:39,370 --> 00:29:43,530 Speaker 1: they randomly allocated them to a group of people over time, Right, 553 00:29:43,610 --> 00:29:46,010 Speaker 1: So that social scientists could say, oh, look, here are 554 00:29:46,050 --> 00:29:48,610 Speaker 1: people who are basically identical. Some of them got this 555 00:29:48,810 --> 00:29:50,570 Speaker 1: health insurance and some of them didn't, and in that 556 00:29:50,690 --> 00:29:53,890 Speaker 1: case the findings were quite interesting. Medicaid didn't appear to 557 00:29:53,930 --> 00:29:56,930 Speaker 1: be as helpful as I would have thought, as the 558 00:29:56,970 --> 00:30:00,490 Speaker 1: researchers themselves would have thought. According to the one I interviewed, 559 00:30:00,730 --> 00:30:04,010 Speaker 1: lowered mental health problems, people worried less about money, but 560 00:30:04,130 --> 00:30:07,890 Speaker 1: like their basic health outcomes didn't improve, which nobody would 561 00:30:07,890 --> 00:30:10,810 Speaker 1: have guessed right. And the the evidence is quite robust 562 00:30:11,290 --> 00:30:14,010 Speaker 1: when you don't have those lotteries. As you say, the 563 00:30:14,050 --> 00:30:16,730 Speaker 1: world is just hard to understand. I mean, even in 564 00:30:16,770 --> 00:30:20,810 Speaker 1: some instances where you do have randomized trials, people are complicated. 565 00:30:20,810 --> 00:30:24,730 Speaker 1: The body is complicated, society is complicated, and so you know, 566 00:30:24,890 --> 00:30:28,490 Speaker 1: correlation in a certain way I think is underrated. Like, yes, 567 00:30:28,570 --> 00:30:31,450 Speaker 1: obviously it doesn't equal causation, but it's a place to 568 00:30:31,530 --> 00:30:34,330 Speaker 1: start looking, right, It's a place to start asking questions. 569 00:30:34,370 --> 00:30:37,130 Speaker 2: I think that's absolutely right. And we need more randomized 570 00:30:37,130 --> 00:30:40,370 Speaker 2: experiments that there are more opportunities to run them than 571 00:30:40,410 --> 00:30:42,250 Speaker 2: people seem to think. For example, one of the things 572 00:30:42,290 --> 00:30:44,530 Speaker 2: that John Hyde in his book is arguing for is 573 00:30:44,610 --> 00:30:47,850 Speaker 2: you know, shouldn't have smartphones in schools. Schools should be 574 00:30:48,050 --> 00:30:51,690 Speaker 2: phone free zones. I could completely imagine a state saying 575 00:30:51,930 --> 00:30:54,650 Speaker 2: we're going to introduce a rule whereby in all of 576 00:30:54,690 --> 00:30:58,970 Speaker 2: the schools in the state, no smartphones strictly forbidden. You 577 00:30:59,010 --> 00:31:00,450 Speaker 2: have to put them in a locker when you show 578 00:31:00,530 --> 00:31:02,250 Speaker 2: up and then unlock them at the end of the day. 
579 00:31:02,610 --> 00:31:04,490 Speaker 2: You could introduce that rule and just go, well, you know, 580 00:31:04,570 --> 00:31:08,890 Speaker 2: introduce it for a semester at random in fifty percent 581 00:31:08,930 --> 00:31:11,290 Speaker 2: of the schools, and then we'll introduce it in the 582 00:31:11,290 --> 00:31:14,050 Speaker 2: other fifty percent of the schools in the next semester, 583 00:31:14,130 --> 00:31:16,130 Speaker 2: and we'll just randomize that, because we want to know 584 00:31:16,130 --> 00:31:19,090 Speaker 2: whether there's any point in this policy or not. That's 585 00:31:19,170 --> 00:31:22,090 Speaker 2: not very difficult to do. It would create so much 586 00:31:22,170 --> 00:31:26,810 Speaker 2: information about children's performance in the classroom and their mental health 587 00:31:27,130 --> 00:31:29,050 Speaker 2: that could inform policy across the world. 588 00:31:29,370 --> 00:31:32,210 Speaker 1: That would be the dream. You know, there is now 589 00:31:32,250 --> 00:31:36,890 Speaker 1: a robust set of methods, essentially, where social scientists could 590 00:31:36,890 --> 00:31:39,730 Speaker 1: look at that state and compare it to neighboring states 591 00:31:39,810 --> 00:31:42,690 Speaker 1: and see the difference in differences, as they say, see 592 00:31:42,690 --> 00:31:45,970 Speaker 1: the change over time. And that's an instance where it 593 00:31:45,970 --> 00:31:49,250 Speaker 1: wouldn't be as elegant as randomizing within a state, but 594 00:31:49,290 --> 00:31:51,010 Speaker 1: I feel like you could start to get pretty good 595 00:31:51,090 --> 00:31:54,970 Speaker 1: data even if you did one state compared to other states. 596 00:31:55,290 --> 00:31:58,130 Speaker 2: Yeah, if we've got no experiments, we should run them. 597 00:31:58,210 --> 00:32:00,370 Speaker 2: If we don't run them, there are still ways of 598 00:32:00,450 --> 00:32:01,450 Speaker 2: making correlation 599 00:32:01,730 --> 00:32:01,970 Speaker 4: talk. 600 00:32:03,410 --> 00:32:04,370 Speaker 2: Do we have another question? 601 00:32:05,450 --> 00:32:07,850 Speaker 1: Okay, Tim, we have one more. We had to have 602 00:32:08,010 --> 00:32:12,530 Speaker 1: one game-related question for you. Great, yes, you're welcome. This 603 00:32:12,530 --> 00:32:17,250 Speaker 1: one comes from Yost, who writes, Hi, Tim, diving into 604 00:32:17,290 --> 00:32:21,050 Speaker 1: the board-game-shaped flank you've now exposed for discussion. 605 00:32:21,330 --> 00:32:25,090 Speaker 1: That is a very gamerish way in. What board games 606 00:32:25,210 --> 00:32:28,730 Speaker 1: or board game mechanisms that you enjoy are particularly insightful 607 00:32:28,850 --> 00:32:32,770 Speaker 1: on some aspect of real-world economics, or offer a unique 608 00:32:32,810 --> 00:32:37,010 Speaker 1: angle to look at a maybe niche problem? And what 609 00:32:37,290 --> 00:32:40,730 Speaker 1: non D and D games are you particularly enjoying right now? 610 00:32:41,450 --> 00:32:45,450 Speaker 1: Love the show, however many episodes you produce. That is 611 00:32:45,490 --> 00:32:47,690 Speaker 1: a listener we all want. Thank you, Yost. 612 00:32:47,770 --> 00:32:50,410 Speaker 2: Yeah, so kind. There are a couple of mechanisms 613 00:32:50,450 --> 00:32:53,570 Speaker 2: that I do see used quite often that shed important 614 00:32:53,610 --> 00:32:56,450 Speaker 2: light on economics. So one is trading: a lot of 615 00:32:56,490 --> 00:33:00,930 Speaker 2: games involve trading.
Now Monopoly, that classic, is sadly not 616 00:33:01,050 --> 00:33:03,570 Speaker 2: a very good game. Oh, there's a Cautionary Tale we tell 617 00:33:03,610 --> 00:33:05,570 Speaker 2: about the history of Monopoly if people want to listen. 618 00:33:06,410 --> 00:33:09,970 Speaker 2: Monopoly in theory involves trading, but in practice, not a 619 00:33:09,970 --> 00:33:13,930 Speaker 2: lot of trading happens. It turns out that you need 620 00:33:13,970 --> 00:33:17,330 Speaker 2: to give people a reason to trade. The great modern 621 00:33:17,330 --> 00:33:19,530 Speaker 2: game Settlers, now about thirty years old. 622 00:33:19,610 --> 00:33:22,050 Speaker 1: By the way, Settlers, I assume, is that Settlers of Catan? 623 00:33:22,250 --> 00:33:25,170 Speaker 1: As an outsider, I'm not on a first-name basis 624 00:33:24,810 --> 00:33:25,250 Speaker 3: with the game. 625 00:33:25,410 --> 00:33:28,170 Speaker 2: Settlers of Catan is the game. It's super. It's the 626 00:33:28,170 --> 00:33:31,250 Speaker 2: game that Monopoly wishes it was. It's like if Monopoly 627 00:33:31,330 --> 00:33:35,410 Speaker 2: had been designed with a modern eye to be super exciting. 628 00:33:35,890 --> 00:33:40,690 Speaker 2: People need different resources, and the supply of resources fluctuates, 629 00:33:40,770 --> 00:33:43,290 Speaker 2: so sometimes you've got loads of wood, sometimes there's no wood, 630 00:33:43,490 --> 00:33:45,690 Speaker 2: so there's an active incentive to trade all the time. 631 00:33:45,890 --> 00:33:47,210 Speaker 2: Oh, and by the way, if you don't trade every 632 00:33:47,250 --> 00:33:49,050 Speaker 2: now and then, the robber baron comes and takes stuff. 633 00:33:49,170 --> 00:33:52,050 Speaker 2: So it's an interesting insight that trading doesn't just happen 634 00:33:52,530 --> 00:33:55,290 Speaker 2: because it's allowed. There needs to be some difference in 635 00:33:55,370 --> 00:33:57,210 Speaker 2: value and perhaps some incentive 636 00:33:56,730 --> 00:33:57,450 Speaker 4: to get on with it. 637 00:33:58,330 --> 00:34:02,450 Speaker 2: So my personal favorite game is Agricola, a wonderful game 638 00:34:02,490 --> 00:34:05,410 Speaker 2: about developing a farm, which doesn't sound promising, but it's 639 00:34:05,450 --> 00:34:08,010 Speaker 2: really, really good. But one of the clever things about 640 00:34:08,010 --> 00:34:11,610 Speaker 2: Agricola is it uses an auction in an interesting way. 641 00:34:12,090 --> 00:34:14,170 Speaker 2: It's what's sometimes called a descending 642 00:34:13,770 --> 00:34:15,610 Speaker 4: clock auction, or a Dutch auction. 643 00:34:16,210 --> 00:34:20,130 Speaker 2: The prize gets more and more tempting. So in 644 00:34:20,170 --> 00:34:22,610 Speaker 2: Agricola there's just more and more good stuff on the board. 645 00:34:22,930 --> 00:34:25,890 Speaker 2: In a traditional descending clock auction, basically the price is 646 00:34:25,930 --> 00:34:28,090 Speaker 2: going down and down and down and down, and everyone 647 00:34:28,130 --> 00:34:29,850 Speaker 2: is just sitting there, and then it's a question of 648 00:34:29,930 --> 00:34:32,490 Speaker 2: who grabs it first. The longer you leave it, the 649 00:34:32,530 --> 00:34:35,610 Speaker 2: better it gets, but then only one person can have 650 00:34:35,690 --> 00:34:38,850 Speaker 2: it. It's very, very elegant.
They sell flowers at Aalsmeer 651 00:34:39,090 --> 00:34:42,810 Speaker 2: in the Netherlands and it's just incredibly quick, much, much 652 00:34:42,850 --> 00:34:45,130 Speaker 2: quicker than selling, say, art. You know, you're selling a 653 00:34:45,210 --> 00:34:47,530 Speaker 2: Van Gogh? Ha, no. The price rises and rises and rises. 654 00:34:47,610 --> 00:34:50,090 Speaker 1: So auction design is a whole thing, right? Yeah. When 655 00:34:50,090 --> 00:34:53,810 Speaker 1: would you choose a traditional, prices-going-up, eBay art 656 00:34:53,850 --> 00:34:56,570 Speaker 1: style auction in the real world, and when would you 657 00:34:56,650 --> 00:34:59,370 Speaker 1: choose a Dutch, prices-going-down auction? What do you 658 00:34:59,450 --> 00:35:01,570 Speaker 1: get in a relative sense out of each one of those? 659 00:35:01,610 --> 00:35:05,330 Speaker 2: There's a couple of differences. One is that in certain 660 00:35:05,450 --> 00:35:09,490 Speaker 2: types of ascending auction you get to see people dropping 661 00:35:09,530 --> 00:35:11,490 Speaker 2: out as the price rises. It depends on the way 662 00:35:11,530 --> 00:35:14,210 Speaker 2: the auction is designed, but you could imagine an ascending 663 00:35:14,250 --> 00:35:16,490 Speaker 2: auction where you just see people going, no, I'm out, 664 00:35:16,570 --> 00:35:17,610 Speaker 2: I'm out, I'm out. 665 00:35:17,930 --> 00:35:20,850 Speaker 1: Sort of the classic kind of art auction in a movie. 666 00:35:21,050 --> 00:35:23,650 Speaker 2: Yeah, and so you're learning information as the price rises. 667 00:35:23,650 --> 00:35:25,370 Speaker 2: How many people are still interested? Are there still ten 668 00:35:25,370 --> 00:35:28,850 Speaker 2: people interested? Is it just two people? So that generates information. 669 00:35:28,970 --> 00:35:32,770 Speaker 2: By generating information, it might raise the final price 670 00:35:32,890 --> 00:35:33,450 Speaker 4: overall. 671 00:35:34,010 --> 00:35:37,370 Speaker 2: The advantage of the descending auction is it's just so quick. 672 00:35:37,610 --> 00:35:41,170 Speaker 2: You can run an auction in ten seconds. It's one 673 00:35:41,250 --> 00:35:43,650 Speaker 2: hundred thousand tulips. Here are the tulips. We're going to 674 00:35:43,690 --> 00:35:49,370 Speaker 2: sell them for whatever, ten thousand euros, nine thousand, eight thousand, 675 00:35:49,370 --> 00:35:52,730 Speaker 2: and it's literally a clock. The clock just runs around 676 00:35:52,970 --> 00:35:56,570 Speaker 2: showing what the price is, and then someone presses the button. Sold. Okay, 677 00:35:57,490 --> 00:35:59,570 Speaker 2: now bring in the roses. So it's very fast. 678 00:35:59,730 --> 00:36:01,610 Speaker 1: You would want to have a good idea of the 679 00:36:01,690 --> 00:36:03,610 Speaker 1: market clearing price if you were going to run that 680 00:36:03,770 --> 00:36:05,570 Speaker 1: kind of auction, so that you didn't sell it for 681 00:36:05,650 --> 00:36:08,410 Speaker 1: too cheap, right? It's fast, but you run the risk 682 00:36:08,450 --> 00:36:11,050 Speaker 1: of selling it for cheap. But if it's just commodity tulips, 683 00:36:11,090 --> 00:36:13,330 Speaker 1: then you basically know the market clearing price anyways. 684 00:36:13,450 --> 00:36:15,770 Speaker 2: Yeah, it's almost like a stock market for tulips. That 685 00:36:15,890 --> 00:36:17,810 Speaker 2: I think is why it works so well.
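The trade-off being described, speed versus information, shows up even in a toy simulation of the descending clock. Everything below is invented (the bidders, values, and price steps are assumptions, not Aalsmeer's actual mechanics), and it simplifies by assuming each bidder would press the button exactly at their private value, which real bidders would strategically shade.

```python
# Toy descending-clock ("Dutch") auction: the price falls until one bidder
# presses the button. Invented values; not a model of the real flower market.
import random

random.seed(2)

def dutch_auction(values, start_price, step=1):
    """Price ticks down; the first bidder whose value the clock reaches wins."""
    price = start_price
    while price > 0:
        for bidder, value in enumerate(values):
            if value >= price:
                return bidder, price   # sold on the first button press
        price -= step
    return None, 0                     # no sale

# Five bidders with private values for a lot of tulips (made up).
values = [random.randint(40, 100) for _ in range(5)]
winner, price = dutch_auction(values, start_price=120)
print(f"values={values} -> bidder {winner} wins at price {price}")
```

Note what the seller learns: only the winner's stopping point. In an ascending auction each drop-out reveals another bidder's limit, which is the extra information that can push the final price up.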
Whereas if 686 00:36:17,850 --> 00:36:20,090 Speaker 2: it's a unique work of art, you need to give 687 00:36:20,170 --> 00:36:23,490 Speaker 2: maximum information, maximum comfort to people, maybe a little bit 688 00:36:23,490 --> 00:36:25,770 Speaker 2: of theater as well. We're not in a hurry to 689 00:36:25,850 --> 00:36:28,170 Speaker 2: run this auction. The cost of the auction itself is 690 00:36:28,250 --> 00:36:31,610 Speaker 2: trivial compared to the value of what's being sold. 691 00:36:32,210 --> 00:36:34,530 Speaker 1: Now we go over to Agricola and have a descending auction. 692 00:36:34,730 --> 00:36:37,970 Speaker 1: It feels exciting, and it also feels like you're trying 693 00:36:38,050 --> 00:36:40,530 Speaker 1: to guess what is in your opponent's head, right, because 694 00:36:40,530 --> 00:36:43,930 Speaker 1: they're not revealing it as they would in an ascending price auction. 695 00:36:44,090 --> 00:36:45,930 Speaker 1: You have to think, oh, what are they willing to 696 00:36:46,010 --> 00:36:46,450 Speaker 1: pay for it? 697 00:36:46,570 --> 00:36:49,450 Speaker 2: Absolutely, and it works slightly differently, but functionally what's 698 00:36:49,530 --> 00:36:52,970 Speaker 2: going on is that every turn you take, you get 699 00:36:53,010 --> 00:36:55,490 Speaker 2: to grab some resource on the board, and then once 700 00:36:55,570 --> 00:36:57,610 Speaker 2: you've grabbed the resource on the board, no one else 701 00:36:57,730 --> 00:37:00,130 Speaker 2: is allowed to grab it until the next round. And 702 00:37:00,330 --> 00:37:02,330 Speaker 2: there are several moves you can make in each turn, 703 00:37:02,490 --> 00:37:03,930 Speaker 2: so you might be able to get two or three 704 00:37:04,010 --> 00:37:06,970 Speaker 2: things each turn. And so you go backwards and forwards: 705 00:37:07,010 --> 00:37:08,730 Speaker 2: it's your move and it's my move. And each time 706 00:37:08,770 --> 00:37:11,050 Speaker 2: we want to grab something, and you might go, well, 707 00:37:11,050 --> 00:37:12,330 Speaker 2: do I want to grab it now, or do I 708 00:37:12,370 --> 00:37:14,930 Speaker 2: want to wait until next turn, when it might be better? 709 00:37:14,970 --> 00:37:16,490 Speaker 2: And you're trying to figure out, like, what is the 710 00:37:16,650 --> 00:37:20,490 Speaker 2: thing that the other person is desperate to have? Can 711 00:37:20,570 --> 00:37:24,050 Speaker 2: I just let this pile get bigger, because they can't 712 00:37:24,090 --> 00:37:25,530 Speaker 2: afford to take it because they've got to use their 713 00:37:25,570 --> 00:37:28,330 Speaker 2: move on something else? It's a very, very clever game. 714 00:37:28,530 --> 00:37:29,250 Speaker 4: I do enjoy it. 715 00:37:29,970 --> 00:37:32,850 Speaker 2: Yost also asks what non D and D games I 716 00:37:33,090 --> 00:37:37,570 Speaker 2: am enjoying. I am enjoying a modified version of Blades 717 00:37:37,610 --> 00:37:41,050 Speaker 2: in the Dark, which is a classic role-playing game, very 718 00:37:41,130 --> 00:37:43,930 Speaker 2: fast moving. You get to do Ocean's Eleven or other 719 00:37:44,090 --> 00:37:48,170 Speaker 2: kinds of heists, and it's very modifiable. We're speaking on 720 00:37:48,250 --> 00:37:51,050 Speaker 2: a Monday.
Just yesterday, on Sunday, I had a whole bunch 721 00:37:51,130 --> 00:37:53,690 Speaker 2: of old friends around to my house and we just 722 00:37:53,730 --> 00:37:57,370 Speaker 2: played this game all day, and we had an absolutely 723 00:37:57,410 --> 00:37:57,930 Speaker 2: terrific time. 724 00:37:58,290 --> 00:38:01,090 Speaker 1: I feel like you're living the dream, Harford. Makes me happy. 725 00:38:00,970 --> 00:38:04,090 Speaker 2: Living my best life. Jacob, this has been such fun. 726 00:38:04,250 --> 00:38:05,330 Speaker 2: Thank you so much for doing this. 727 00:38:05,730 --> 00:38:08,290 Speaker 1: Ah, it was a delight. I'll come back anytime. 728 00:38:09,290 --> 00:38:10,370 Speaker 4: Love to have you back. 729 00:38:11,130 --> 00:38:13,330 Speaker 2: Thank you, everybody who sent in a question. Sorry we 730 00:38:13,410 --> 00:38:15,330 Speaker 2: weren't able to answer all the questions, but we are 731 00:38:15,370 --> 00:38:18,050 Speaker 2: going to be back with another Cautionary Questions episode later 732 00:38:18,130 --> 00:38:21,530 Speaker 2: this year. Please do keep your queries coming; send them 733 00:38:21,570 --> 00:38:25,010 Speaker 2: in to tales at pushkin dot fm. That's t a l 734 00:38:25,130 --> 00:38:29,170 Speaker 2: e s at pushkin dot fm, and I'll be back 735 00:38:29,410 --> 00:38:33,090 Speaker 2: with another Cautionary Tale in two weeks' time. Jacob Goldstein 736 00:38:33,330 --> 00:38:36,010 Speaker 2: has a wonderful podcast called What's Your Problem? 737 00:38:36,170 --> 00:38:37,450 Speaker 4: Thank you, Jacob. Thanks, Tim. 738 00:38:45,650 --> 00:38:49,130 Speaker 2: Cautionary Tales is written by me, Tim Harford, with Andrew Wright. 739 00:38:49,610 --> 00:38:52,890 Speaker 2: It's produced by Alice Fiennes, with support from Marilyn Rust. 740 00:38:53,410 --> 00:38:55,930 Speaker 2: The sound design and original music are the work of 741 00:38:56,050 --> 00:39:00,890 Speaker 2: Pascal Wyse. Sarah Nix edited the scripts. It features the 742 00:39:00,970 --> 00:39:05,730 Speaker 2: voice talents of Ben Crowe, Melanie Guttridge, Stella Harford, Gemma Saunders, 743 00:39:05,850 --> 00:39:09,570 Speaker 2: and Rufus Wright. The show also wouldn't have been possible 744 00:39:09,610 --> 00:39:13,530 Speaker 2: without the work of Jacob Weisberg, Ryan Dilley, Greta Cohn, 745 00:39:14,010 --> 00:39:19,330 Speaker 2: Lital Molad, John Schnars, Eric Sandler, Carrie Brody and Christina Sullivan. 746 00:39:20,130 --> 00:39:24,730 Speaker 2: Cautionary Tales is a production of Pushkin Industries. It's recorded 747 00:39:24,770 --> 00:39:28,490 Speaker 2: at Wardour Studios in London by Tom Berry. If you 748 00:39:28,730 --> 00:39:32,890 Speaker 2: like the show, please remember to share, rate and review, 749 00:39:33,490 --> 00:39:35,610 Speaker 2: tell your friends, and if you want to hear the 750 00:39:35,650 --> 00:39:39,090 Speaker 2: show ad-free, sign up for Pushkin Plus on the 751 00:39:39,210 --> 00:39:43,210 Speaker 2: show page in Apple Podcasts or at pushkin dot fm 752 00:39:43,610 --> 00:39:44,610 Speaker 2: slash plus.