1 00:00:03,040 --> 00:00:05,320 Speaker 1: Welcome to Stuff to Blow Your Mind, a production of 2 00:00:05,320 --> 00:00:14,840 Speaker 1: iHeartRadio. Hey, welcome to Stuff to Blow Your Mind. 3 00:00:15,000 --> 00:00:18,040 Speaker 1: My name is Robert Lamb, and I'm Joe McCormick. And 4 00:00:18,160 --> 00:00:21,079 Speaker 1: in today's episode, we are going to be focusing on 5 00:00:21,120 --> 00:00:24,400 Speaker 1: a topic that is already something that's very well known 6 00:00:24,600 --> 00:00:28,680 Speaker 1: to people who are familiar with quantitative research and statistics, 7 00:00:28,760 --> 00:00:32,159 Speaker 1: but less known to the general public. And uh and 8 00:00:32,200 --> 00:0034,479 Speaker 1: I think that's a tragedy because it's an idea that 9 00:00:34,479 --> 00:00:39,000 Speaker 1: should really be part of everybody's basic critical thinking toolkit, 10 00:00:39,080 --> 00:00:42,240 Speaker 1: no matter what your job is. And so in order 11 00:00:42,280 --> 00:00:44,840 Speaker 1: to introduce this concept, I thought it would be best 12 00:00:44,880 --> 00:00:47,760 Speaker 1: to start with a direct illustration from the 13 00:00:47,800 --> 00:00:52,959 Speaker 1: real world of people reaching incorrect conclusions by not understanding 14 00:00:53,000 --> 00:00:55,960 Speaker 1: the subject of today's episode. And so the illustration I 15 00:00:56,000 --> 00:00:59,000 Speaker 1: want to start with is an interesting story told by 16 00:00:59,200 --> 00:01:03,240 Speaker 1: the psychologist Daniel Kahneman that's about the illusory power 17 00:01:03,360 --> 00:01:07,560 Speaker 1: of screaming at pilots. Uh. So, the context of the 18 00:01:07,600 --> 00:01:10,480 Speaker 1: story is that Kahneman says he was giving a lecture 19 00:01:10,680 --> 00:01:15,360 Speaker 1: about positive reinforcement to a group of flight instructors.
I 20 00:01:15,360 --> 00:01:18,840 Speaker 1: think this was in the nineteen sixties, and Kahneman was 21 00:01:18,959 --> 00:01:22,600 Speaker 1: trying to inform them about what he believed at the 22 00:01:22,600 --> 00:01:25,839 Speaker 1: time was the best consensus of scientific research on learning 23 00:01:25,880 --> 00:01:29,640 Speaker 1: and reinforcement, which was that if these 24 00:01:29,640 --> 00:01:33,280 Speaker 1: flight instructors wanted their students to have the best possible outcomes, 25 00:01:33,600 --> 00:01:36,880 Speaker 1: they should focus more on praising the students when they 26 00:01:36,880 --> 00:01:39,560 Speaker 1: did well than on chewing them out when they did 27 00:01:39,600 --> 00:01:43,880 Speaker 1: something wrong. And Kahneman says that when he finished his talk, 28 00:01:44,319 --> 00:01:46,440 Speaker 1: one of the flight instructors that he had been giving 29 00:01:46,440 --> 00:01:49,360 Speaker 1: this lecture to got up and tried to dispute him. 30 00:01:49,360 --> 00:01:52,240 Speaker 1: He said, no, you're wrong. And so the direct quote 31 00:01:52,240 --> 00:01:55,880 Speaker 1: Kahneman gives from the instructor here is: on many occasions, 32 00:01:55,920 --> 00:01:59,360 Speaker 1: I have praised flight cadets for clean execution of some 33 00:01:59,480 --> 00:02:02,560 Speaker 1: aerobatic maneuver, and in general when they try it 34 00:02:02,600 --> 00:02:06,040 Speaker 1: again they do worse. On the other hand, I've often 35 00:02:06,160 --> 00:02:10,359 Speaker 1: screamed at cadets for bad execution, and in general they 36 00:02:10,440 --> 00:02:13,280 Speaker 1: do better the next time. So please don't tell us 37 00:02:13,320 --> 00:02:17,320 Speaker 1: that reinforcement works and punishment does not, because the opposite 38 00:02:17,440 --> 00:02:20,160 Speaker 1: is the case.
So you might think he has a 39 00:02:20,160 --> 00:02:23,080 Speaker 1: good point here. If you accept that this flight instructor 40 00:02:23,160 --> 00:02:26,359 Speaker 1: has had a lot of direct experience working with students, 41 00:02:26,800 --> 00:02:30,320 Speaker 1: and you trust him to remember the relative frequency of 42 00:02:30,360 --> 00:02:33,720 Speaker 1: these events pretty well, you might assume that he has 43 00:02:33,760 --> 00:02:37,320 Speaker 1: a meaningful rebuke to Kahneman here. Again, he says 44 00:02:37,400 --> 00:02:39,800 Speaker 1: that most of the time, after a cadet does something 45 00:02:39,919 --> 00:02:42,600 Speaker 1: bad and he screams at them, they do better the 46 00:02:42,639 --> 00:02:45,560 Speaker 1: next time. And after a cadet does something good and 47 00:02:45,639 --> 00:02:48,600 Speaker 1: he praises them, they actually do worse the next time. 48 00:02:49,040 --> 00:02:52,359 Speaker 1: So if he's remembering these experiences correctly, and he's had 49 00:02:52,400 --> 00:02:55,160 Speaker 1: a lot of them, it would really seem like evidence 50 00:02:55,240 --> 00:02:59,160 Speaker 1: that praise has a negative effect on learning, maybe by 51 00:02:59,280 --> 00:03:03,040 Speaker 1: making the student pilots soft and overconfident or something, and 52 00:03:03,080 --> 00:03:06,639 Speaker 1: getting chewed out is good for skill development.
I think 53 00:03:06,680 --> 00:03:08,880 Speaker 1: it's quite easy to see the allure of this 54 00:03:09,000 --> 00:03:12,680 Speaker 1: false conclusion, right? Right. And you can also 55 00:03:12,680 --> 00:03:15,120 Speaker 1: easily imagine how you kind of build upon this with 56 00:03:15,280 --> 00:03:20,240 Speaker 1: certain loosely backed-up, you know, folk ideas about how 57 00:03:20,280 --> 00:03:22,839 Speaker 1: you encourage people and how people learn, and you got 58 00:03:22,840 --> 00:03:24,360 Speaker 1: to stay on them; if you tell them 59 00:03:24,360 --> 00:03:27,200 Speaker 1: they're doing a good job, they'll get lazy. Right, folk wisdom, 60 00:03:27,280 --> 00:03:30,680 Speaker 1: tough-guy mentality. Yeah. But Kahneman saw something different in 61 00:03:30,680 --> 00:03:33,920 Speaker 1: this response, and he says that he immediately set up 62 00:03:33,919 --> 00:03:38,040 Speaker 1: an experiment on the spot to demonstrate the flaw in 63 00:03:38,120 --> 00:03:41,040 Speaker 1: the flight instructor's thinking here. So I want to read 64 00:03:41,080 --> 00:03:45,560 Speaker 1: from Kahneman's description. He says: I immediately arranged a demonstration 65 00:03:45,600 --> 00:03:49,240 Speaker 1: in which each participant tossed two coins at a target 66 00:03:49,320 --> 00:03:53,880 Speaker 1: behind his back without any feedback. We measured the distances 67 00:03:53,920 --> 00:03:56,280 Speaker 1: from the target and could see that those who had 68 00:03:56,320 --> 00:04:01,080 Speaker 1: done best the first time had mostly deteriorated on their second try, 69 00:04:01,120 --> 00:04:04,680 Speaker 1: and vice versa. But I knew that this demonstration would 70 00:04:04,680 --> 00:04:10,120 Speaker 1: not undo the effects of lifelong exposure to a perverse contingency.
71 00:04:10,720 --> 00:04:13,440 Speaker 1: So to explain this experiment a little bit better, right, 72 00:04:13,440 --> 00:04:16,120 Speaker 1: he has people stand with their backs to a target 73 00:04:16,240 --> 00:04:18,560 Speaker 1: so they couldn't see it, and they would take two 74 00:04:18,560 --> 00:04:21,719 Speaker 1: attempts to throw a coin and hit the target without 75 00:04:21,800 --> 00:04:24,240 Speaker 1: any feedback of any kind. So they're not getting praised, 76 00:04:24,279 --> 00:04:27,839 Speaker 1: they're not getting chewed out, nothing. And after staging 77 00:04:27,880 --> 00:04:30,560 Speaker 1: a number of these, he found again what he suspected: 78 00:04:30,839 --> 00:04:33,360 Speaker 1: that the people who were the closest on the first 79 00:04:33,400 --> 00:04:36,640 Speaker 1: throw did worse on their second throw, and the people 80 00:04:36,640 --> 00:04:39,839 Speaker 1: who were farthest away on their first throw tended to 81 00:04:39,880 --> 00:04:43,560 Speaker 1: do better on the second throw. So what Kahneman is 82 00:04:43,560 --> 00:04:47,200 Speaker 1: actually demonstrating here is something that doesn't really have anything 83 00:04:47,240 --> 00:04:51,320 Speaker 1: to do with learning or reinforcement, or really skills or 84 00:04:51,360 --> 00:04:56,280 Speaker 1: even human psychology. Instead, this demonstration is showing the effects 85 00:04:56,320 --> 00:05:00,640 Speaker 1: of chance, luck, and statistics. What he was showing is 86 00:05:00,720 --> 00:05:05,240 Speaker 1: the subject we're talking about today: regression to the mean. 87 00:05:05,520 --> 00:05:07,719 Speaker 1: You'll see that phrase a lot in scientific 88 00:05:07,839 --> 00:05:10,200 Speaker 1: literature and in statistics.
But if it helps to put 89 00:05:10,240 --> 00:05:13,880 Speaker 1: it in more everyday terms, anytime you see regression to 90 00:05:13,960 --> 00:05:16,440 Speaker 1: the mean, you can translate it in your head as 91 00:05:16,680 --> 00:05:21,479 Speaker 1: trending toward the average. Trending toward the average. So, to 92 00:05:21,560 --> 00:05:26,440 Speaker 1: make the coin tossing illustration even clearer, imagine you throw 93 00:05:26,480 --> 00:05:28,719 Speaker 1: the coin not twice, but that you throw the coin 94 00:05:28,880 --> 00:05:31,920 Speaker 1: a hundred times. So you stand there throwing the coin 95 00:05:32,000 --> 00:05:35,160 Speaker 1: a hundred times. And then let's say afterwards you average 96 00:05:35,200 --> 00:05:39,920 Speaker 1: together the distance from the target across all a hundred throws, 97 00:05:40,320 --> 00:05:42,880 Speaker 1: and you'll come up with some kind of average distance 98 00:05:42,920 --> 00:05:45,520 Speaker 1: from target. Just to make up a number for 99 00:05:45,520 --> 00:05:47,800 Speaker 1: the sake of argument, Kahneman doesn't give this, but let's 100 00:05:47,800 --> 00:05:50,560 Speaker 1: say the average distance from the target across all your 101 00:05:50,600 --> 00:05:54,240 Speaker 1: throws is nine centimeters. And remember that you're getting no 102 00:05:54,440 --> 00:05:57,000 Speaker 1: feedback at all here, so it's unlikely that you will 103 00:05:57,040 --> 00:06:00,280 Speaker 1: be getting much better as you go on. So even 104 00:06:00,320 --> 00:06:03,440 Speaker 1: though the average distance from the target is nine centimeters, 105 00:06:03,680 --> 00:06:05,719 Speaker 1: if you throw a coin once and it happens to 106 00:06:05,760 --> 00:06:09,800 Speaker 1: be two centimeters from the target, that's really close, is your 107 00:06:09,839 --> 00:06:12,760 Speaker 1: next throw likely to be about the same as that one, 108 00:06:13,080 --> 00:06:17,720 Speaker 1: better, or worse?
Obviously, it is overwhelmingly likely that your 109 00:06:17,720 --> 00:06:21,320 Speaker 1: next throw will be worse, just due to chance, probably 110 00:06:21,440 --> 00:06:25,479 Speaker 1: closer to the average of nine centimeters away. And the 111 00:06:25,520 --> 00:06:27,840 Speaker 1: same goes for throws that are really far off. If 112 00:06:27,839 --> 00:06:31,920 Speaker 1: you throw something three hundred centimeters off, your next random toss, 113 00:06:32,080 --> 00:06:35,840 Speaker 1: just by chance, is likely to be much better, much closer. So, 114 00:06:35,880 --> 00:06:39,799 Speaker 1: simply put, most of the time, if you're sampling something 115 00:06:39,960 --> 00:06:44,120 Speaker 1: in a series over time, if one sample produces an 116 00:06:44,120 --> 00:06:48,520 Speaker 1: extreme value, the next one in the series is more 117 00:06:48,560 --> 00:06:52,039 Speaker 1: likely to be closer to the average instead of extreme 118 00:06:52,120 --> 00:06:55,480 Speaker 1: in the same way. In my experience, uh, this is 119 00:06:55,600 --> 00:06:58,320 Speaker 1: why it can sometimes be liberating to start 120 00:06:58,360 --> 00:07:02,240 Speaker 1: off a game of bowling with just a disastrous gutter ball, 121 00:07:02,720 --> 00:07:06,159 Speaker 1: because I know that I'm good enough that that's 122 00:07:06,160 --> 00:07:08,320 Speaker 1: probably not gonna happen twice in a row, but it's 123 00:07:08,360 --> 00:07:10,560 Speaker 1: definitely going to happen at some point in the game 124 00:07:10,600 --> 00:07:14,520 Speaker 1: because I'm not that good, you know. I play, 125 00:07:14,640 --> 00:07:16,920 Speaker 1: you know, once a year or even less frequently 126 00:07:17,040 --> 00:07:19,960 Speaker 1: these days. Oh yeah.
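[Editor's note: to put some numbers on the blind coin-toss idea, here's a quick Python sketch. To be clear, this is our illustration, not Kahneman's actual demonstration: the nine-centimeter average matches the made-up number above, and the noise model is invented.]

```python
import random

random.seed(42)

# Simulate blind coin tosses: every throw's distance from the target is
# drawn from the same distribution each time, so there is no learning.
def throw():
    # Hypothetical model: distances averaging roughly 9 cm, like the
    # made-up number in the discussion above.
    return abs(random.gauss(9, 5))

pairs = [(throw(), throw()) for _ in range(100_000)]

# Of the first throws that were unusually good (within 2 cm),
# how often is the second throw worse?
good_firsts = [(a, b) for a, b in pairs if a <= 2]
worse_after_good = sum(1 for a, b in good_firsts if b > a) / len(good_firsts)

# Of the first throws that were unusually bad (20 cm or more off),
# how often is the second throw better?
bad_firsts = [(a, b) for a, b in pairs if a >= 20]
better_after_bad = sum(1 for a, b in bad_firsts if b < a) / len(bad_firsts)

print(f"worse after a great first throw: {worse_after_good:.0%}")
print(f"better after a bad first throw:  {better_after_bad:.0%}")
```

[In this toy model, both fractions come out very high, roughly ninety percent or more: an extreme throw in either direction is usually followed by something closer to the average, with no feedback and no skill change involved.]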
And it's also, like, why I 127 00:07:20,000 --> 00:07:22,000 Speaker 1: think a lot of us have intuitions that when you 128 00:07:22,040 --> 00:07:24,480 Speaker 1: try something for the first time and you do really 129 00:07:24,520 --> 00:07:27,080 Speaker 1: well on the first attempt, that makes you kind of 130 00:07:27,120 --> 00:07:29,760 Speaker 1: nervous, because you just know you're probably not gonna live 131 00:07:29,840 --> 00:07:32,200 Speaker 1: up to that repeatedly. Yeah, like if you 132 00:07:32,200 --> 00:07:35,160 Speaker 1: get a strike that first time, then that first 133 00:07:35,400 --> 00:07:37,400 Speaker 1: um, what is it, round? I can't even remember, this 134 00:07:37,400 --> 00:07:41,000 Speaker 1: is how frequently I bowl. Um, the first roll. So 135 00:07:41,120 --> 00:07:44,880 Speaker 1: the first roll, the first column, you know. So 136 00:07:44,960 --> 00:07:48,480 Speaker 1: the tendency of regression to the mean, or trending 137 00:07:48,520 --> 00:07:51,480 Speaker 1: towards the average, is pretty obvious when you're dealing with 138 00:07:51,520 --> 00:07:55,400 Speaker 1: something like lots of random coin tosses with no feedback, 139 00:07:56,080 --> 00:07:59,200 Speaker 1: but it becomes much more obscure when you're dealing with, 140 00:07:59,240 --> 00:08:02,840 Speaker 1: say, a more limited number of outcomes in 141 00:08:02,880 --> 00:08:08,280 Speaker 1: the series you're looking at, and introducing possibly influential variables 142 00:08:08,320 --> 00:08:12,360 Speaker 1: like pilot skill and instructor feedback. After all, we would 143 00:08:12,360 --> 00:08:16,320 Speaker 1: expect that some variables having to do with instructor feedback 144 00:08:16,640 --> 00:08:19,520 Speaker 1: should have an effect on pilot skill, right? That's the 145 00:08:19,560 --> 00:08:22,000 Speaker 1: point of teaching, to have an effect over time.
146 00:08:22,400 --> 00:08:25,400 Speaker 1: And after all, in this one scenario that Kahneman describes, 147 00:08:25,480 --> 00:08:29,080 Speaker 1: the instructor believed that his verbal abuse of the 148 00:08:29,120 --> 00:08:33,120 Speaker 1: students was so motivating that it made them instantly better 149 00:08:33,160 --> 00:08:36,480 Speaker 1: on the stick. And you can't necessarily rule that out, 150 00:08:36,920 --> 00:08:40,720 Speaker 1: but it's unlikely, I think. I'm convinced that regression to 151 00:08:40,760 --> 00:08:45,000 Speaker 1: the mean could more easily explain this flight instructor's belief 152 00:08:45,040 --> 00:08:48,280 Speaker 1: that screaming at pilots for screw-ups made them better 153 00:08:48,320 --> 00:08:51,360 Speaker 1: at planes. Because, again, on average, even in the absence 154 00:08:51,400 --> 00:08:54,760 Speaker 1: of any feedback at all, if a pilot in training 155 00:08:55,040 --> 00:09:00,000 Speaker 1: executes a maneuver perfectly, the random fluctuation from one execution 156 00:09:00,040 --> 00:09:02,400 Speaker 1: to the next will tend to mean that their 157 00:09:02,480 --> 00:09:05,439 Speaker 1: next attempt probably won't be as good as that really 158 00:09:05,440 --> 00:09:08,079 Speaker 1: good one the last time. And likewise, if they make 159 00:09:08,120 --> 00:09:11,640 Speaker 1: a major error, totally botch a maneuver, they're more likely to 160 00:09:11,679 --> 00:09:14,600 Speaker 1: do better the next time just by chance. Both of 161 00:09:14,640 --> 00:09:18,160 Speaker 1: these tendencies are regression towards the mean.
But then Kahneman 162 00:09:18,200 --> 00:09:21,920 Speaker 1: actually draws a really interesting observation about our 163 00:09:21,960 --> 00:09:25,880 Speaker 1: psychology and about culture from this fact. So, to quote 164 00:09:25,920 --> 00:09:29,240 Speaker 1: him directly: this was a joyous moment in which I 165 00:09:29,320 --> 00:09:33,160 Speaker 1: understood an important truth about the world. Because we tend 166 00:09:33,200 --> 00:09:36,640 Speaker 1: to reward others when they do well and punish them 167 00:09:36,640 --> 00:09:39,920 Speaker 1: when they do badly, and because there is regression to 168 00:09:40,000 --> 00:09:43,360 Speaker 1: the mean, it is part of the human condition that 169 00:09:43,400 --> 00:09:47,760 Speaker 1: we are statistically punished for rewarding others and rewarded for 170 00:09:47,840 --> 00:09:50,520 Speaker 1: punishing them. And that was one of those things that, 171 00:09:50,559 --> 00:09:52,720 Speaker 1: when I read it, I was just like, oh my god, 172 00:09:53,000 --> 00:09:57,800 Speaker 1: that's so true. Um, yeah. And in this specific instance, 173 00:09:58,280 --> 00:10:00,920 Speaker 1: it makes me think about the effect of regression 174 00:10:00,960 --> 00:10:04,880 Speaker 1: to the mean fallacies on motivating belief in the effectiveness 175 00:10:04,960 --> 00:10:07,520 Speaker 1: of not just screaming at pilots in this one case, 176 00:10:07,559 --> 00:10:12,319 Speaker 1: but all kinds of punishment behaviors, for example, corporal punishment.
177 00:10:12,760 --> 00:10:15,120 Speaker 1: Thankfully you hear this less often these days, but I 178 00:10:15,120 --> 00:10:18,000 Speaker 1: remember when I was younger, I used to hear people 179 00:10:18,000 --> 00:10:22,480 Speaker 1: who would defend the parental practice of spanking children by saying, 180 00:10:22,480 --> 00:10:24,120 Speaker 1: you know, I don't care what the 181 00:10:24,160 --> 00:10:26,440 Speaker 1: scientists say. I don't care what the research says. I 182 00:10:26,520 --> 00:10:30,280 Speaker 1: know from experience that it works. To the extent that 183 00:10:30,320 --> 00:10:33,400 Speaker 1: comments like this were based on any real experience and 184 00:10:33,440 --> 00:10:36,040 Speaker 1: observation, and not just sort of a free-form, self 185 00:10:36,120 --> 00:10:39,280 Speaker 1: justifying statement that had nothing to do with experience, I 186 00:10:39,320 --> 00:10:43,080 Speaker 1: bet a lot of it was fallacious inference of causation, 187 00:10:43,240 --> 00:10:46,080 Speaker 1: actually based on regression to the mean, just like in 188 00:10:46,120 --> 00:10:49,040 Speaker 1: this Kahneman example. But anyway, I thought it would be 189 00:10:49,080 --> 00:10:51,960 Speaker 1: interesting to talk a bit more about regression to the 190 00:10:52,000 --> 00:10:54,400 Speaker 1: mean today because it's one of those things that, again, 191 00:10:54,480 --> 00:10:57,600 Speaker 1: once you see it, it's pretty simple, it's 192 00:10:57,679 --> 00:11:01,080 Speaker 1: actually pretty clear. But understanding it can help you have 193 00:11:01,120 --> 00:11:04,000 Speaker 1: a better sense of how good science works and help 194 00:11:04,160 --> 00:11:08,960 Speaker 1: keep you from drawing hasty inferences in everyday life. Yeah, 195 00:11:09,040 --> 00:11:12,240 Speaker 1: because it is.
It is interesting how kind of 196 00:11:12,320 --> 00:11:15,480 Speaker 1: insidious the results can be, the idea that, again, 197 00:11:16,240 --> 00:11:19,719 Speaker 1: praise is ultimately punished because there's going to be a 198 00:11:19,800 --> 00:11:22,240 Speaker 1: regression to the mean, 199 00:11:22,600 --> 00:11:26,560 Speaker 1: and then likewise there can be this illusion that 200 00:11:27,640 --> 00:11:30,360 Speaker 1: screaming at pilots and so forth is going 201 00:11:30,400 --> 00:11:33,840 Speaker 1: to be the successful way to go about things. Um, so, yeah, 202 00:11:33,880 --> 00:11:35,599 Speaker 1: I think this is an important episode to 203 00:11:35,640 --> 00:11:37,679 Speaker 1: cover, because it's the 204 00:11:37,760 --> 00:11:39,720 Speaker 1: kind of tool you need tucked in your 205 00:11:39,720 --> 00:11:42,559 Speaker 1: back pocket, even if you're just doing something like 206 00:11:42,640 --> 00:11:45,800 Speaker 1: scanning science headlines on, you know, a news server 207 00:11:46,120 --> 00:11:50,640 Speaker 1: or social media message board. Yeah, because of course, understanding 208 00:11:50,640 --> 00:11:54,480 Speaker 1: regression to the mean is extremely important in what scientists 209 00:11:54,480 --> 00:11:58,640 Speaker 1: do when they design good experiments. If you don't take 210 00:11:58,679 --> 00:12:01,960 Speaker 1: into account regression to the mean, you can incorrectly believe 211 00:12:02,000 --> 00:12:05,079 Speaker 1: you have discovered some kind of tiger repellent or something. 212 00:12:05,120 --> 00:12:08,000 Speaker 1: This concern plays a huge role in the history of medicine. 213 00:12:08,400 --> 00:12:10,760 Speaker 1: It's part of the design of good medical research, or 214 00:12:10,840 --> 00:12:15,040 Speaker 1: really any field that seeks to find remedies for problems.
215 00:12:15,679 --> 00:12:19,719 Speaker 1: So consider a very basic hypothetical, uh, patent medicine, say 216 00:12:19,800 --> 00:12:21,880 Speaker 1: from a hundred years ago. So, you know, you have 217 00:12:22,000 --> 00:12:25,600 Speaker 1: a foot pain that you've never really had before. 218 00:12:26,120 --> 00:12:27,600 Speaker 1: You know, you want it to go away. So you 219 00:12:27,679 --> 00:12:29,960 Speaker 1: go to the store and you buy a bottle of 220 00:12:30,080 --> 00:12:34,000 Speaker 1: Doctor Feel Great's No-Fail Panacea for tumors, ulcers, cramps, 221 00:12:34,000 --> 00:12:37,319 Speaker 1: and rheums, and you pull the cork out, you 222 00:12:37,400 --> 00:12:40,120 Speaker 1: chug it, and then the next day your foot feels better. 223 00:12:40,720 --> 00:12:44,040 Speaker 1: Now you can conclude from this that the Doctor Feel 224 00:12:44,040 --> 00:12:47,320 Speaker 1: Great's cured you. But how do you know actually that 225 00:12:47,400 --> 00:12:50,440 Speaker 1: the feelings in your foot didn't just regress to the mean, 226 00:12:50,640 --> 00:12:53,800 Speaker 1: because the average is a low amount or no amount 227 00:12:53,800 --> 00:12:56,480 Speaker 1: of foot pain? And if you don't have a medication 228 00:12:56,520 --> 00:13:00,240 Speaker 1: that's tested with control groups and randomized allocation into 229 00:13:00,240 --> 00:13:03,320 Speaker 1: the groups, then how do you know that the 230 00:13:03,360 --> 00:13:06,800 Speaker 1: medicine actually did anything at all? Yeah. So in many 231 00:13:06,840 --> 00:13:09,199 Speaker 1: of the examples you see for this, and the applications, 232 00:13:09,240 --> 00:13:12,240 Speaker 1: you're dealing with some sort of situation in the world 233 00:13:12,320 --> 00:13:17,120 Speaker 1: where there is fluctuation and/or change happening, often separately 234 00:13:17,200 --> 00:13:19,200 Speaker 1: from whatever is being tested.
So in this case, yeah, 235 00:13:19,200 --> 00:13:22,400 Speaker 1: the Doctor Feel Great's could have just been, like, just water, 236 00:13:22,640 --> 00:13:25,080 Speaker 1: you know, but there is the illusion 237 00:13:25,120 --> 00:13:28,040 Speaker 1: that it worked because things got better. But if you 238 00:13:28,040 --> 00:13:30,199 Speaker 1: don't have a control group, and, you know, to 239 00:13:30,280 --> 00:13:31,800 Speaker 1: drive home what that is, that would be like if 240 00:13:31,840 --> 00:13:34,360 Speaker 1: you had, like, three different groups in a 241 00:13:34,400 --> 00:13:38,120 Speaker 1: study of Doctor Feel Great's elixir here. One group was 242 00:13:38,200 --> 00:13:42,360 Speaker 1: taking Doctor Feel Great's elixir, another group was taking, I 243 00:13:42,440 --> 00:13:44,480 Speaker 1: don't know, let's say a half dose of Feel Great's 244 00:13:44,640 --> 00:13:48,280 Speaker 1: or maybe a competitor's tonic. And then one group, the 245 00:13:48,360 --> 00:13:52,760 Speaker 1: control group, was taking nothing, or was taking, you know, 246 00:13:52,880 --> 00:13:56,600 Speaker 1: just water or something to that effect, something completely inert. 247 00:13:56,720 --> 00:14:00,680 Speaker 1: And that would be the group 248 00:14:00,720 --> 00:14:04,000 Speaker 1: that you would judge the results of the other categories by, right? 249 00:14:04,080 --> 00:14:06,520 Speaker 1: And you would need to randomly sort the people into 250 00:14:06,559 --> 00:14:08,959 Speaker 1: those groups, so it wasn't just that, you know, 251 00:14:09,360 --> 00:14:12,640 Speaker 1: only the people with real severe foot pain
were taking 252 00:14:12,720 --> 00:14:15,760 Speaker 1: the Doctor Feel Great's, because the more extreme their pain 253 00:14:15,840 --> 00:14:18,880 Speaker 1: to begin with, probably the more likely they are to 254 00:14:19,160 --> 00:14:22,200 Speaker 1: have that pain be lessened or go away over time, 255 00:14:22,360 --> 00:14:25,160 Speaker 1: just naturally, right? And I'm going to have 256 00:14:25,240 --> 00:14:27,560 Speaker 1: a more specific example of this a little later in 257 00:14:27,600 --> 00:14:29,400 Speaker 1: the podcast. So if you still don't get it, 258 00:14:29,520 --> 00:14:31,880 Speaker 1: just hang on, we'll have another example in a bit. 259 00:14:38,280 --> 00:14:40,520 Speaker 1: I was looking at an article in the British Medical 260 00:14:40,600 --> 00:14:44,200 Speaker 1: Journal from 1994 that was just a collection of different 261 00:14:44,240 --> 00:14:48,240 Speaker 1: examples of regression to the mean in real-life medical research. 262 00:14:48,360 --> 00:14:51,840 Speaker 1: This was by J. Martin Bland and Douglas G. Altman, 263 00:14:52,480 --> 00:14:55,880 Speaker 1: called Statistics Notes: Some examples of regression towards the mean, 264 00:14:56,560 --> 00:14:59,040 Speaker 1: and they point out a very common type of example, 265 00:14:59,160 --> 00:15:01,120 Speaker 1: so this will be similar to what we just 266 00:15:01,160 --> 00:15:04,600 Speaker 1: talked about. The authors write: In clinical practice, there are 267 00:15:04,680 --> 00:15:09,680 Speaker 1: many measurements such as weight, serum cholesterol concentration, or blood pressure, 268 00:15:10,120 --> 00:15:13,840 Speaker 1: for which particularly high or low values are signs of 269 00:15:14,000 --> 00:15:18,480 Speaker 1: underlying disease or risk factors for disease.
People with extreme 270 00:15:18,680 --> 00:15:21,800 Speaker 1: values of the measurements, such as high blood pressure, may 271 00:15:21,880 --> 00:15:24,440 Speaker 1: be treated to bring their values closer to the mean. 272 00:15:25,000 --> 00:15:27,400 Speaker 1: If they are measured again, we will observe that the 273 00:15:27,520 --> 00:15:30,120 Speaker 1: mean of the extreme group is now closer to the 274 00:15:30,200 --> 00:15:33,280 Speaker 1: mean of the whole population. That is, it is reduced. 275 00:15:33,800 --> 00:15:36,520 Speaker 1: This should not be interpreted as showing the effect of 276 00:15:36,560 --> 00:15:39,760 Speaker 1: the treatment. Even if subjects are not treated, the mean 277 00:15:39,840 --> 00:15:43,560 Speaker 1: blood pressure will go down owing to regression towards the mean. 278 00:15:43,600 --> 00:15:47,040 Speaker 1: So again, if something starts with an extreme value, in certain 279 00:15:47,080 --> 00:15:49,600 Speaker 1: types of cases you would just expect it to have 280 00:15:49,720 --> 00:15:54,280 Speaker 1: a less extreme value the next time due to random fluctuation. 281 00:15:55,280 --> 00:15:57,480 Speaker 1: So again, you know, this could fill you with 282 00:15:57,600 --> 00:16:00,320 Speaker 1: despair, because you might wonder, well, then how could you 283 00:16:00,400 --> 00:16:03,320 Speaker 1: ever know if a treatment was effective or not? But again, 284 00:16:03,400 --> 00:16:06,160 Speaker 1: this is where the standard practices of science-based medicine 285 00:16:06,200 --> 00:16:09,040 Speaker 1: come into play. Instead of just taking people with some 286 00:16:09,200 --> 00:16:13,400 Speaker 1: extreme measurement and giving them a treatment, you randomize them 287 00:16:13,520 --> 00:16:15,800 Speaker 1: into test groups and control groups like we were just 288 00:16:15,880 --> 00:16:18,000 Speaker 1: talking about.
So if you have a large enough sample 289 00:16:18,360 --> 00:16:21,520 Speaker 1: and you properly randomize the groups, people with the extreme starting 290 00:16:21,560 --> 00:16:24,840 Speaker 1: conditions will somewhat regress toward the mean, but they will 291 00:16:24,880 --> 00:16:27,800 Speaker 1: all regress toward the mean at the same rate on average, 292 00:16:27,920 --> 00:16:31,120 Speaker 1: whether they're receiving a real potential treatment or they're in 293 00:16:31,160 --> 00:16:34,480 Speaker 1: the placebo group. But if the treatment actually does something helpful, 294 00:16:34,680 --> 00:16:38,080 Speaker 1: this effect will manifest as the difference between the two groups. 295 00:16:38,760 --> 00:16:42,320 Speaker 1: So good scientific research, good medical research, has methods for 296 00:16:42,400 --> 00:16:45,360 Speaker 1: excluding the effects of regression to the mean on their findings. 297 00:16:45,480 --> 00:16:49,160 Speaker 1: We have the tools, but we can still fall into 298 00:16:49,240 --> 00:16:53,080 Speaker 1: the trap of regression to the mean fallacies, especially in 299 00:16:53,120 --> 00:16:56,040 Speaker 1: our day-to-day lives, drawing inferences the way that 300 00:16:56,240 --> 00:16:59,520 Speaker 1: the flight instructor in the Kahneman story did, or 301 00:16:59,600 --> 00:17:03,280 Speaker 1: even in science if we're not careful and deliberate about designing experiments.
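[Editor's note: that logic, where both groups regress and the treatment shows up as the gap between them, can be sketched in a toy Python simulation. Everything here is invented for illustration: the hypothetical "true" blood pressure distribution, the measurement noise, the 160 recruitment cutoff, and a treatment assumed to lower true pressure by eight points.]

```python
import random

random.seed(0)

# Hypothetical model: each person has a stable "true" blood pressure,
# and any single reading bounces around it with measurement noise.
def reading(true_bp):
    return true_bp + random.gauss(0, 10)

people = [random.gauss(130, 15) for _ in range(200_000)]

# Recruit only people whose FIRST reading is extreme (>= 160), the way
# a trial recruits on a screening measurement.
recruits = [bp for bp in people if reading(bp) >= 160]

# Randomize into a placebo group and a group whose treatment is
# assumed (for this sketch) to lower true blood pressure by 8 points.
random.shuffle(recruits)
half = len(recruits) // 2
placebo = [reading(bp) for bp in recruits[:half]]
treated = [reading(bp - 8) for bp in recruits[half:]]

mean = lambda xs: sum(xs) / len(xs)

# Both groups' retest means fall well below the 160 cutoff, purely from
# regression to the mean; the treatment is the gap between the groups.
print(f"placebo mean on retest: {mean(placebo):.1f}")
print(f"treated mean on retest: {mean(treated):.1f}")
print(f"apparent treatment effect: {mean(placebo) - mean(treated):.1f}")
```

[Note that if you only watched the treated group, you'd credit the treatment with the whole drop; comparing against the placebo group is what isolates the real eight-point effect from the regression everyone experiences.]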
302 00:17:03,800 --> 00:17:07,120 Speaker 1: And in addition to just a methodology design that has, 303 00:17:07,520 --> 00:17:10,840 Speaker 1: you know, randomized groups and control groups, there are also 304 00:17:10,920 --> 00:17:13,600 Speaker 1: ways of trying to counteract regression to the mean just 305 00:17:13,760 --> 00:17:17,119 Speaker 1: through statistical methods. These are maybe less reliable, but there 306 00:17:17,119 --> 00:17:20,800 Speaker 1: are statistical methods people can use to try to apply 307 00:17:21,080 --> 00:17:24,960 Speaker 1: sort of modifiers to data in order to estimate regression 308 00:17:25,040 --> 00:17:28,639 Speaker 1: to the mean and counteract its effects. So 309 00:17:28,680 --> 00:17:31,920 Speaker 1: again, we have tools within scientific research to figure 310 00:17:31,960 --> 00:17:34,320 Speaker 1: this out, and a lot of what science does 311 00:17:34,960 --> 00:17:37,639 Speaker 1: is trying to sort out the difference between regression to 312 00:17:37,720 --> 00:17:41,159 Speaker 1: the mean and actual effects of interventions. But in our 313 00:17:41,240 --> 00:17:43,560 Speaker 1: day-to-day lives, we still fall for regression to 314 00:17:43,640 --> 00:17:46,640 Speaker 1: the mean fallacies all the time. Yeah, and it's important 315 00:17:46,680 --> 00:17:49,240 Speaker 1: to realize too that it's not just a situation where 316 00:17:49,320 --> 00:17:53,200 Speaker 1: regression towards the mean could create an illusion of something 317 00:17:53,320 --> 00:17:55,600 Speaker 1: working when it doesn't. You know, sometimes it can 318 00:17:55,680 --> 00:18:02,040 Speaker 1: just potentially overstate the effect of something.
An example 319 00:18:02,119 --> 00:18:04,800 Speaker 1: of that that I was looking at: regression 320 00:18:04,840 --> 00:18:06,879 Speaker 1: towards the mean, or the failure to account for it, 321 00:18:07,160 --> 00:18:11,040 Speaker 1: can also overstate the effectiveness of something like traffic light cameras. 322 00:18:11,720 --> 00:18:15,360 Speaker 1: Is it making a difference in cutting down on accidents? Perhaps, 323 00:18:15,840 --> 00:18:20,520 Speaker 1: but any actual effectiveness could potentially be overstated by failure 324 00:18:20,600 --> 00:18:23,520 Speaker 1: to account for just regression towards the mean. Oh yeah, 325 00:18:23,680 --> 00:18:26,480 Speaker 1: so where do you tend to install things like that? 326 00:18:27,440 --> 00:18:30,520 Speaker 1: In high-accident, like, problem areas? Right. So, if there's like 327 00:18:30,560 --> 00:18:33,800 Speaker 1: a stretch of road that has a lot of problems 328 00:18:33,920 --> 00:18:36,640 Speaker 1: on it, people really speeding a lot there or crashing 329 00:18:36,680 --> 00:18:39,760 Speaker 1: a lot there, that might be where you stage the intervention. 330 00:18:40,160 --> 00:18:43,840 Speaker 1: It's possible some things like that fluctuate naturally over time 331 00:18:43,920 --> 00:18:47,160 Speaker 1: in different locations. Yeah, and you put the cameras in place, 332 00:18:47,280 --> 00:18:49,520 Speaker 1: and it could have an effect, but maybe not as 333 00:18:49,600 --> 00:18:52,600 Speaker 1: much of an effect as it looks like is 334 00:18:52,680 --> 00:18:56,720 Speaker 1: taking place, again, if you don't factor regression towards the 335 00:18:56,760 --> 00:19:01,040 Speaker 1: mean into the study. 
Right. Now, while RTM is 336 00:19:01,080 --> 00:19:04,040 Speaker 1: a very important phenomenon to understand and take into account, 337 00:19:04,119 --> 00:19:07,960 Speaker 1: it certainly doesn't apply to every sequence of values you 338 00:19:08,040 --> 00:19:11,359 Speaker 1: could repeatedly sample, so you also have to be careful 339 00:19:11,480 --> 00:19:15,000 Speaker 1: not to apply it in situations where it isn't warranted. 340 00:19:15,760 --> 00:19:17,920 Speaker 1: You know, there are a million examples you 341 00:19:17,960 --> 00:19:20,000 Speaker 1: could cite. One that came to my mind is the 342 00:19:20,160 --> 00:19:22,600 Speaker 1: orbital decay of a satellite. Let's say you've got a 343 00:19:22,600 --> 00:19:26,280 Speaker 1: communication satellite in low Earth orbit and you get a reading 344 00:19:26,320 --> 00:19:29,040 Speaker 1: on its altitude, and the reading is lower than the 345 00:19:29,119 --> 00:19:33,560 Speaker 1: satellite's average altitude. Now, you might say, hey, I 346 00:19:33,840 --> 00:19:36,040 Speaker 1: think this means we need to program a reboost to 347 00:19:36,240 --> 00:19:38,920 Speaker 1: insert it back into the orbit where it's supposed to be. 348 00:19:39,640 --> 00:19:43,360 Speaker 1: And somebody could erroneously apply regression to the mean here 349 00:19:43,800 --> 00:19:45,639 Speaker 1: and say, no, we don't need to do that. The 350 00:19:45,680 --> 00:19:49,000 Speaker 1: satellite might just return to its average altitude. 
It doesn't 351 00:19:49,040 --> 00:19:51,920 Speaker 1: apply in this scenario, even though you are taking repeated 352 00:19:52,000 --> 00:19:55,359 Speaker 1: measurements of a value over time, because we know things 353 00:19:55,480 --> 00:19:59,800 Speaker 1: about the physical characteristics determining the orbits of satellites 354 00:20:00,000 --> 00:20:03,080 Speaker 1: in low Earth orbit, and that due to factors like 355 00:20:03,160 --> 00:20:07,840 Speaker 1: atmospheric drag, their altitude tends to trend steadily downward over 356 00:20:07,960 --> 00:20:11,880 Speaker 1: time in a consistent direction, down, down, down. So eventually 357 00:20:12,240 --> 00:20:14,440 Speaker 1: you will need a reboost in order to put it 358 00:20:14,520 --> 00:20:17,879 Speaker 1: back up to the correct distance. So regression to the 359 00:20:17,960 --> 00:20:21,520 Speaker 1: mean applies to certain kinds of data that are repeatedly 360 00:20:21,640 --> 00:20:26,919 Speaker 1: sampled: data where there is natural random fluctuation back and forth, 361 00:20:27,440 --> 00:20:30,680 Speaker 1: not a steady trend in the data in one direction 362 00:20:30,800 --> 00:20:33,800 Speaker 1: on the relevant time scale. The other thing that's important 363 00:20:33,840 --> 00:20:36,760 Speaker 1: to understand is that systems where you expect to find 364 00:20:36,880 --> 00:20:40,560 Speaker 1: regression to the mean are systems in which the repeated 365 00:20:40,680 --> 00:20:44,040 Speaker 1: data values you're sampling are to some degree determined by 366 00:20:44,280 --> 00:20:48,080 Speaker 1: luck or chance. 
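The contrast drawn above, a steadily trending series versus a randomly fluctuating one, can be sketched in a few lines of Python. All numbers here are made up for illustration: a deterministically decaying series like the satellite's altitude never bounces back on its own, while extreme readings of a noisy but stable quantity are typically followed by readings closer to the long-run mean.

```python
import random

random.seed(0)

# Series A: steady deterministic decay, like a satellite's altitude under
# atmospheric drag. An unusually low reading predicts FURTHER decline.
altitude = [400.0 - 0.5 * day for day in range(200)]

# Series B: a stable quantity measured with random noise. An unusually
# high reading tends to be followed by one closer to the long-run mean.
readings = [100.0 + random.gauss(0, 5) for _ in range(100_000)]

# Whenever a reading is extreme (more than 8 above the long-run mean),
# look at the very next reading.
pairs = [(readings[i], readings[i + 1])
         for i in range(len(readings) - 1) if readings[i] > 108]
avg_extreme = sum(p[0] for p in pairs) / len(pairs)
avg_next = sum(p[1] for p in pairs) / len(pairs)

print(altitude[-1] == min(altitude))  # trend: the newest value IS the lowest
print(round(avg_extreme, 1), round(avg_next, 1))
```

In series B the follow-up readings average back near 100 with no intervention at all, which is exactly the pattern the flight instructor mistook for the effect of yelling.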
If a series of values is influenced 367 00:20:48,119 --> 00:20:52,919 Speaker 1: almost entirely by deterministic influence, like in the satellite example, 368 00:20:53,000 --> 00:20:56,400 Speaker 1: by the laws of physics, or by some extremely 369 00:20:56,520 --> 00:21:01,120 Speaker 1: reliable skill with little room for variation, values don't really 370 00:21:01,280 --> 00:21:03,640 Speaker 1: regress towards the mean in the same way, because there's 371 00:21:03,680 --> 00:21:06,800 Speaker 1: just less random fluctuation back and forth to begin with. 372 00:21:07,480 --> 00:21:10,680 Speaker 1: The more chance and random variation play a role in 373 00:21:10,720 --> 00:21:14,040 Speaker 1: the outcome, the more you will tend to observe regression 374 00:21:14,080 --> 00:21:17,159 Speaker 1: towards the mean after an extreme sample in whatever 375 00:21:17,240 --> 00:21:20,560 Speaker 1: it is you're looking at. Yeah, I've read that 376 00:21:20,720 --> 00:21:23,520 Speaker 1: regression towards the mean is not to be confused 377 00:21:23,640 --> 00:21:26,800 Speaker 1: with the law of large numbers, for example. This 378 00:21:27,000 --> 00:21:29,399 Speaker 1: is the law that states that as a sample 379 00:21:29,480 --> 00:21:32,639 Speaker 1: size becomes larger, the sample mean gets closer to the 380 00:21:32,720 --> 00:21:36,440 Speaker 1: expected value. So a coin flipping example is key here. 381 00:21:36,520 --> 00:21:39,040 Speaker 1: Flip a coin and the random results are going to 382 00:21:39,160 --> 00:21:43,320 Speaker 1: ultimately average out to a point five proportion. But if 383 00:21:43,359 --> 00:21:45,760 Speaker 1: you only flip the coin ten times, you might not 384 00:21:45,920 --> 00:21:49,440 Speaker 1: see this breakdown. And this also applies to, say, 385 00:21:49,480 --> 00:21:52,119 Speaker 1: even odds on the rolling of a D 386 00:21:52,320 --> 00:21:55,680 Speaker 1: six, of a six-sided die. 
So for example, 387 00:21:56,200 --> 00:21:59,000 Speaker 1: to regular people, that's just a die. But to nerds 388 00:21:59,080 --> 00:22:01,600 Speaker 1: like us, it's a D six. Yeah, a D six is 389 00:22:01,600 --> 00:22:02,919 Speaker 1: what I could get my hands on, because I was like, well, 390 00:22:02,920 --> 00:22:04,639 Speaker 1: I'm gonna do an example. I'm gonna try it myself. 391 00:22:04,720 --> 00:22:06,680 Speaker 1: So while I was putting together notes for this, I 392 00:22:06,720 --> 00:22:10,520 Speaker 1: went ahead and rolled ten times, and I got even, 393 00:22:10,600 --> 00:22:14,240 Speaker 1: even, odd, even, odd, even, even, even, even, odd. So 394 00:22:14,440 --> 00:22:17,560 Speaker 1: that's seven to three in favor of even. So 395 00:22:17,760 --> 00:22:19,919 Speaker 1: it might make you wonder, well, is this die broken? 396 00:22:20,200 --> 00:22:22,639 Speaker 1: Does this D six need to go away because it 397 00:22:22,720 --> 00:22:26,840 Speaker 1: can't be trusted to roll, you know, a balanced 398 00:22:27,480 --> 00:22:30,080 Speaker 1: array of odd and even numbers? Well, no, that's not 399 00:22:30,200 --> 00:22:32,639 Speaker 1: the case. And if I were to roll this, 400 00:22:32,920 --> 00:22:37,080 Speaker 1: say, another hundred times, another thousand times, I would see 401 00:22:37,119 --> 00:22:40,840 Speaker 1: things even out even more, to where we would see this, 402 00:22:41,080 --> 00:22:44,960 Speaker 1: this point five proportion of odd versus even. Right. So 403 00:22:45,359 --> 00:22:47,840 Speaker 1: these are not exactly the same thing, regression to the 404 00:22:47,920 --> 00:22:50,280 Speaker 1: mean and the law of large numbers, but they are 405 00:22:50,480 --> 00:22:55,200 Speaker 1: closely related. 
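The die experiment described above is easy to replicate in a few lines of Python. This is just a sketch, and the exact proportions depend on the random seed: with only ten rolls the even/odd split can easily be lopsided, but as the number of rolls grows, the proportion settles toward the expected point five, which is the law of large numbers at work.

```python
import random

random.seed(1)

def even_proportion(n_rolls):
    # Roll a fair six-sided die n_rolls times; return the fraction of evens.
    evens = sum(1 for _ in range(n_rolls) if random.randint(1, 6) % 2 == 0)
    return evens / n_rolls

# Small samples swing wildly (a seven-to-three split in ten rolls is
# unremarkable); large samples settle near the expected 0.5.
for n in (10, 100, 10_000, 1_000_000):
    print(n, even_proportion(n))
```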
Both observations require you to think about statistical 406 00:22:55,280 --> 00:22:58,600 Speaker 1: tendencies over time, over a time period of repeated sampling, 407 00:22:59,119 --> 00:23:01,480 Speaker 1: and both are premised on the knowledge that repeated 408 00:23:01,520 --> 00:23:05,000 Speaker 1: samples will tend towards the average. But regression to the 409 00:23:05,080 --> 00:23:07,840 Speaker 1: mean has to do with the idea that if you 410 00:23:08,000 --> 00:23:11,200 Speaker 1: start with an extreme observation and there is some role 411 00:23:11,359 --> 00:23:15,080 Speaker 1: of chance or luck in determining the value of this observation, 412 00:23:15,160 --> 00:23:17,600 Speaker 1: the next time you sample it, it's more likely to 413 00:23:17,720 --> 00:23:20,600 Speaker 1: be closer to the average. The law of large numbers 414 00:23:20,720 --> 00:23:23,560 Speaker 1: says that in the real world, the more times 415 00:23:23,680 --> 00:23:26,680 Speaker 1: you run something, the closer your outcomes in the real 416 00:23:26,760 --> 00:23:29,520 Speaker 1: world will be to the sort of perfect mathematical 417 00:23:29,600 --> 00:23:32,399 Speaker 1: average that you would estimate just given the chances to 418 00:23:32,480 --> 00:23:35,200 Speaker 1: begin with. Now, I want to come back to regression 419 00:23:35,200 --> 00:23:38,399 Speaker 1: towards the mean in medical studies, because I 420 00:23:38,440 --> 00:23:41,560 Speaker 1: found a really interesting one that came out earlier this year. 
421 00:23:42,320 --> 00:23:43,920 Speaker 1: So, a lot of the examples 422 00:23:43,960 --> 00:23:48,560 Speaker 1: you find involving regression to the mean involve sports or economics, 423 00:23:48,600 --> 00:23:51,480 Speaker 1: and I found this one discussed in a New York 424 00:23:51,520 --> 00:23:55,159 Speaker 1: Times article, again from earlier this year, titled Intense Strength 425 00:23:55,200 --> 00:23:59,040 Speaker 1: Training Does Not Ease Knee Pain, Study Finds, by Gina Kolata. 426 00:23:59,480 --> 00:24:02,879 Speaker 1: This is referring to a study published in JAMA that 427 00:24:03,119 --> 00:24:08,639 Speaker 1: entailed an eighteen month clinical trial involving three seventy seven participants. Okay, 428 00:24:08,720 --> 00:24:11,080 Speaker 1: so the basic situation, the setup for this paper, is 429 00:24:11,160 --> 00:24:15,639 Speaker 1: that a lot of people have knee osteoarthritis, and one 430 00:24:15,720 --> 00:24:19,800 Speaker 1: of the go-to treatment recommendations has long been strength training. 431 00:24:20,960 --> 00:24:23,879 Speaker 1: So in this study they decided to look into it 432 00:24:24,000 --> 00:24:28,040 Speaker 1: with three basic groups: one that received intense strength training, 433 00:24:28,400 --> 00:24:32,400 Speaker 1: another that received moderate strength training, and another that received 434 00:24:32,480 --> 00:24:36,359 Speaker 1: counseling on healthy living. So that third group, that's the 435 00:24:36,400 --> 00:24:39,280 Speaker 1: control group. They did not have any amount of strength training, 436 00:24:39,400 --> 00:24:43,840 Speaker 1: just, you know, some positive counseling about healthy living. 
Sure, 437 00:24:44,280 --> 00:24:47,159 Speaker 1: so the researchers here apparently actually expected to see the 438 00:24:47,240 --> 00:24:50,359 Speaker 1: intense strength training take the lead; they were looking 439 00:24:50,960 --> 00:24:54,840 Speaker 1: to verify what has been just sort of accepted wisdom, 440 00:24:55,440 --> 00:24:58,560 Speaker 1: and again, this has been the predominant 441 00:24:58,600 --> 00:25:01,840 Speaker 1: treatment idea. But instead they found that the results 442 00:25:02,160 --> 00:25:06,000 Speaker 1: were the same for all three groups. Quote, everyone reported 443 00:25:06,040 --> 00:25:10,200 Speaker 1: slightly less pain, including those who had received only counseling. 444 00:25:10,600 --> 00:25:12,760 Speaker 1: Now why is that? Well, as Kolata points out, 445 00:25:12,840 --> 00:25:16,320 Speaker 1: there's always room for other effects, especially, say, the placebo effect. 446 00:25:16,840 --> 00:25:20,160 Speaker 1: But regression to the mean is also a heavy 447 00:25:20,200 --> 00:25:23,359 Speaker 1: consideration here and certainly could work in concert with the 448 00:25:23,400 --> 00:25:26,720 Speaker 1: placebo effect. Right. So you don't necessarily have to assume 449 00:25:26,800 --> 00:25:30,040 Speaker 1: that the counseling actually helped to heal people's knees, though 450 00:25:30,040 --> 00:25:31,800 Speaker 1: in some cases it may have 451 00:25:31,880 --> 00:25:34,199 Speaker 1: had some kind of mechanistic effect in some way, 452 00:25:34,440 --> 00:25:37,480 Speaker 1: a mind-body kind of thing. But you would also 453 00:25:37,600 --> 00:25:41,199 Speaker 1: just expect, over time, people who have an extreme starting position, 454 00:25:41,240 --> 00:25:43,359 Speaker 1: who are starting with a lot of knee pain, to 455 00:25:43,600 --> 00:25:47,560 Speaker 1: get gradually better over time. Yeah. 
So Kolata writes, 456 00:25:47,640 --> 00:25:50,600 Speaker 1: quote, arthritis symptoms tend to surge and subside, 457 00:25:50,960 --> 00:25:53,680 Speaker 1: and people tend to seek out treatments when the pain 458 00:25:53,840 --> 00:25:56,600 Speaker 1: is at its peak; when it declines, as it would 459 00:25:56,600 --> 00:26:00,639 Speaker 1: have anyway, they ascribe the improvement to the treatment. 460 00:26:00,800 --> 00:26:02,960 Speaker 1: So you know, this would roughly equate to 461 00:26:03,080 --> 00:26:05,440 Speaker 1: yelling at your knee when it's in pain, and it 462 00:26:05,520 --> 00:26:08,520 Speaker 1: certainly relates to many other health scenarios 463 00:26:08,600 --> 00:26:11,920 Speaker 1: as well: various medications, and even things like prayer and, 464 00:26:12,280 --> 00:26:17,280 Speaker 1: you know, supernatural treatments and attempts to deal 465 00:26:17,359 --> 00:26:19,760 Speaker 1: with pain, etcetera. Yeah, I mean it could apply 466 00:26:19,960 --> 00:26:23,920 Speaker 1: to any intervention that is aimed at influencing something that 467 00:26:24,240 --> 00:26:27,920 Speaker 1: is naturally variable on its own, right. Yeah, 468 00:26:28,000 --> 00:26:30,320 Speaker 1: again, any kind of system in which 469 00:26:30,440 --> 00:26:33,479 Speaker 1: change occurs, in which fluctuation occurs. You know, you 470 00:26:33,560 --> 00:26:36,640 Speaker 1: can see this applying to not only physical pain, 471 00:26:36,840 --> 00:26:41,280 Speaker 1: but also emotional distress, things of that nature. You know. 472 00:26:41,480 --> 00:26:44,040 Speaker 1: So again, I think this is an important tool to 473 00:26:44,200 --> 00:26:52,679 Speaker 1: have in our logic toolkit. 
474 00:26:52,760 --> 00:26:55,680 Speaker 1: Now, there are even cases where I'm tempted to think about 475 00:26:55,720 --> 00:27:00,400 Speaker 1: the application of regression to the mean, but where it's 476 00:27:00,400 --> 00:27:03,920 Speaker 1: probably a lot harder to quantify exactly what the effects are. 477 00:27:04,560 --> 00:27:08,280 Speaker 1: It's cases where it can be difficult to separate out, 478 00:27:08,400 --> 00:27:11,879 Speaker 1: say, the effects of some kind of deterministic influence like 479 00:27:12,080 --> 00:27:15,399 Speaker 1: skill versus how strong the effect of chance or 480 00:27:15,480 --> 00:27:17,720 Speaker 1: luck is. But I think about things even in the 481 00:27:17,800 --> 00:27:20,359 Speaker 1: world of the arts. Like I think about, you know, 482 00:27:20,480 --> 00:27:23,760 Speaker 1: the sophomore album by a band that has, like, 483 00:27:23,840 --> 00:27:27,680 Speaker 1: a really stellar debut album. You know, often that 484 00:27:27,880 --> 00:27:32,080 Speaker 1: is perceived as disappointing, and you have to wonder, like, okay, 485 00:27:32,240 --> 00:27:35,760 Speaker 1: is that often true? 
Because I don't know: 486 00:27:35,880 --> 00:27:37,879 Speaker 1: do people get famous and it goes to their heads 487 00:27:38,000 --> 00:27:39,960 Speaker 1: and then, you know, they get full of themselves 488 00:27:40,000 --> 00:27:42,560 Speaker 1: and make something dumb? Or is it because when somebody 489 00:27:42,600 --> 00:27:46,359 Speaker 1: has a debut album that's really well received, to some extent 490 00:27:46,600 --> 00:27:50,320 Speaker 1: it's so good partially because of luck or chance, and 491 00:27:50,560 --> 00:27:54,120 Speaker 1: that's an outlier as your starting sample? Yeah, yeah. 492 00:27:54,160 --> 00:27:56,280 Speaker 1: And certainly this is an area where there's a lot 493 00:27:56,359 --> 00:27:59,239 Speaker 1: more subjectivity, and so it's not the kind 494 00:27:59,240 --> 00:28:01,200 Speaker 1: of thing you can really have a control group 495 00:28:01,320 --> 00:28:04,280 Speaker 1: for or anything. But I think it is quite interesting, 496 00:28:04,320 --> 00:28:06,639 Speaker 1: and as I was looking around for 497 00:28:06,720 --> 00:28:09,399 Speaker 1: some jazzier examples, or possible examples, of regression to 498 00:28:09,440 --> 00:28:13,680 Speaker 1: the mean, I found one that actually gets 499 00:28:13,720 --> 00:28:16,440 Speaker 1: a little bit into the idea of, you know, 500 00:28:16,760 --> 00:28:20,040 Speaker 1: first and second albums, but also the idea of 501 00:28:20,160 --> 00:28:23,720 Speaker 1: follow-up films and Hollywood sequels, 502 00:28:23,800 --> 00:28:27,840 Speaker 1: as pointed out by Joanna Diong in two 503 00:28:27,880 --> 00:28:32,200 Speaker 1: thousand eighteen on the blog Scientifically Sound: movie sequels are 504 00:28:32,440 --> 00:28:35,320 Speaker 1: potentially a great example of regression to the mean. 
Quote, 505 00:28:35,680 --> 00:28:39,120 Speaker 1: Hollywood sequels are only made if the original film is 506 00:28:39,200 --> 00:28:43,120 Speaker 1: a quote unquote high quality success. But the average quality 507 00:28:43,160 --> 00:28:46,080 Speaker 1: of sequels will be closer to the mean than the average 508 00:28:46,160 --> 00:28:50,040 Speaker 1: quality of the originals because of regression to the mean, 509 00:28:50,120 --> 00:28:53,880 Speaker 1: so sequels tend to be of lower quality than the originals. Now, 510 00:28:54,040 --> 00:28:57,600 Speaker 1: I might somewhat dispute the premise here that Hollywood sequels 511 00:28:57,720 --> 00:29:00,640 Speaker 1: are only made to films that are high quality to 512 00:29:00,760 --> 00:29:04,960 Speaker 1: begin with. Right. But I still think this 513 00:29:05,160 --> 00:29:08,280 Speaker 1: is onto something, because a movie that gets 514 00:29:08,360 --> 00:29:11,800 Speaker 1: a sequel tends to have something about it, something that 515 00:29:11,920 --> 00:29:14,720 Speaker 1: people are responding to, whether it's a movie that I 516 00:29:14,760 --> 00:29:17,560 Speaker 1: would like or not. Right. I mean, sometimes obviously the 517 00:29:17,640 --> 00:29:19,920 Speaker 1: situation is the film just made a lot of money. 518 00:29:19,920 --> 00:29:21,840 Speaker 1: I mean, I guess that's the key thing: it 519 00:29:21,880 --> 00:29:24,560 Speaker 1: made a lot of money, and so producers are going 520 00:29:24,600 --> 00:29:26,680 Speaker 1: to be more inclined to say, let's do that again. 521 00:29:26,840 --> 00:29:30,080 Speaker 1: Let's have that experience again of all that money coming in. 522 00:29:30,680 --> 00:29:35,640 Speaker 1: And sometimes this certainly matches up with a quality film. 523 00:29:35,720 --> 00:29:38,720 Speaker 1: You have something that really captures people's imagination and is 524 00:29:39,000 --> 00:29:41,479 Speaker 1: of high quality. 
And, you know, so it's 525 00:29:41,520 --> 00:29:44,959 Speaker 1: really firing on all cylinders. But, yes, 526 00:29:45,080 --> 00:29:48,000 Speaker 1: certainly in some cases it's just the right film at 527 00:29:48,040 --> 00:29:50,400 Speaker 1: the right time. Or maybe it has nothing to 528 00:29:50,480 --> 00:29:52,440 Speaker 1: do with the film itself. Maybe it's who's in it, 529 00:29:52,680 --> 00:29:54,640 Speaker 1: or, I don't know, what's going on in the zeitgeist 530 00:29:55,000 --> 00:29:57,680 Speaker 1: during that particular era. Well, the way I would think 531 00:29:57,720 --> 00:30:00,080 Speaker 1: about this is, and I think that again, this is 532 00:30:00,120 --> 00:30:04,120 Speaker 1: onto something, it highlights that we experience confusion where 533 00:30:04,160 --> 00:30:07,200 Speaker 1: we say, like, wow, you know, The Exorcist is such 534 00:30:07,240 --> 00:30:10,200 Speaker 1: a great horror movie and Exorcist II is so bad. 535 00:30:10,800 --> 00:30:12,920 Speaker 1: How could that be the case? You know, why is it 536 00:30:13,040 --> 00:30:15,560 Speaker 1: such a bad sequel to such a great movie? 537 00:30:16,400 --> 00:30:20,320 Speaker 1: It's because of the comparison of the original to the 538 00:30:20,400 --> 00:30:24,560 Speaker 1: sequel that we're experiencing this confusion. 
Another way you could 539 00:30:24,600 --> 00:30:27,680 Speaker 1: just look at it is most horror movies are, 540 00:30:28,280 --> 00:30:32,640 Speaker 1: most movies are, bad, and it is only by comparing 541 00:30:32,960 --> 00:30:36,600 Speaker 1: Exorcist II to The Exorcist that you notice 542 00:30:36,720 --> 00:30:39,320 Speaker 1: this steep drop-off. Another way of looking at 543 00:30:39,360 --> 00:30:42,760 Speaker 1: it is that Exorcist II is bad like most 544 00:30:42,840 --> 00:30:45,720 Speaker 1: horror movies are, and the first one was an outlier 545 00:30:45,840 --> 00:30:48,000 Speaker 1: at the beginning. It was a first film in a 546 00:30:48,080 --> 00:30:53,280 Speaker 1: series that happened to be really good, a cut above. Yeah, absolutely, like, yeah, 547 00:30:53,320 --> 00:30:54,760 Speaker 1: I think this is the correct way to look at it, 548 00:30:54,800 --> 00:30:57,440 Speaker 1: also keeping in mind just how amazing it 549 00:30:57,600 --> 00:31:00,640 Speaker 1: is that any film gets completed, even a bad film. 550 00:31:00,720 --> 00:31:03,760 Speaker 1: Like, a lot of people probably worked pretty hard to 551 00:31:03,920 --> 00:31:06,640 Speaker 1: make that happen, even if the end results don't really 552 00:31:06,720 --> 00:31:09,000 Speaker 1: please anyone at all. But yeah, I think this 553 00:31:09,120 --> 00:31:12,080 Speaker 1: is also an interesting inversion of the opening example of 554 00:31:12,160 --> 00:31:14,960 Speaker 1: yelling at pilots as well, because most of the time, 555 00:31:15,240 --> 00:31:18,480 Speaker 1: if a flawed movie comes out, people are not clamoring 556 00:31:18,560 --> 00:31:23,760 Speaker 1: for the sequel. Sequels are rarely guaranteed, so you're 557 00:31:23,800 --> 00:31:25,960 Speaker 1: not often going to hear things like, oh, well, that 558 00:31:26,200 --> 00:31:28,480 Speaker 1: wasn't great, I hope the next one is an improvement. 
559 00:31:28,520 --> 00:31:31,040 Speaker 1: I mean, some people say that. I've said 560 00:31:31,120 --> 00:31:32,640 Speaker 1: things like that before, where it will be like, oh, 561 00:31:33,000 --> 00:31:35,320 Speaker 1: really flawed film, but maybe there's like a cool idea; 562 00:31:35,600 --> 00:31:37,640 Speaker 1: I kind of wish they would remake it, 563 00:31:37,960 --> 00:31:41,240 Speaker 1: even though there's no logical reason that there would 564 00:31:41,240 --> 00:31:44,840 Speaker 1: be money behind that idea. Well, 565 00:31:44,880 --> 00:31:46,920 Speaker 1: I guess it's kind of different when you're talking about 566 00:31:46,920 --> 00:31:49,640 Speaker 1: a one-off creative project versus something. I mean, we 567 00:31:49,760 --> 00:31:52,200 Speaker 1: live in a kind of different era now, because we 568 00:31:52,440 --> 00:31:55,240 Speaker 1: are at the height of this, you know, cinematic universe 569 00:31:55,360 --> 00:31:59,600 Speaker 1: thing, where a huge number of the big-budget movies 570 00:31:59,680 --> 00:32:02,960 Speaker 1: that come out, the big event movies, are not 571 00:32:03,040 --> 00:32:07,080 Speaker 1: one-off creative products; they are products that exist 572 00:32:07,200 --> 00:32:10,520 Speaker 1: within some kind of franchise or universe or something. So 573 00:32:10,600 --> 00:32:13,360 Speaker 1: you just know automatically that there's gonna be another one, 574 00:32:13,440 --> 00:32:15,920 Speaker 1: whether this one is good or not. Yeah, like, either 575 00:32:16,000 --> 00:32:18,959 Speaker 1: it's an established film universe where, like, you know, they 576 00:32:19,000 --> 00:32:22,240 Speaker 1: put out another Marvel movie and it's just terrible. Well, 577 00:32:22,280 --> 00:32:24,760 Speaker 1: obviously there's enough momentum, they're not going to stop. 
They're 578 00:32:24,760 --> 00:32:27,280 Speaker 1: not gonna be like, oh, well, lesson learned, we'll 579 00:32:27,320 --> 00:32:30,520 Speaker 1: stop making them. No, no, there's gonna be another. Another example 580 00:32:30,560 --> 00:32:33,479 Speaker 1: of this might be a successful franchise in another medium, 581 00:32:33,560 --> 00:32:37,120 Speaker 1: say a book series, so like the Harry Potter books, 582 00:32:37,120 --> 00:32:39,440 Speaker 1: for example, or, I don't know, Lord of the Rings, 583 00:32:39,720 --> 00:32:42,080 Speaker 1: where you know that once they make The Fellowship of 584 00:32:42,120 --> 00:32:44,400 Speaker 1: the Ring, there's going to be a follow-up. They're 585 00:32:44,400 --> 00:32:48,280 Speaker 1: gonna do another one. So in these ways, unless it's 586 00:32:48,320 --> 00:32:51,280 Speaker 1: the seventies and it's that Lord of the Rings 587 00:32:51,360 --> 00:32:54,440 Speaker 1: movie that ends with Helm's Deep, well, but they 588 00:32:54,520 --> 00:32:58,400 Speaker 1: picked that up eventually, but yeah, okay, but yeah, 589 00:32:59,320 --> 00:33:01,440 Speaker 1: probably the Harry Potter films are a better example. And 590 00:33:01,480 --> 00:33:04,360 Speaker 1: there may be specific, you know, things about how 591 00:33:04,440 --> 00:33:07,680 Speaker 1: that wasn't guaranteed either; you know, economic reality 592 00:33:07,720 --> 00:33:10,120 Speaker 1: can always come into play. But for the most part, 593 00:33:10,200 --> 00:33:12,560 Speaker 1: like, when that started, you knew they 594 00:33:12,600 --> 00:33:14,360 Speaker 1: were going to keep making these, at least they were 595 00:33:14,400 --> 00:33:15,840 Speaker 1: going to make a follow-up, so you could have 596 00:33:16,240 --> 00:33:18,800 Speaker 1: comments like, well, this was kind of 597 00:33:18,840 --> 00:33:20,800 Speaker 1: flawed in some of its execution. 
I 598 00:33:20,920 --> 00:33:22,960 Speaker 1: hope that they fix that in the next film. For 599 00:33:23,040 --> 00:33:25,200 Speaker 1: the most part, yeah, with one-offs, this is not 600 00:33:25,320 --> 00:33:28,640 Speaker 1: the case. It's like, if this film fizzles, then 601 00:33:29,240 --> 00:33:32,120 Speaker 1: only, you know, a few rare people are going 602 00:33:32,200 --> 00:33:34,920 Speaker 1: to be clamoring for a sequel or dreaming about what 603 00:33:35,000 --> 00:33:37,200 Speaker 1: the sequel would be. Yeah. I think this observation about 604 00:33:37,320 --> 00:33:40,640 Speaker 1: regression to the mean and movie sequels is actually very 605 00:33:40,720 --> 00:33:43,840 Speaker 1: on point, but more so for the films of yesteryear, 606 00:33:43,960 --> 00:33:46,200 Speaker 1: where the more common thing was you'd have 607 00:33:46,360 --> 00:33:49,360 Speaker 1: an independent sort of creative product that's its 608 00:33:49,440 --> 00:33:52,640 Speaker 1: own thing, and then if it resonated with somebody, if 609 00:33:52,680 --> 00:33:54,719 Speaker 1: it did well, there would be sequels. I think 610 00:33:54,760 --> 00:33:57,040 Speaker 1: it applies a little bit less today, when, 611 00:33:57,080 --> 00:33:59,920 Speaker 1: you know, we're in the world of franchises 612 00:34:00,080 --> 00:34:03,040 Speaker 1: and extended universes, and there's just sort of like 613 00:34:03,160 --> 00:34:07,640 Speaker 1: a guaranteed ongoing conveyor belt of new stuff 614 00:34:07,720 --> 00:34:10,960 Speaker 1: within the Marvel world or the Star Wars world or whatever. 
Yeah, 615 00:34:11,040 --> 00:34:13,520 Speaker 1: but I think it is a worthwhile way to 616 00:34:13,600 --> 00:34:18,080 Speaker 1: think about the creative process, you know, as 617 00:34:18,080 --> 00:34:20,520 Speaker 1: opposed to some of these alternate, sort of folk-wisdomy 618 00:34:20,560 --> 00:34:23,640 Speaker 1: ways of thinking about it. For example, on Weird House Cinema 619 00:34:23,680 --> 00:34:26,319 Speaker 1: we recently talked about Tobe Hooper. Tobe Hooper is one 620 00:34:26,360 --> 00:34:29,320 Speaker 1: of those directors where you'll often see descriptions, 621 00:34:29,400 --> 00:34:31,080 Speaker 1: I think we've even read part of a review where 622 00:34:31,120 --> 00:34:33,120 Speaker 1: they talk about, oh, well, you know, 623 00:34:33,160 --> 00:34:36,920 Speaker 1: he directed The Texas Chain Saw Massacre, and 624 00:34:37,200 --> 00:34:39,200 Speaker 1: this was great. It was, you know, just a real 625 00:34:39,840 --> 00:34:44,520 Speaker 1: lightning bolt to the cinematic world and to horror itself 626 00:34:44,600 --> 00:34:47,320 Speaker 1: as a genre. 
And then the idea that, well, he 627 00:34:47,440 --> 00:34:50,000 Speaker 1: was never able to capture that magic again, you know, 628 00:34:50,160 --> 00:34:52,720 Speaker 1: that his career was just like one long slide 629 00:34:52,760 --> 00:34:55,520 Speaker 1: after that, which I don't think is a fair assessment, 630 00:34:55,840 --> 00:35:00,480 Speaker 1: especially if you employ regression to the mean, the 631 00:35:00,640 --> 00:35:02,840 Speaker 1: idea being that, yeah, he did kind of get lightning 632 00:35:02,880 --> 00:35:05,360 Speaker 1: in a bottle with that first big film, 633 00:35:06,040 --> 00:35:09,040 Speaker 1: that he was able to really bring something 634 00:35:09,120 --> 00:35:13,359 Speaker 1: together that is an outlier, but that's 635 00:35:13,400 --> 00:35:16,120 Speaker 1: just going to happen. That's just the way these things work, right? 636 00:35:16,200 --> 00:35:18,800 Speaker 1: So most movies aren't that good. So, you know, the 637 00:35:18,960 --> 00:35:22,640 Speaker 1: random chance of, like, how good his ideas and execution 638 00:35:22,719 --> 00:35:24,400 Speaker 1: are from one year to the next is going to 639 00:35:24,480 --> 00:35:27,600 Speaker 1: set in, and you might have a different idea about 640 00:35:27,760 --> 00:35:31,200 Speaker 1: his career if you were to, say, randomly chronologically 641 00:35:31,320 --> 00:35:33,800 Speaker 1: reorder all his movies, right, you know, like if you 642 00:35:33,840 --> 00:35:36,200 Speaker 1: were to put the worst ones earlier on or something, 643 00:35:36,480 --> 00:35:38,680 Speaker 1: people might feel differently about it. Yeah, well, then they 644 00:35:38,680 --> 00:35:42,480 Speaker 1: would talk about, well, okay, TCM was peak Tobe Hooper, 645 00:35:42,600 --> 00:35:45,040 Speaker 1: like this was his peak output. 
Because this is the 646 00:35:45,160 --> 00:35:48,800 Speaker 1: kind of view of an artist's, you know, 647 00:35:48,920 --> 00:35:53,120 Speaker 1: creative trajectory that we tend to want to 648 00:35:53,239 --> 00:35:55,719 Speaker 1: follow along with, you know, because it's more story-shaped: the 649 00:35:55,880 --> 00:35:59,200 Speaker 1: idea of ascent and then eventually descent, that there's going to 650 00:35:59,239 --> 00:36:01,200 Speaker 1: be a period of high 651 00:36:01,280 --> 00:36:04,040 Speaker 1: noon in their creative output. And sometimes that does 652 00:36:04,120 --> 00:36:06,719 Speaker 1: match up with the reality, but I don't know, even 653 00:36:06,840 --> 00:36:09,680 Speaker 1: then I think we tend to overlook the dogs 654 00:36:10,160 --> 00:36:12,719 Speaker 1: in the filmographies of people we love, you know. Oh 655 00:36:12,800 --> 00:36:15,560 Speaker 1: yeah. But then again, I mean, this is interesting, 656 00:36:15,640 --> 00:36:18,960 Speaker 1: because in talking about regression to the mean applying to 657 00:36:19,239 --> 00:36:24,120 Speaker 1: creative products like movies, we are acknowledging that the creative 658 00:36:24,200 --> 00:36:27,440 Speaker 1: process is not purely a product of talent and skill, 659 00:36:27,600 --> 00:36:30,719 Speaker 1: that there is a significant amount of chance and luck 660 00:36:30,880 --> 00:36:33,640 Speaker 1: involved in something like how good a movie turns 661 00:36:33,719 --> 00:36:36,520 Speaker 1: out to be. And it's hard to know exactly 662 00:36:36,600 --> 00:36:39,719 Speaker 1: how to picture that influence of chance and luck. 663 00:36:39,840 --> 00:36:42,960 Speaker 1: You know, like, what is that in the creative process?
664 00:36:43,400 --> 00:36:46,160 Speaker 1: It's obviously true, because there are people who can be 665 00:36:46,320 --> 00:36:49,120 Speaker 1: incredibly skilled in one instance and then, I don't know, 666 00:36:49,280 --> 00:36:51,200 Speaker 1: things just don't go right the next time, and they 667 00:36:51,320 --> 00:36:54,520 Speaker 1: make something that nobody really likes. But that's 668 00:36:54,800 --> 00:36:57,200 Speaker 1: just not often how people like to think about 669 00:36:57,239 --> 00:36:59,880 Speaker 1: creative talents; people like to think about the creative 670 00:37:00,000 --> 00:37:04,480 Speaker 1: process like it is much more strictly deterministic. Yeah, yeah. 671 00:37:04,680 --> 00:37:07,720 Speaker 1: Or you look at things like the Star Wars 672 00:37:07,800 --> 00:37:09,520 Speaker 1: films and you kind of fall into this idea 673 00:37:09,560 --> 00:37:12,160 Speaker 1: of thinking, this is stuff that is mined out of 674 00:37:12,239 --> 00:37:15,080 Speaker 1: the mythic earth, and then, you know, it just makes 675 00:37:15,120 --> 00:37:17,839 Speaker 1: sense that things would accumulate and get better. 676 00:37:18,360 --> 00:37:20,600 Speaker 1: But really looking back on it, especially if you actually 677 00:37:20,760 --> 00:37:23,520 Speaker 1: watch documentaries, and there are some great ones about the 678 00:37:24,640 --> 00:37:28,160 Speaker 1: production of those films, like it's amazing that Star Wars, 679 00:37:28,280 --> 00:37:30,080 Speaker 1: the first one, A New Hope, was as good as 680 00:37:30,160 --> 00:37:33,040 Speaker 1: it was. And then it's nothing short of, I mean, 681 00:37:33,080 --> 00:37:35,839 Speaker 1: it's just a pure miracle that the second one 682 00:37:36,320 --> 00:37:39,120 Speaker 1: was so much better and like really nailed it.
Like, 683 00:37:39,200 --> 00:37:42,920 Speaker 1: if the second film had floundered, I mean, 684 00:37:43,000 --> 00:37:47,239 Speaker 1: just imagine how different the cinematic landscape would have been 685 00:37:47,320 --> 00:37:50,799 Speaker 1: for decades to come. Yeah. So it's amazing if 686 00:37:51,320 --> 00:37:53,640 Speaker 1: the first film in a series is good, and it's 687 00:37:53,719 --> 00:37:56,440 Speaker 1: super amazing if the second one is good. And 688 00:37:56,560 --> 00:37:58,759 Speaker 1: this is why I think we often find, too, that 689 00:37:59,680 --> 00:38:01,839 Speaker 1: if part one and part two of something are 690 00:38:02,080 --> 00:38:04,360 Speaker 1: of high quality, then you've got to look out for 691 00:38:04,400 --> 00:38:07,120 Speaker 1: that part three, because that part three 692 00:38:07,160 --> 00:38:09,040 Speaker 1: may be coming to get you. But likewise, if a 693 00:38:09,160 --> 00:38:14,440 Speaker 1: part two is rubbish, you know, subjectively, then 694 00:38:14,520 --> 00:38:16,719 Speaker 1: part three might pick it up and get 695 00:38:16,800 --> 00:38:18,920 Speaker 1: things back on track. So you certainly see that 696 00:38:19,040 --> 00:38:21,320 Speaker 1: kind of fluctuation as well.
I have a question I 697 00:38:21,360 --> 00:38:23,400 Speaker 1: actually don't know the answer to, but this would be 698 00:38:23,480 --> 00:38:28,880 Speaker 1: interesting in terms of, I don't know, high-performing output, 699 00:38:29,000 --> 00:38:31,920 Speaker 1: whether that is a creative endeavor 700 00:38:32,040 --> 00:38:34,879 Speaker 1: like, you know, writing books or creating movies, or whether 701 00:38:34,960 --> 00:38:38,640 Speaker 1: that's something even like athletics, like athletic performance: do you 702 00:38:38,719 --> 00:38:42,560 Speaker 1: expect to see more random fluctuation in the performance of 703 00:38:42,880 --> 00:38:48,640 Speaker 1: collaborative output versus individual output? So, say, do you 704 00:38:48,719 --> 00:38:52,120 Speaker 1: expect more influence of random chance and fluctuation in the 705 00:38:52,200 --> 00:38:56,200 Speaker 1: quality of books written by a single author versus, 706 00:38:56,280 --> 00:38:58,719 Speaker 1: you know, movies that have the input of hundreds or 707 00:38:59,040 --> 00:39:02,520 Speaker 1: thousands of people? Or in the realm of, 708 00:39:02,560 --> 00:39:05,440 Speaker 1: say, sports, do you expect more random variation in 709 00:39:05,520 --> 00:39:08,960 Speaker 1: the output of an individual athlete, like, you know, an 710 00:39:09,000 --> 00:39:14,000 Speaker 1: individual gymnast or something, or in team sports? Yeah, I 711 00:39:14,040 --> 00:39:16,400 Speaker 1: can see it going both ways, because, yeah, if you 712 00:39:16,440 --> 00:39:19,000 Speaker 1: think too hard about even just the film analogy, 713 00:39:19,320 --> 00:39:21,440 Speaker 1: you can easily get into discussions of like, okay, well, 714 00:39:21,480 --> 00:39:24,000 Speaker 1: was it the same cast and crew that are producing 715 00:39:24,040 --> 00:39:26,680 Speaker 1: the sequel?
You know, what happens when the budget 716 00:39:26,800 --> 00:39:28,800 Speaker 1: is different, what happens when there are other constraints, what 717 00:39:28,880 --> 00:39:31,040 Speaker 1: happens when suddenly there are a whole bunch of producers 718 00:39:31,520 --> 00:39:33,880 Speaker 1: that have their ideas about what things should be. I mean, 719 00:39:33,880 --> 00:39:36,319 Speaker 1: there are so many different factors to take into account 720 00:39:36,400 --> 00:39:39,160 Speaker 1: with this example that, you know, perhaps doesn't 721 00:39:39,200 --> 00:39:42,200 Speaker 1: bear too close scrutiny, but 722 00:39:42,280 --> 00:39:45,160 Speaker 1: it still, I think, serves as a nice illustration 723 00:39:45,239 --> 00:39:48,279 Speaker 1: of the overall trend that we're talking about here. Well, 724 00:39:48,320 --> 00:39:50,200 Speaker 1: it does bring up the fact, since I mentioned 725 00:39:50,239 --> 00:39:52,600 Speaker 1: athletes, that, you know, I don't know a lot about sports, 726 00:39:52,640 --> 00:39:54,920 Speaker 1: I'm not a big sports fan, but clearly 727 00:39:55,040 --> 00:39:57,319 Speaker 1: regression to the mean is something that has widely been 728 00:39:57,320 --> 00:40:00,440 Speaker 1: applied to the world of sports, for example, in 729 00:40:00,520 --> 00:40:04,760 Speaker 1: the observation that often, after having a really stellar season, 730 00:40:04,920 --> 00:40:08,600 Speaker 1: either an individual athlete or a sports team will be 731 00:40:08,760 --> 00:40:13,279 Speaker 1: perceived to underperform the next season. And again, that very 732 00:40:13,320 --> 00:40:15,640 Speaker 1: well could have something to do with regression to the mean. Like, 733 00:40:15,840 --> 00:40:18,960 Speaker 1: you know, the fact that they're observed having an amazing 734 00:40:19,040 --> 00:40:23,080 Speaker 1: season is actually an outlier.
You're setting your expectations then 735 00:40:23,680 --> 00:40:25,440 Speaker 1: and saying, like, okay, now they're going to be the 736 00:40:25,480 --> 00:40:28,839 Speaker 1: best forever. Just by random fluctuation over time, you would 737 00:40:28,840 --> 00:40:31,399 Speaker 1: expect their next season to probably be not as good 738 00:40:31,440 --> 00:40:34,600 Speaker 1: as the first. I wonder to what extent this 739 00:40:34,719 --> 00:40:38,200 Speaker 1: can be applied to, say, the world of the culinary arts, 740 00:40:38,280 --> 00:40:40,719 Speaker 1: or even just various food crops, like, say, 741 00:40:41,600 --> 00:40:44,200 Speaker 1: selecting a cantaloupe at the grocery store, that sort of thing. 742 00:40:44,960 --> 00:40:46,600 Speaker 1: I mean, I guess it would apply to pretty much 743 00:40:46,600 --> 00:40:49,720 Speaker 1: anything where you're sampling in a series over time, there's 744 00:40:50,000 --> 00:40:54,560 Speaker 1: plenty of random fluctuation in what you're sampling, and the 745 00:40:54,640 --> 00:40:57,080 Speaker 1: first thing you sample is an outlier in some way, 746 00:40:57,200 --> 00:41:00,799 Speaker 1: really good or really bad. If those things are true, then 747 00:41:00,840 --> 00:41:03,840 Speaker 1: you can probably expect you're going to see some regression 748 00:41:04,000 --> 00:41:06,799 Speaker 1: one way or the other. Yeah. Yeah. By the way, 749 00:41:06,840 --> 00:41:09,680 Speaker 1: I was looking around for really stellar examples of 750 00:41:09,719 --> 00:41:14,640 Speaker 1: a sequel film that is widely believed to be rubbish, 751 00:41:14,960 --> 00:41:17,720 Speaker 1: and I think Exorcist II is the primary example. 752 00:41:17,840 --> 00:41:19,640 Speaker 1: Like, you get into some of the other examples that 753 00:41:19,719 --> 00:41:22,920 Speaker 1: pop up, I feel like there's room for disagreement.
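[Editor's note: the conditions the hosts describe here, repeated sampling with random fluctuation, where the first observation happens to be an outlier, can be sketched in a short simulation. This is not from the episode; the skill level, noise spread, and "stellar season" threshold below are made-up illustration values.]

```python
import random

random.seed(42)

TRUE_SKILL = 0.0   # the performer's stable underlying ability (the "mean")
NOISE = 1.0        # spread of random luck in any single season

def season() -> float:
    """One observed outcome: stable skill plus random luck."""
    return random.gauss(TRUE_SKILL, NOISE)

# Run many pairs of consecutive seasons, keeping only the pairs
# where the first season was a standout outlier.
standout_firsts, following_seasons = [], []
for _ in range(100_000):
    first, second = season(), season()
    if first > 1.5:  # arbitrary "stellar season" cutoff
        standout_firsts.append(first)
        following_seasons.append(second)

avg_first = sum(standout_firsts) / len(standout_firsts)
avg_next = sum(following_seasons) / len(following_seasons)

print(f"average standout season:  {avg_first:.2f}")
print(f"average following season: {avg_next:.2f}")
```

Because the second season is drawn independently of the first, the follow-up average lands back near the true skill level of 0.0, while the selected standout seasons average well above 1.5, pure regression to the mean, with no change in underlying ability.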
754 00:41:23,440 --> 00:41:26,160 Speaker 1: For instance, Texas Chainsaw Masker two is one which I 755 00:41:26,239 --> 00:41:29,160 Speaker 1: saw popping up on some of these lists for disappointing sequels. 756 00:41:29,600 --> 00:41:31,879 Speaker 1: But I think that's entirely based on who you ask. 757 00:41:31,960 --> 00:41:34,400 Speaker 1: I think if you ask us, we will agree that 758 00:41:34,920 --> 00:41:37,640 Speaker 1: that that t c M two is is actually a 759 00:41:37,719 --> 00:41:40,200 Speaker 1: great film. It's different from the first one perhaps if 760 00:41:40,239 --> 00:41:42,320 Speaker 1: you go into if you go into part two with 761 00:41:42,440 --> 00:41:45,239 Speaker 1: the expectations you had for part one, you may see 762 00:41:45,280 --> 00:41:48,440 Speaker 1: it as a dip in quality. But depending on what 763 00:41:48,560 --> 00:41:50,080 Speaker 1: else you're bringing to the table, you might see it 764 00:41:50,120 --> 00:41:52,640 Speaker 1: as an increase in in quality or at least or 765 00:41:52,760 --> 00:41:55,280 Speaker 1: something that maybe is different but on par with the original. 766 00:41:55,560 --> 00:41:57,279 Speaker 1: I mean it's certainly not for everybody. I mean, it 767 00:41:57,400 --> 00:42:00,160 Speaker 1: is a it is a gross, disgusting film in in 768 00:42:00,200 --> 00:42:02,560 Speaker 1: a way like the first one, probably even grosser, but 769 00:42:02,640 --> 00:42:07,120 Speaker 1: also a sort of satirical masterpiece. Um. But I just 770 00:42:07,239 --> 00:42:09,279 Speaker 1: had another thought when you said that The Exorcist Too 771 00:42:09,440 --> 00:42:11,520 Speaker 1: is regarded as like one of the best examples of 772 00:42:11,600 --> 00:42:13,800 Speaker 1: a sequel. That's really rubbish. I mean, it makes me 773 00:42:13,920 --> 00:42:17,960 Speaker 1: also wonder about the pretty high estimation critics generally have 774 00:42:18,080 --> 00:42:21,240 Speaker 1: of The Exorcist three. 
Makes me wonder if the effect 775 00:42:21,400 --> 00:42:25,360 Speaker 1: of Exorcist II being so bad actually makes people, 776 00:42:25,400 --> 00:42:28,200 Speaker 1: sort of, you know, they're ready to 777 00:42:28,239 --> 00:42:31,439 Speaker 1: be impressed by The Exorcist III. Yeah. Yeah, I wonder 778 00:42:31,480 --> 00:42:33,560 Speaker 1: if that's the case too, especially when 779 00:42:33,600 --> 00:42:36,040 Speaker 1: you have a situation with a part three coming back 780 00:42:36,160 --> 00:42:40,239 Speaker 1: and restoring some, I don't know, some level of 781 00:42:40,320 --> 00:42:42,840 Speaker 1: quality to a franchise. I mean, there's also the 782 00:42:42,880 --> 00:42:46,040 Speaker 1: Star Trek example, right? I mean, that was long 783 00:42:46,200 --> 00:42:47,919 Speaker 1: held up as an example of like, okay, 784 00:42:47,960 --> 00:42:51,160 Speaker 1: you have your even Star Treks and your odd Star Treks, right? 785 00:42:51,280 --> 00:42:54,080 Speaker 1: And I think you've made a similar case for the 786 00:42:55,040 --> 00:42:57,520 Speaker 1: Fast and Furious movies, right? I mean, once you get 787 00:42:57,560 --> 00:42:59,239 Speaker 1: to a certain point in the series, I think it's 788 00:42:59,320 --> 00:43:02,839 Speaker 1: pretty much all, uh, you know, a nitrous-boosted brain. 789 00:43:02,920 --> 00:43:06,040 Speaker 1: It gets, you know, it's all like, we're driving 790 00:43:06,080 --> 00:43:09,600 Speaker 1: cars in space now and flying and all that. But 791 00:43:09,760 --> 00:43:12,400 Speaker 1: for the earlier ones, yeah, I'd say the odd 792 00:43:12,520 --> 00:43:15,680 Speaker 1: ones are better. Like, three is the first one 793 00:43:15,719 --> 00:43:20,000 Speaker 1: where it really starts getting ludicrously weird. Four is kind 794 00:43:20,040 --> 00:43:23,440 Speaker 1: of a, uh, and then five starts.
Five is when 795 00:43:23,440 --> 00:43:26,160 Speaker 1: The Rock shows up, and then by seven you're 796 00:43:26,200 --> 00:43:29,760 Speaker 1: golden. All right, well, we're gonna go ahead and close 797 00:43:29,840 --> 00:43:31,480 Speaker 1: this one out here. But we'd obviously love to hear 798 00:43:31,520 --> 00:43:36,040 Speaker 1: from everyone about this, about regression toward the mean, just 799 00:43:36,280 --> 00:43:41,279 Speaker 1: in our daily lives, in various scientific studies. Perhaps you 800 00:43:41,360 --> 00:43:43,720 Speaker 1: have thoughts about how this applies to something we've discussed 801 00:43:43,760 --> 00:43:45,960 Speaker 1: on the show in the past, because I know 802 00:43:46,320 --> 00:43:50,719 Speaker 1: we've mentioned regression to the mean in passing before, but 803 00:43:50,800 --> 00:43:53,439 Speaker 1: certainly we've never taken the opportunity to really dive into 804 00:43:53,480 --> 00:43:55,600 Speaker 1: it and explain it like we did today. Yeah, I 805 00:43:55,680 --> 00:43:58,359 Speaker 1: know it's come up in passing, just in us making 806 00:43:58,400 --> 00:44:00,960 Speaker 1: comments here and there about, like, the import of 807 00:44:01,120 --> 00:44:04,440 Speaker 1: randomized trials and control groups and all that. In the meantime, 808 00:44:04,480 --> 00:44:06,279 Speaker 1: if you would like to listen to other episodes of 809 00:44:06,320 --> 00:44:08,920 Speaker 1: Stuff to Blow Your Mind, you will find them wherever 810 00:44:09,040 --> 00:44:11,360 Speaker 1: you get your podcasts. Just look for the Stuff to 811 00:44:11,400 --> 00:44:14,400 Speaker 1: Blow Your Mind podcast feed. We have our core episodes 812 00:44:14,480 --> 00:44:18,560 Speaker 1: on Tuesdays and Thursdays, Artifact episodes on Wednesdays, listener mail 813 00:44:18,600 --> 00:44:20,880 Speaker 1: on Mondays. On Fridays, we do a little bit of 814 00:44:21,000 --> 00:44:22,960 Speaker 1: Weird House Cinema.
That's our time to just talk 815 00:44:23,000 --> 00:44:26,520 Speaker 1: about some sort of strange film and, you know, 816 00:44:26,719 --> 00:44:31,319 Speaker 1: tease apart what makes it strange. Let's see, what else? So, yeah, 817 00:44:31,400 --> 00:44:32,960 Speaker 1: you can go to Stuff to Blow Your Mind dot com; 818 00:44:33,080 --> 00:44:34,840 Speaker 1: that will send you to the iHeart listing for 819 00:44:35,120 --> 00:44:38,400 Speaker 1: our show. And there's a button there for a store, 820 00:44:38,480 --> 00:44:40,440 Speaker 1: if you want to click on that. Well, then you 821 00:44:40,560 --> 00:44:42,680 Speaker 1: can buy some merchandise that has Stuff to Blow Your 822 00:44:42,719 --> 00:44:46,080 Speaker 1: Mind logos and whatnot on it, or perhaps Weird House 823 00:44:46,120 --> 00:44:48,720 Speaker 1: Cinema logos and whatnot on it, and you can get mugs, 824 00:44:48,800 --> 00:44:52,279 Speaker 1: t-shirts, all that kind of stuff, just for fun, 825 00:44:52,400 --> 00:44:55,480 Speaker 1: you know. Or if you want to be all 826 00:44:55,520 --> 00:44:57,480 Speaker 1: business about it, you can make it work to get 827 00:44:57,520 --> 00:45:00,120 Speaker 1: our merch. Now, this is just for fun. Like, 828 00:45:00,320 --> 00:45:03,600 Speaker 1: don't use these clothing items to clothe your nakedness. 829 00:45:03,960 --> 00:45:07,760 Speaker 1: These are just for fun. This is purely extra clothing. Okay, 830 00:45:07,880 --> 00:45:10,520 Speaker 1: this should not be your core clothing. Huge thanks as 831 00:45:10,520 --> 00:45:13,960 Speaker 1: always to our excellent audio producer, Seth Nicholas Johnson.
If 832 00:45:14,000 --> 00:45:15,520 Speaker 1: you would like to get in touch with us with 833 00:45:15,680 --> 00:45:18,040 Speaker 1: feedback on this episode or any other, to suggest a 834 00:45:18,120 --> 00:45:20,080 Speaker 1: topic for the future, or just to say hello, you can 835 00:45:20,200 --> 00:45:22,919 Speaker 1: email us at contact at stuff to blow your mind 836 00:45:23,120 --> 00:45:32,920 Speaker 1: dot com. Stuff to Blow Your Mind is a production of 837 00:45:33,000 --> 00:45:35,600 Speaker 1: iHeartRadio. For more podcasts from iHeartRadio, 838 00:45:35,840 --> 00:45:38,680 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever you 839 00:45:38,680 --> 00:45:39,880 Speaker 1: listen to your favorite shows.