Speaker 1: Welcome to Stuff to Blow Your Mind, a production of iHeartRadio.

Speaker 2: Hello, and welcome to Stuff to Blow Your Mind Listener Mail. My name is Joe McCormick. My co-host Robert Lamb would normally be here with me, but he is out on the day that I'm recording this, so today's episode is going to be solo. We read back listener mail every Monday, so if you would like to get in touch, go ahead and give it a shot. You can reach us at contact at stuff to blow your mind dot com. All types of messages are welcome, but we especially appreciate feedback on recent episodes, if you have something fascinating to add to a topic we have talked about.

I'm going to kick things off today with responses to our series on the illusion of control, which is a psychology concept, a type of cognitive illusion in which, in many cases, people believe they have more control over outcomes than they actually do. And this first message comes from Lauren. Lauren says: Hi, Rob and Joe. This response pertains to the comment made about light switches and cookies in the kitchen.
Yeah, so for context, this was something that came up in part three of the series, when we were talking about magical thinking and its relationship to control heuristics, which are mental shortcuts we use to judge whether we had influence over an outcome or not. And Rob was talking about how even normal, direct physical causation can feel kind of alien and maybe even kind of magical if we don't recall consciously willing an action before doing it. Then the example was absent-mindedly turning on a light when you go into the kitchen at night for a cookie. So you flip the switch as you enter the room, but it could suddenly occur to you: I don't remember intending to do that. The control heuristic doesn't really give you the green check, because you did take the action and the outcome happened, but you don't remember consciously intending the outcome in advance.

Anyway, back to Lauren's message. In the single-room restroom at work, they took out the light switch near the entrance two months ago and replaced it with a flat piece of plastic to cover the electrical gap.
They installed sensor lights that automatically go on when you enter. This irked me for two reasons. First, I didn't like no longer having control over whether the light was on or off, and I felt mildly insulted that my freedom of choice had been taken from me. Did they not trust us to decide? The second reason the whole situation annoys me is the fact that I still lift my hand to turn on the light upon entry every time, even when the lights are flickering their way on as I move my arm. Maybe humans, led by a pervasive, occasionally subconscious desire for control and an inability to cope with the loss or lack of it, delude themselves into believing they have it. Thanks for making an extremely interesting and enjoyable podcast. Lauren.

Well, thank you so much, Lauren. Yeah, as for automatic lights in the bathrooms, I don't know, I can see exactly what you're saying, but I would kind of feel like that's got to be more hygienic, right? You're forced to touch fewer things while you're in the bathroom. I don't know. So, yeah.
The idea that some amount of the illusion of control is driven by wishful thinking or by a desire for control is something that came up casually in these episodes, but was not explored as, like, a major hypothesis in the literature that we were looking at. So the main explanatory models we talked about were Ellen Langer's skill-chance confusion hypothesis and then Suzanne Thompson's favored hypothesis of the imprecise control heuristic, what I was just talking about a minute ago. So remember, the control heuristic was the idea that we usually make judgments of whether we can control an outcome or not by using a shortcut that asks a few questions, and those questions would be: Did I intend an outcome? Did I take an action? Did I get the intended outcome, at least in part or intermittently? And was the action connected in some way to the outcome, most often by temporal proximity? You know, there's a time relationship. And if these criteria are met, we usually think we had control over the outcome.
And this system works fine most of the time, but it can get us really confused when results are mixed or when the connection between action and outcome is ambiguous, especially in contrived, artificial scenarios with hidden mechanisms of causation and manipulative patterns of feedback, like a slot machine. And I think in the episode Rob and I both thought it seemed like the control heuristic explanation made a lot of sense, and there's pretty good evidence for it. But also, it's absolutely true that illusions of control could arise from a combination of different factors, and confusions of the control heuristic might be only one part of the equation. Wishful thinking or a desire for control could play a part as well, and that would dovetail nicely with some experimental findings that we did talk about in the episode, such as the finding that people who are highly motivated to get a certain outcome show more illusions of control over the process of getting that outcome.
So the example we talked about would be: if you enter a lottery and the prize is a sandwich, people who are currently hungry apparently show more illusions of control over the lottery than people who aren't hungry. So I definitely think you're onto something with the hypothesis that in some situations wishful thinking or motivated reasoning could play a part in causing these illusions. And I guess, to invoke a cliche, more research is needed. It'd be good to see more studies on that. Anyway, thank you, Lauren.

All right, this next message is from Anna. Anna says: Hello, Robert, Joe, and JJ. In your first episode about the illusion of control, you talked about the buttons that you press to cross the road. I just wanted to mention the brilliant design of the crosswalk buttons in Australia. They are specially designed for people with different abilities. They're positioned at a level so that people in wheelchairs can press them. The button is a large silver button that is easy to press, so people with arthritis can easily press them.
This is also helpful if your hands are full, because you can press it with your elbow. Also, they make a special noise when it's time to cross. This is for people who are vision impaired. When you're waiting, there's a slow beeping, and when you cross, there is a fast trill of beeps. I think some of these are now implemented in other countries, but from memory, they were invented in Australia. I seem to remember an article praising Australian innovation and inclusiveness that mentioned these buttons.

Yeah, in the States, we have some crosswalks that have setups basically like this, but not all crosswalks are like that. You know, it varies. And the message goes on: I also wanted to mention a tangential topic related to the illusion of control, which is the idea that if you have one piece of good luck, somehow that luck will continue, and you need to harness that luck because it will only last a short time. Like, if you have a particular piece of luck, like you get a very good parking space, someone will say something like, you should go out and buy a lottery ticket.
I guess this could also be chalked up to the idea of good omens, and she specifies, not the excellent book by Neil Gaiman and Terry Pratchett. This can be a good thing. Say, if you're going to a job interview and you park on a street that has the name of your best friend or a particularly lucky-sounding name, you will feel this is a sign. Then you will be more relaxed and confident in the interview, and that will increase your chances of getting the job. Anyway, thanks for all the great work, Anna.

Well, thank you, Anna. Yeah, I totally agree. I think, much like with the beneficial side of the illusion of control, little lucky feelings like this can be a net positive in our lives, even if they are, strictly speaking, illusions and not predictive of external outcomes. So we talked about this a little bit in the episodes, and I would stand by this idea. I think it can be a good thing sometimes for people to wear their lucky charm, even though I don't believe that lucky charms are literally efficacious in changing your luck.
So I don't believe the charm would change the external chances of a good or bad outcome. The thing it probably actually does is help you regulate your emotions, which in turn could actually lead to a better outcome, just like the example you give. So if you go into a job interview more relaxed and confident because you are wearing your lucky charm, this could literally improve the impression you make and your chances of getting the job, even though the lucky charm isn't actually magic. It's just helping you control your own feelings and your own behavior. So I think that's a great point, Anna. Thanks for getting in touch.

This next message, still about the illusion of control, is from Renata. Renata says: Hi, Joe and Rob. One of my job responsibilities is testing software, and the illusion of control crops up all the time in my work. One thing I've encountered several times, and seen others get stuck on too, is trying to figure out the elaborate steps it takes to reproduce a bug when the bug in fact occurs due to factors that are not in my control.
For example, there might be a bug that only happens at a certain interval or after a delay because of system processing time. But what I sometimes end up doing is coming up with totally irrelevant steps that just happen to take the exact amount of time you need to wait to see the bug. Another example is similar to an elevator door close button, where we may introduce a button knowing that it doesn't do much, but it makes users feel more in control. For example, a button that refreshes information even though it automatically refreshes every few seconds. The illusion of control also reminds me of the self-help industry. Manifestation and the law of attraction prey on a combination of the illusion of control and confirmation bias. I think the authors you cited who critiqued the illusion of control are probably right that, as humans, we are generally bad at knowing the probability of an outcome, and therefore we estimate our control incorrectly.
But the illusion of control is apparently much easier to exploit than the illusion of non-control, so it ends up being a bias we see in the real world that causes harm more often. And then Renata ends the message with some kind words about the show and by attaching a picture of a couple of pet rats. I gotta say, these rats are very cute. In the picture, they're kind of smooshed together on top of each other, so I'm trying to figure out: are they, like, cuddling, or are they climbing over one another to get your attention? Unclear.

Let's see, regarding your comments about illusions of control. So with illusions of control and beliefs about manifestation or the law of attraction or other types of magical thinking, this got me wondering where the line is between the normal, healthy, empowering optimism and feelings of self-efficacy that we talked about, which may in some cases even encompass what are objectively illusions, maybe thinking you have a bit more control over an outcome than you actually do. Where's the line between that and these beliefs in things like manifestation, or whatever you would call a theory of literal psychic control over outcomes? In most cases, I would personally judge the latter to be not just mistaken but fairly toxic, associated with the just world fallacy, just generally bad epistemic values, and connections to other false belief formation and so forth. Maybe it's just a matter of degree, like how strong are your illusions of control? But I'm a little bit tempted to think there's a qualitative difference here and not just a quantitative one. I kind of wonder if it is literally the presence of a model or a concept, like manifestation or the law of attraction or whatever it is, an externally, socially validated model that explains and justifies the workings of this principle, and the existence of this socially reinforced model can turn normal, healthy, motivating, empowering minor illusions of control into full-blown delusions with all of these toxic consequences.
I don't know, that's just a guess, and I'd be curious what listeners think. Like, how do you go from minor illusions of control that are mostly harmless, you know, for the most part, and can help us achieve our goals, how do you go from that to thinking that you can literally will yourself to win the lottery if you just focus and stay positive? I think that's actually a very consequential question.

As for the thing about software testing, I don't know if I've ever considered this before, but yeah, software testing and bug hunting could function like a really devious Skinner box, where the trigger for the bug is not clear and consistent. It's been a while since I read the history of the operant conditioning research, so I hope I'm remembering this correctly. But what I recall is that conditioning is pretty straightforward when the reinforcement is consistent. So the rat presses a button, it gets a food pellet. It's the same type of food pellet every time. It learns to press the button when hungry.
But the variation was that you could really drive test animals crazy and make them addicted to the conditioned behavior by inserting some randomness and unpredictability into the reward allocation, giving us the notion of intermittent variable rewards, which are little rewards, little fleeting bursts of pleasure, that come at variable time intervals or variable levels of quality, with a somewhat ambiguous relationship to your inputs. So it's not one hundred percent predictable which of your inputs will produce the best reward and with what reliability. And intermittent variable rewards are what you get on slot machines or on your social media app of choice. And we've talked about this on the show before. It's been several years now, but this is one of the reasons I had to pretty much completely get off of social media years ago. I know it works for some people, but for me, I realized it was just way too addictive, way too easy for it to completely monopolize my time and attention. And while it had control of my attention, it was busy rapidly installing malware on my brain.
So I don't want to be too preachy as the anti social media guy, and I'm sure lots of people do genuinely have a healthier relationship with it, but I had to get off, and I'm extremely glad I did. And if you find yourself having the same kind of relationship with it that I did, I would recommend the same. But anyway, thank you so much, Renata, for a very interesting email. A lot to think about.

All right, this next message is from Calvin. Calvin says: Hello, Rob, Joe, and JJ. Greetings from Chicago. I've been listening to the show daily for several years, starting with the episode on the First Monster, about the Löwenmensch. Yeah, it's been a while since we did that episode. That was where we talked about this truly fascinating thirty-five-thousand-year-old mammoth ivory artifact.
It's a little statuette from Paleolithic Germany called the Löwenmensch, which means lion man or lion person, and we were talking about it because it appears to depict a human body with a lion head. And at the time, this was the earliest or one of the earliest known artistic depictions of a creature not existing in nature, and this led to us talking more generally about the idea of when and how it was that humans first imagined not just existing predators, but synthetic monsters, predators purely of the imagination, based often on combinations of pieces of anatomy from other creatures. I should flag that I think I've read in the meantime since that episode that there's some dispute about whether the Löwenmensch is best interpreted as a human body with a lion head, or whether maybe it's supposed to depict something else, like a bear or something more strictly found in nature. Don't know the answer there, but whatever this particular artifact is based on, it's obviously not the first time in history some human imagined a monster. It would just be, like, our first extant piece of evidence for it.
And the historical emergence of an imaginative capacity to dream up nonexistent beings and creatures is a really, really interesting subject to me. Like, where does that capacity come from? What powers it?

Anyway, back to Calvin's message. Calvin says: You've given me so much great content that helped me get through many boring work days. Glad to hear that, Calvin. So Calvin says: I've never really felt like I had anything to contribute or write in about until I was listening to the Illusion of Control series. My experience with the illusion of control comes from playing the original Pokemon Red, Blue, and Green games as a child. It was a well-known, quote, fact among my friend group that if you threw a Poke Ball and pressed and held the B button at the precise moment the Poke Ball hit, you would greatly increase the chances of a successful capture. Now I know the chances of catching a Pokemon are really just a dice roll with modifiers for various statistics of the Pokemon.
But at the time, I pressed B every time, and if the Pokemon broke out of the ball, it was because I didn't time it right. Even after realizing that this wasn't really part of the game mechanic, I still pressed B every time for a long while, even when playing later-generation games. It just never felt right to not press it. It was like I just couldn't leave it to chance if there was even the slightest possibility that I was having some kind of impact. Anyway, I just want to thank you for all you do and for giving me a great list of movies from Weird House Cinema. I know a lot of people prefer to watch the movie and then listen to the episode, but I like to listen first and then watch.

Well, thank you so much, Calvin. Yeah, the Pokemon thing is a great example. Again, it's like, when the mechanisms and the feedback are ambiguous, there's so much room to think we're exerting control when we're not. So yeah, thanks, Calvin.

Okay, this next message comes from Cindy. It is about our Vault episodes on heart removal and heart burial.
Cindy says, Hi, 326 00:20:03,119 --> 00:20:05,800 Speaker 2: Robert and Joe, I thought you'd get a kick out 327 00:20:05,800 --> 00:20:08,879 Speaker 2: of this. I listened to part one of Because It 328 00:20:08,960 --> 00:20:12,360 Speaker 2: Is My Heart on February tenth and remembered that it 329 00:20:12,440 --> 00:20:16,119 Speaker 2: was an earlier aired two part series. So I searched 330 00:20:16,119 --> 00:20:18,800 Speaker 2: for part two to listen to it again and noticed 331 00:20:18,920 --> 00:20:22,280 Speaker 2: that I had never finished. I wondered aloud, why did 332 00:20:22,320 --> 00:20:25,280 Speaker 2: I stop partway through the episode? I pressed play, 333 00:20:25,400 --> 00:20:28,680 Speaker 2: and I was immediately transported back to that fateful day 334 00:20:29,200 --> 00:20:32,720 Speaker 2: almost a year ago exactly, when the vivid description of 335 00:20:32,800 --> 00:20:39,280 Speaker 2: the mos Teutonicus funerary process caused me to press stop. Yeah. So, 336 00:20:39,359 --> 00:20:41,560 Speaker 2: for those who don't remember, and Cindy, I'm very sorry 337 00:20:41,640 --> 00:20:44,520 Speaker 2: to subject you to this a third time. The mos 338 00:20:44,520 --> 00:20:48,560 Speaker 2: Teutonicus translates to "the German custom," and it was a 339 00:20:48,560 --> 00:20:52,440 Speaker 2: solution used in medieval Europe for transporting dead bodies back 340 00:20:52,480 --> 00:20:56,240 Speaker 2: home from faraway lands. If, for example, a rich 341 00:20:56,359 --> 00:21:01,040 Speaker 2: German warrior died while away in the Crusades, untreated of course, 342 00:21:01,080 --> 00:21:04,560 Speaker 2: the body would be subject to horrible decomposition during travel, 343 00:21:04,960 --> 00:21:08,560 Speaker 2: so the German custom was to remove the flesh from 344 00:21:08,760 --> 00:21:13,199 Speaker 2: the bones before transport back home, sometimes by boiling. 
I 345 00:21:13,240 --> 00:21:17,040 Speaker 2: think we called this making a crusader bone broth in 346 00:21:17,080 --> 00:21:19,720 Speaker 2: the episode, and then you would of course take the 347 00:21:19,720 --> 00:21:23,320 Speaker 2: clean bones home to the family crypt. Cindy's message goes 348 00:21:23,359 --> 00:21:26,000 Speaker 2: on: This time around, I only had to take a 349 00:21:26,000 --> 00:21:28,919 Speaker 2: break once, but managed to power through the rest of 350 00:21:28,920 --> 00:21:31,679 Speaker 2: the episode. I can't say it was extremely enjoyable, but 351 00:21:31,800 --> 00:21:36,359 Speaker 2: it was very enlightening and interesting. Smiley face emoji. I'd 352 00:21:36,600 --> 00:21:39,440 Speaker 2: choose to be boiled in one of my favorite liquids, 353 00:21:39,600 --> 00:21:43,359 Speaker 2: maple syrup, if that were an option. Thanks for covering 354 00:21:43,400 --> 00:21:45,520 Speaker 2: all sorts of topics, even the ones that make us 355 00:21:45,560 --> 00:21:50,560 Speaker 2: mentally squirm. Keep up the excellent work. Cheers, Cindy. Well, 356 00:21:50,600 --> 00:21:53,720 Speaker 2: thank you so much, Cindy. Does maple syrup boil, or 357 00:21:53,760 --> 00:21:56,480 Speaker 2: would it start to kind of like cook and caramelize? 358 00:21:56,720 --> 00:21:58,720 Speaker 2: It's got a lot of sugar. I don't know how 359 00:21:58,720 --> 00:22:02,760 Speaker 2: that works. Uh yeah, thank you, Cindy. Okay, this very 360 00:22:02,840 --> 00:22:06,760 Speaker 2: last message is about Weird House Cinema. It's from Andy. 361 00:22:11,160 --> 00:22:14,159 Speaker 2: Andy says, Hi, Robert and Joe, loved the show, and 362 00:22:14,240 --> 00:22:17,280 Speaker 2: as always, thank you and your excellent audio producer JJ 363 00:22:17,440 --> 00:22:19,560 Speaker 2: for all the work that goes into it. 
Your Weird 364 00:22:19,600 --> 00:22:22,439 Speaker 2: House Cinema episode on Flash Gordon took me back to 365 00:22:22,480 --> 00:22:25,560 Speaker 2: my youth, where I first saw the film on my 366 00:22:25,680 --> 00:22:29,560 Speaker 2: dad's pirated Betamax cassette. I hadn't watched it in probably 367 00:22:29,640 --> 00:22:32,320 Speaker 2: thirty years until you inspired me to seek it out again. 368 00:22:33,040 --> 00:22:35,280 Speaker 2: Just a few minutes in, I saw a familiar face 369 00:22:35,359 --> 00:22:40,600 Speaker 2: which escaped your analysis. Munson, Doctor Zarkov's beleaguered and rationally 370 00:22:40,640 --> 00:22:45,120 Speaker 2: fearful associate, was played by William Hootkins, best known as 371 00:22:45,400 --> 00:22:49,600 Speaker 2: Jek Porkins, hero of the Rebellion and the first casualty 372 00:22:49,720 --> 00:22:53,400 Speaker 2: of the audacious assault on the Death Star trench. Though 373 00:22:53,440 --> 00:22:56,320 Speaker 2: I'm not happy about it, I disagreed with your reading 374 00:22:56,400 --> 00:22:58,720 Speaker 2: of the plane crash scene. This is in Flash Gordon. 375 00:22:59,119 --> 00:23:02,120 Speaker 2: It seems clear to me that Munson does not escape, 376 00:23:02,320 --> 00:23:05,040 Speaker 2: and is indeed crushed to death by the airplane as 377 00:23:05,080 --> 00:23:09,280 Speaker 2: it crashes through his laboratory. I suppose it's his cinematic 378 00:23:09,320 --> 00:23:14,080 Speaker 2: penance for failing to board Doctor Zarkov's dicey homebrew rocket. Flash 379 00:23:14,119 --> 00:23:16,280 Speaker 2: and Dale must not have noticed him, or else they'd 380 00:23:16,280 --> 00:23:19,520 Speaker 2: have been far more distraught. Doctor Zarkov, however, is so 381 00:23:19,680 --> 00:23:22,479 Speaker 2: consumed by his need to see his mission fulfilled that 382 00:23:22,520 --> 00:23:25,240 Speaker 2: he seems not to care. 
All in all, it seems 383 00:23:25,280 --> 00:23:27,960 Speaker 2: an ignominious end for one who gave his life to 384 00:23:28,040 --> 00:23:31,159 Speaker 2: the cause of galactic freedom. Thanks again, and keep up 385 00:23:31,160 --> 00:23:35,240 Speaker 2: the good work. Andy. Thank you for bringing this up, Andy. Yeah, 386 00:23:35,280 --> 00:23:37,120 Speaker 2: I didn't even think about the actor. Though he did 387 00:23:37,160 --> 00:23:40,320 Speaker 2: look familiar to me, I didn't make the connection. But 388 00:23:40,400 --> 00:23:42,399 Speaker 2: now that I had a chance to go back and 389 00:23:42,440 --> 00:23:45,320 Speaker 2: peek at his filmography, I realized that William Hootkins is 390 00:23:45,480 --> 00:23:48,720 Speaker 2: not only Porkins from the original Star Wars. He was 391 00:23:48,760 --> 00:23:50,760 Speaker 2: also in Raiders of the Lost Ark. In one of 392 00:23:50,800 --> 00:23:55,080 Speaker 2: my favorite scenes, he's one of the two Army intelligence 393 00:23:55,119 --> 00:23:58,719 Speaker 2: officers who sort of come to give Indy his mission 394 00:23:58,760 --> 00:24:01,240 Speaker 2: to retrieve the Ark at the beginning of the movie. 395 00:24:01,440 --> 00:24:03,720 Speaker 2: I've said this on the show before, but that scene 396 00:24:03,800 --> 00:24:07,600 Speaker 2: is so good. I think, specifically from a screenwriting and 397 00:24:07,640 --> 00:24:10,960 Speaker 2: filmmaking point of view, it's one of the most efficient 398 00:24:11,560 --> 00:24:15,560 Speaker 2: and powerful exposition scenes in movie history. So it's like 399 00:24:15,920 --> 00:24:19,879 Speaker 2: very short, but it quickly and convincingly tells us so 400 00:24:19,960 --> 00:24:22,920 Speaker 2: much about the characters, about the significance of the Ark, 401 00:24:23,640 --> 00:24:27,080 Speaker 2: the stakes of the coming conflict. 
It infuses the whole 402 00:24:27,119 --> 00:24:30,560 Speaker 2: story with a sense of mystery and power and magic 403 00:24:30,680 --> 00:24:33,919 Speaker 2: and all that. It's a great, great scene. But also, I 404 00:24:33,960 --> 00:24:38,320 Speaker 2: think William Hootkins, later in the movie, 405 00:24:38,359 --> 00:24:40,760 Speaker 2: is the guy at the end who assures us that 406 00:24:40,920 --> 00:24:44,600 Speaker 2: further investigation of the Ark has been assigned to top men. 407 00:24:45,560 --> 00:24:47,119 Speaker 2: All right, I think that's going to be it for 408 00:24:47,240 --> 00:24:49,720 Speaker 2: today's mail bag. Thank you so much to everyone who 409 00:24:49,720 --> 00:24:52,800 Speaker 2: wrote in. If you are new to the show, Stuff 410 00:24:52,840 --> 00:24:55,600 Speaker 2: to Blow Your Mind is primarily a science and culture 411 00:24:55,680 --> 00:25:00,240 Speaker 2: podcast with core episodes on Tuesdays and Thursdays. On Mondays, 412 00:25:00,320 --> 00:25:04,040 Speaker 2: we read back listener mail in episodes like this. On Wednesdays, 413 00:25:04,080 --> 00:25:07,600 Speaker 2: we have a short form scripted podcast called The Artifact 414 00:25:07,720 --> 00:25:10,640 Speaker 2: or The Monster Fact, or maybe even there are new 415 00:25:10,680 --> 00:25:14,760 Speaker 2: forms emerging. On Fridays, we do a special series called 416 00:25:14,920 --> 00:25:19,040 Speaker 2: Weird House Cinema, where we just watch and discuss weird movies, 417 00:25:19,119 --> 00:25:21,800 Speaker 2: good or bad, well known or obscure. We do them 418 00:25:21,800 --> 00:25:26,400 Speaker 2: all. Huge thanks as always to our excellent audio producer 419 00:25:26,600 --> 00:25:29,160 Speaker 2: JJ Posway. 
If you would like to get in touch 420 00:25:29,160 --> 00:25:31,360 Speaker 2: with us with feedback on this episode or any other, 421 00:25:31,440 --> 00:25:33,679 Speaker 2: to suggest a topic for the future, or just to 422 00:25:33,720 --> 00:25:37,040 Speaker 2: say hello, you can email us at contact at stuff 423 00:25:37,080 --> 00:25:44,840 Speaker 2: to Blow your Mind dot com. 424 00:25:44,920 --> 00:25:47,879 Speaker 1: Stuff to Blow Your Mind is a production of iHeartRadio. For 425 00:25:47,960 --> 00:25:50,760 Speaker 1: more podcasts from iHeartRadio, visit the iHeartRadio app, 426 00:25:50,920 --> 00:25:53,679 Speaker 1: Apple Podcasts, or wherever you listen to your favorite shows.