1 00:00:03,040 --> 00:00:05,280 Speaker 1: Welcome to Stuff to Blow Your Mind, a production of 2 00:00:05,320 --> 00:00:14,080 Speaker 1: iHeartRadio. Hey, welcome to Stuff to Blow 3 00:00:14,120 --> 00:00:17,280 Speaker 1: Your Mind. My name is Robert Lamb, and I'm Joe McCormick. 4 00:00:17,360 --> 00:00:21,200 Speaker 1: And today we're going to be talking about choices and preferences, 5 00:00:21,280 --> 00:00:24,279 Speaker 1: and I wanted to start off by looking at what 6 00:00:24,360 --> 00:00:27,760 Speaker 1: I think is one of the most commonly misunderstood poems 7 00:00:27,760 --> 00:00:32,040 Speaker 1: in English literature. It's a classic most Americans already know. 8 00:00:32,280 --> 00:00:34,360 Speaker 1: You probably read it at some point in high school 9 00:00:34,479 --> 00:00:38,000 Speaker 1: or even earlier. But it's an interesting poem because I 10 00:00:38,040 --> 00:00:42,080 Speaker 1: think it usually gets interpreted to mean the exact opposite 11 00:00:42,159 --> 00:00:45,159 Speaker 1: of what it actually means. All right, so this is 12 00:00:45,200 --> 00:00:47,720 Speaker 1: "The Road Not Taken" by Robert Frost. Are you ready 13 00:00:47,720 --> 00:00:50,199 Speaker 1: to hear it again? Yeah? Yeah. This is always a 14 00:00:50,240 --> 00:00:52,959 Speaker 1: pleasure to hear or to read, even though it's 15 00:00:52,960 --> 00:00:57,160 Speaker 1: one that I think we're all hit with probably 16 00:00:57,280 --> 00:00:59,920 Speaker 1: at the elementary school level. You know, I feel like 17 00:01:00,120 --> 00:01:03,920 Speaker 1: I came to a greater appreciation of just the music 18 00:01:04,040 --> 00:01:07,240 Speaker 1: of Robert Frost's poems as an adult than I 19 00:01:07,240 --> 00:01:09,720 Speaker 1: had for them when I was in school. So I'm 20 00:01:09,720 --> 00:01:12,280 Speaker 1: not sure exactly what changed there.
Maybe I became grumpier 21 00:01:12,319 --> 00:01:16,039 Speaker 1: and he was quite a grump himself. But here 22 00:01:16,080 --> 00:01:19,680 Speaker 1: we go. Two roads diverged in a yellow wood, and 23 00:01:19,840 --> 00:01:23,000 Speaker 1: sorry I could not travel both and be one traveler, 24 00:01:23,400 --> 00:01:26,160 Speaker 1: long I stood and looked down one as far as 25 00:01:26,160 --> 00:01:29,759 Speaker 1: I could to where it bent in the undergrowth. Then 26 00:01:29,840 --> 00:01:33,319 Speaker 1: took the other, as just as fair, and having perhaps 27 00:01:33,360 --> 00:01:36,320 Speaker 1: the better claim, because it was grassy and wanted wear, 28 00:01:36,959 --> 00:01:39,959 Speaker 1: though as for that, the passing there had worn them 29 00:01:40,000 --> 00:01:44,200 Speaker 1: really about the same. And both that morning equally lay 30 00:01:44,240 --> 00:01:48,080 Speaker 1: in leaves no step had trodden black. Oh, I kept 31 00:01:48,120 --> 00:01:51,600 Speaker 1: the first for another day, yet, knowing how way leads 32 00:01:51,600 --> 00:01:54,320 Speaker 1: on to way, I doubted if I should ever come back. 33 00:01:55,080 --> 00:01:58,280 Speaker 1: I shall be telling this with a sigh somewhere ages 34 00:01:58,320 --> 00:02:02,040 Speaker 1: and ages hence: two roads diverged in a wood, and 35 00:02:02,160 --> 00:02:05,440 Speaker 1: I, I took the one less traveled by, and that 36 00:02:05,520 --> 00:02:09,240 Speaker 1: has made all the difference. It's a beautiful poem, it 37 00:02:09,320 --> 00:02:11,920 Speaker 1: really is. But one of the things that is 38 00:02:11,960 --> 00:02:17,560 Speaker 1: really funny is that I think people usually interpret this 39 00:02:17,639 --> 00:02:23,680 Speaker 1: poem as a sort of celebration of unique individuality and 40 00:02:23,720 --> 00:02:27,240 Speaker 1: a celebration of going your own way.
It's about 41 00:02:27,240 --> 00:02:30,600 Speaker 1: how if you go boldly where others have not gone before, 42 00:02:31,080 --> 00:02:34,760 Speaker 1: if you remain your unique, authentic self and choose the 43 00:02:34,840 --> 00:02:39,040 Speaker 1: stranger path, you'll be rewarded with a life of unique meaning. 44 00:02:39,200 --> 00:02:42,080 Speaker 1: But if you read it closely, I think the poem 45 00:02:42,160 --> 00:02:44,960 Speaker 1: is meant to be a quite ironic sort of parry 46 00:02:45,120 --> 00:02:49,760 Speaker 1: against exactly that way of thinking, because what happens in it? Well, 47 00:02:49,800 --> 00:02:52,240 Speaker 1: the speaker comes to a fork in the road. The 48 00:02:52,280 --> 00:02:56,240 Speaker 1: speaker evaluates each path for a bit, at 49 00:02:56,280 --> 00:02:58,720 Speaker 1: first thinks one is more traveled than the other, but 50 00:02:58,800 --> 00:03:02,079 Speaker 1: then ultimately realizes that they're about the same. Then 51 00:03:02,240 --> 00:03:06,000 Speaker 1: takes one road rather than the other for no major reason; 52 00:03:06,080 --> 00:03:10,040 Speaker 1: they are in reality pretty much indistinguishable. Then thinks about 53 00:03:10,080 --> 00:03:13,519 Speaker 1: how later in life he'll be claiming that he took 54 00:03:13,560 --> 00:03:16,959 Speaker 1: the bold, untraveled path and that it changed his life, 55 00:03:17,080 --> 00:03:20,000 Speaker 1: even though that wasn't true. Yeah, I feel like that's 56 00:03:20,040 --> 00:03:22,480 Speaker 1: something that a lot of people miss out on in 57 00:03:22,520 --> 00:03:24,120 Speaker 1: the poem, and I think a lot of it sometimes 58 00:03:24,200 --> 00:03:28,080 Speaker 1: comes down to the discussions about what is he 59 00:03:28,120 --> 00:03:30,440 Speaker 1: actually talking about. And people get very wrapped up in that, 60 00:03:30,520 --> 00:03:32,320 Speaker 1: like what was the choice?
No, no, no, not the 61 00:03:32,360 --> 00:03:34,840 Speaker 1: walk in the woods? What were you actually talking about, Frost? 62 00:03:35,560 --> 00:03:38,720 Speaker 1: And then you kind of end up ignoring the 63 00:03:38,720 --> 00:03:41,280 Speaker 1: mechanics of it that you're talking about here. Well, yeah, 64 00:03:41,360 --> 00:03:43,520 Speaker 1: because I think this is in a way a sort 65 00:03:43,560 --> 00:03:46,520 Speaker 1: of an image poem that can be applied to many 66 00:03:46,560 --> 00:03:48,840 Speaker 1: different types of choices one makes in life. Though I 67 00:03:48,880 --> 00:03:51,480 Speaker 1: think it was literally inspired by him walking in the 68 00:03:51,520 --> 00:03:53,920 Speaker 1: woods in New England. I'm not positive about that, but 69 00:03:54,320 --> 00:03:56,760 Speaker 1: I think I've read that before. But yeah. So it's 70 00:03:56,840 --> 00:04:00,400 Speaker 1: essentially a poem about a person who chose at 71 00:04:00,520 --> 00:04:05,800 Speaker 1: random between two, at the time, pretty much indistinguishable options, 72 00:04:06,360 --> 00:04:09,360 Speaker 1: and then comes up later with an ex post facto 73 00:04:09,480 --> 00:04:13,120 Speaker 1: justification for his choice, that it was one made 74 00:04:13,240 --> 00:04:16,599 Speaker 1: out of daring and authentic principle, and that 75 00:04:16,640 --> 00:04:20,000 Speaker 1: it was deeply meaningful. And I really like this ironic 76 00:04:20,040 --> 00:04:23,880 Speaker 1: interpretation because it raises a number of really interesting questions 77 00:04:23,880 --> 00:04:27,120 Speaker 1: about human nature. So, first of all, isn't so much 78 00:04:27,120 --> 00:04:31,120 Speaker 1: of life like this? We do make life changing decisions 79 00:04:31,480 --> 00:04:34,800 Speaker 1: without knowing what the outcome will be. The options 80 00:04:34,800 --> 00:04:37,760 Speaker 1: in front of us might look indistinguishable.
At the time 81 00:04:38,040 --> 00:04:41,200 Speaker 1: you choose between two job opportunities, you can't really tell 82 00:04:41,240 --> 00:04:44,440 Speaker 1: that one is necessarily better than the other. But then 83 00:04:44,760 --> 00:04:48,080 Speaker 1: later you will have had much of your life 84 00:04:48,279 --> 00:04:51,560 Speaker 1: developed on the basis of whichever choice you made, and 85 00:04:51,600 --> 00:04:53,360 Speaker 1: you have to come up with a narrative of your 86 00:04:53,400 --> 00:04:56,680 Speaker 1: life story that makes sense of that choice in light 87 00:04:56,680 --> 00:05:00,640 Speaker 1: of its later unpredictable significance. And obviously, when you 88 00:05:00,680 --> 00:05:02,720 Speaker 1: do this, a lot of times you're gonna end up 89 00:05:02,760 --> 00:05:06,159 Speaker 1: remembering the choice differently than it was in your mind 90 00:05:06,200 --> 00:05:09,040 Speaker 1: when you made it. But then it also raises an 91 00:05:09,040 --> 00:05:12,800 Speaker 1: interesting question about decision making in the moment. When there 92 00:05:12,800 --> 00:05:15,880 Speaker 1: are two options that are pretty much the same, we 93 00:05:15,880 --> 00:05:18,520 Speaker 1: often have to form a preference for one or 94 00:05:18,560 --> 00:05:21,840 Speaker 1: the other. Now, there are plenty of cases where you 95 00:05:21,880 --> 00:05:25,560 Speaker 1: can quite clearly see why you'd prefer one option over another. 96 00:05:25,600 --> 00:05:29,120 Speaker 1: But in cases where that's not true, in the absence 97 00:05:29,160 --> 00:05:33,240 Speaker 1: of the obvious superiority of one option over another, where 98 00:05:33,240 --> 00:05:36,640 Speaker 1: do our preferences arise from? Why do we decide we 99 00:05:36,800 --> 00:05:39,320 Speaker 1: like the left path rather than the right path if 100 00:05:39,320 --> 00:05:42,400 Speaker 1: they look about the same?
And for the purpose of 101 00:05:42,440 --> 00:05:45,520 Speaker 1: today's episode, I want to expand beyond thinking about paths 102 00:05:45,560 --> 00:05:48,840 Speaker 1: in the woods or big life decisions when it comes 103 00:05:48,880 --> 00:05:52,440 Speaker 1: to the formation of any preferences, even extremely minor ones. 104 00:05:52,520 --> 00:05:55,800 Speaker 1: You know, you choose between two basically equivalent brands of 105 00:05:56,279 --> 00:05:59,599 Speaker 1: blender at the store. Why do we like the things 106 00:05:59,640 --> 00:06:02,440 Speaker 1: that we like? Why do we have the preferences that 107 00:06:02,520 --> 00:06:06,000 Speaker 1: we have? I'm probably gonna refer back to the Black 108 00:06:06,080 --> 00:06:09,400 Speaker 1: Mirror movie Bandersnatch a 109 00:06:09,440 --> 00:06:11,680 Speaker 1: lot in this one. We did an episode about it 110 00:06:11,760 --> 00:06:15,039 Speaker 1: last year, breaking down, you know, the nature of choice 111 00:06:15,120 --> 00:06:18,000 Speaker 1: and free will and all. But like, I instantly think 112 00:06:18,040 --> 00:06:21,039 Speaker 1: about the early stages of Bandersnatch, where, as 113 00:06:21,080 --> 00:06:23,919 Speaker 1: you do this choose-your-own-adventure media, you have 114 00:06:24,000 --> 00:06:26,880 Speaker 1: to choose which cereal the main character is going to 115 00:06:26,960 --> 00:06:30,080 Speaker 1: have for breakfast, and you know, ultimately it doesn't really 116 00:06:30,120 --> 00:06:34,719 Speaker 1: matter in the context of that story. 117 00:06:34,800 --> 00:06:37,240 Speaker 1: And it's more about just teaching the mechanics of 118 00:06:37,360 --> 00:06:42,240 Speaker 1: choice within this, you know, computer narrative.
But 119 00:06:42,279 --> 00:06:44,760 Speaker 1: it's interesting that you still have to exert a certain 120 00:06:44,800 --> 00:06:48,120 Speaker 1: amount of mental energy to make that choice, to decide 121 00:06:48,160 --> 00:06:50,760 Speaker 1: this cereal over that one. And, and this will tie into 122 00:06:50,760 --> 00:06:52,400 Speaker 1: something we'll talk about in just 123 00:06:52,440 --> 00:06:54,280 Speaker 1: a minute here, it's interesting how, at least for me, 124 00:06:54,960 --> 00:06:57,360 Speaker 1: those early choices are kind of uncomfortable when you have 125 00:06:57,400 --> 00:06:59,279 Speaker 1: to pick the cereal or you have to pick the 126 00:06:59,279 --> 00:07:01,960 Speaker 1: record or something, and you don't have a natural 127 00:07:02,080 --> 00:07:05,120 Speaker 1: strong preference one way or another. You've got this kind 128 00:07:05,160 --> 00:07:09,120 Speaker 1: of weird anxiety that lingers after your choice, like, I 129 00:07:09,160 --> 00:07:11,640 Speaker 1: don't know, did I pick the right one? Yeah, because 130 00:07:11,720 --> 00:07:13,960 Speaker 1: later on you can definitely make a call like, okay, 131 00:07:13,960 --> 00:07:16,600 Speaker 1: this is the more dramatic choice, or, well, this is 132 00:07:16,640 --> 00:07:19,280 Speaker 1: the moral choice. But in choosing 133 00:07:19,280 --> 00:07:22,200 Speaker 1: the two cereals, aside from maybe health concerns about the 134 00:07:22,200 --> 00:07:24,880 Speaker 1: sugary cereal versus the other cereal, there's not as much 135 00:07:24,920 --> 00:07:28,560 Speaker 1: to go on, right.
So one of the main things 136 00:07:28,640 --> 00:07:31,080 Speaker 1: I want to talk about in this episode today 137 00:07:31,280 --> 00:07:33,920 Speaker 1: is a really interesting fact that's been observed in a 138 00:07:33,920 --> 00:07:37,240 Speaker 1: bunch of psychology studies over the years, and I'm gonna 139 00:07:37,240 --> 00:07:39,640 Speaker 1: look at an early one from the nineteen fifties in 140 00:07:39,680 --> 00:07:43,040 Speaker 1: just a minute here. We often assume that our preferences 141 00:07:43,160 --> 00:07:46,960 Speaker 1: are what determine our choices. I pick this option instead 142 00:07:46,960 --> 00:07:49,720 Speaker 1: of that because I like it better. But there is 143 00:07:49,760 --> 00:07:54,000 Speaker 1: also significant evidence, here's your antimetabole, that our choices 144 00:07:54,320 --> 00:07:59,680 Speaker 1: determine our preferences. I like this option because I picked it. 145 00:07:59,680 --> 00:08:02,280 Speaker 1: And one of the big early studies here, a classic study 146 00:08:02,680 --> 00:08:05,200 Speaker 1: that was in the Journal of Abnormal and Social Psychology 147 00:08:05,240 --> 00:08:08,840 Speaker 1: in nineteen fifty six by Jack W. Brehm, is called 148 00:08:09,120 --> 00:08:14,640 Speaker 1: "Postdecision Changes in the Desirability of Alternatives." So again, 149 00:08:14,680 --> 00:08:18,160 Speaker 1: this is by the American psychologist Jack W. Brehm. Brehm 150 00:08:18,200 --> 00:08:22,120 Speaker 1: had been a student of the highly influential American social 151 00:08:22,120 --> 00:08:26,640 Speaker 1: psychologist Leon Festinger, who is probably best known for developing 152 00:08:26,680 --> 00:08:30,280 Speaker 1: the theory of cognitive dissonance. Now, this is a term 153 00:08:30,320 --> 00:08:32,720 Speaker 1: you've probably all heard before, but a lot of people 154 00:08:33,000 --> 00:08:36,679 Speaker 1: don't know the experimental history surrounding it.
So the simple 155 00:08:36,800 --> 00:08:40,360 Speaker 1: version is that cognitive dissonance is the state of holding 156 00:08:40,559 --> 00:08:46,880 Speaker 1: contradictory beliefs or values, or contradictions between your beliefs and 157 00:08:46,920 --> 00:08:52,960 Speaker 1: your values and your actions, observing these contradictions within yourself simultaneously. 158 00:08:53,600 --> 00:08:56,920 Speaker 1: So one example that's very often cited is knowing that 159 00:08:57,040 --> 00:09:01,560 Speaker 1: smoking cigarettes is harmful to your health, but smoking them anyway. 160 00:09:01,920 --> 00:09:04,440 Speaker 1: But there can be all kinds of cognitive dissonance. 161 00:09:04,440 --> 00:09:07,480 Speaker 1: Our life is just full of cognitive dissonance. 162 00:09:07,760 --> 00:09:10,800 Speaker 1: You know, you believe that your spouse is a good person, 163 00:09:11,240 --> 00:09:13,679 Speaker 1: but you also know that they did something wrong. You 164 00:09:14,000 --> 00:09:16,240 Speaker 1: know that they stole money out of the church collection 165 00:09:16,280 --> 00:09:18,920 Speaker 1: plate or something. I think one that's probably very common 166 00:09:18,960 --> 00:09:23,000 Speaker 1: with parents: you love your child, but you really honestly don't 167 00:09:23,120 --> 00:09:25,280 Speaker 1: like something they did. You know, you hate the way 168 00:09:25,320 --> 00:09:29,080 Speaker 1: their crayon drawings look or something. And when 169 00:09:29,120 --> 00:09:31,640 Speaker 1: you're faced with this kind of contradiction, and of course 170 00:09:31,679 --> 00:09:35,679 Speaker 1: we're faced with these kinds of contradictions all the time, 171 00:09:35,760 --> 00:09:39,040 Speaker 1: there is a problem that arises.
What Festinger argued was 172 00:09:39,120 --> 00:09:43,760 Speaker 1: that the state of cognitive dissonance is experienced internally as 173 00:09:43,800 --> 00:09:47,720 Speaker 1: a profound stress, and people will do almost anything to 174 00:09:47,920 --> 00:09:52,440 Speaker 1: alleviate that stress. And so this remedial action to 175 00:09:52,440 --> 00:09:55,400 Speaker 1: alleviate the stress can take many forms, but it's 176 00:09:55,480 --> 00:10:00,280 Speaker 1: just finding some way to resolve the contradiction. Really, 177 00:10:00,320 --> 00:10:05,040 Speaker 1: anything that reduces the internal perception of a contradiction between 178 00:10:05,160 --> 00:10:08,439 Speaker 1: beliefs and values and actions. So, going back 179 00:10:08,440 --> 00:10:11,080 Speaker 1: to the classic example of a person who smokes cigarettes 180 00:10:11,120 --> 00:10:14,440 Speaker 1: but who is aware of the dangers of tobacco, they 181 00:10:14,480 --> 00:10:17,719 Speaker 1: have options, including: they could change their actions, 182 00:10:17,760 --> 00:10:20,400 Speaker 1: so actually quit smoking. But of course that 183 00:10:20,480 --> 00:10:24,320 Speaker 1: one's really hard, so a lot of people would instead 184 00:10:24,400 --> 00:10:27,080 Speaker 1: go for one of the other options, which is change 185 00:10:27,120 --> 00:10:31,040 Speaker 1: explicit beliefs. You can say, ah, yeah, what do these 186 00:10:31,080 --> 00:10:33,560 Speaker 1: doctors know? You know, doctors are wrong about stuff all 187 00:10:33,600 --> 00:10:36,840 Speaker 1: the time. I don't know, nobody ever really proved that 188 00:10:36,960 --> 00:10:40,640 Speaker 1: smoking causes cancer. You know, these studies, 189 00:10:40,920 --> 00:10:43,280 Speaker 1: can you really trust these studies? And on and on.
190 00:10:43,360 --> 00:10:46,280 Speaker 1: You can just say, no, I don't believe 191 00:10:46,600 --> 00:10:50,560 Speaker 1: that the risks are real. Or you could change other 192 00:10:50,640 --> 00:10:54,120 Speaker 1: types of beliefs, such as changing underlying beliefs that are 193 00:10:54,120 --> 00:10:58,000 Speaker 1: going unspoken, because if there's an internal conflict, if there's 194 00:10:58,120 --> 00:11:02,800 Speaker 1: cognitive dissonance arising over smoking, it relies on the unspoken 195 00:11:02,840 --> 00:11:05,520 Speaker 1: premise that you want to live as long and be 196 00:11:05,600 --> 00:11:09,280 Speaker 1: as healthy as possible. So you could relieve cognitive dissonance 197 00:11:09,360 --> 00:11:12,400 Speaker 1: by explicitly rejecting that belief. And you've probably heard this 198 00:11:12,480 --> 00:11:15,280 Speaker 1: before from people who say, like, yeah, I smoke, yeah, 199 00:11:15,280 --> 00:11:17,720 Speaker 1: it causes cancer, but hey, who wants to live forever? 200 00:11:18,240 --> 00:11:20,480 Speaker 1: That's also, you know, a great example of short- 201 00:11:20,600 --> 00:11:24,360 Speaker 1: term versus long-term thinking. Right, exactly. I mean, I 202 00:11:24,400 --> 00:11:26,520 Speaker 1: think there are ways of looking at things like, I mean, 203 00:11:26,559 --> 00:11:28,760 Speaker 1: on one hand, like, you know, people are free 204 00:11:28,760 --> 00:11:31,560 Speaker 1: to make decisions about their own health as they choose. 205 00:11:31,600 --> 00:11:34,120 Speaker 1: But I think there is a legitimate school of thought 206 00:11:34,120 --> 00:11:37,120 Speaker 1: that would say that making statements like that is 207 00:11:37,160 --> 00:11:41,880 Speaker 1: basically a lack of compassion for your own future self. Yeah, 208 00:11:42,400 --> 00:11:45,720 Speaker 1: but statements like that can help resolve the dissonance.
209 00:11:45,800 --> 00:11:48,440 Speaker 1: There are other things people do, too. People can 210 00:11:48,600 --> 00:11:53,040 Speaker 1: think of compensatory reasons that they would keep smoking. So 211 00:11:53,080 --> 00:11:55,520 Speaker 1: they might say, okay, I accept the fact that 212 00:11:55,600 --> 00:11:58,120 Speaker 1: smoking is bad for health. I keep doing it, but 213 00:11:58,200 --> 00:12:02,319 Speaker 1: I've got some, like, compensatory justification in my brain. I 214 00:12:02,400 --> 00:12:05,760 Speaker 1: need to smoke in order to stay focused at work, 215 00:12:06,040 --> 00:12:09,080 Speaker 1: or, like, I need to smoke in order to stay thin, 216 00:12:09,480 --> 00:12:12,800 Speaker 1: or things like that. And so people have argued about 217 00:12:12,840 --> 00:12:16,760 Speaker 1: how best to interpret cognitive dissonance theory, and they've 218 00:12:16,800 --> 00:12:18,960 Speaker 1: argued around the margins over the years. But it seems 219 00:12:18,960 --> 00:12:21,840 Speaker 1: to me like cognitive dissonance is pretty robust and 220 00:12:21,880 --> 00:12:26,600 Speaker 1: a very lasting concept from social psychology that explains 221 00:12:26,640 --> 00:12:30,000 Speaker 1: a lot of our behaviors and cognitive processes. There have 222 00:12:30,040 --> 00:12:32,679 Speaker 1: been a ton of different experiments that seem to support 223 00:12:32,720 --> 00:12:36,840 Speaker 1: the idea of cognitive dissonance reduction as a major pressure 224 00:12:37,000 --> 00:12:40,520 Speaker 1: driving our beliefs and behaviors. Just one I was reading 225 00:12:40,520 --> 00:12:43,160 Speaker 1: about is a study by Festinger and Carlsmith from 226 00:12:43,240 --> 00:12:47,000 Speaker 1: nineteen fifty nine that works something like this.
So you 227 00:12:47,040 --> 00:12:50,600 Speaker 1: have people perform something that they believe to be the 228 00:12:50,640 --> 00:12:54,800 Speaker 1: actual bulk of the experiment. It's this, like, long, repetitive, 229 00:12:54,840 --> 00:12:57,640 Speaker 1: extremely boring task. I don't remember exactly what it is. 230 00:12:57,760 --> 00:12:59,880 Speaker 1: It's like, you know, you put these pegs in holes 231 00:13:00,000 --> 00:13:04,480 Speaker 1: for an hour or something. It's mind-numbingly boring. And 232 00:13:04,520 --> 00:13:08,240 Speaker 1: then you pay the subjects after they're done with the 233 00:13:08,280 --> 00:13:11,160 Speaker 1: experiment to tell the people who are going in to 234 00:13:11,240 --> 00:13:15,080 Speaker 1: do it next that it's really fun and interesting. 235 00:13:15,120 --> 00:13:18,400 Speaker 1: So they're gonna be lying. They're gonna be openly saying 236 00:13:18,440 --> 00:13:21,480 Speaker 1: something that they know not to be true. And Festinger 237 00:13:21,520 --> 00:13:24,240 Speaker 1: and Carlsmith found something interesting, which is that if 238 00:13:24,280 --> 00:13:27,200 Speaker 1: you pay people a larger sum of money to tell 239 00:13:27,240 --> 00:13:30,960 Speaker 1: this lie, they will afterwards acknowledge it as 240 00:13:31,000 --> 00:13:32,840 Speaker 1: a lie. So, you know, you give me a hundred 241 00:13:32,840 --> 00:13:34,640 Speaker 1: bucks or whatever. I think it was twenty dollars in 242 00:13:34,679 --> 00:13:36,959 Speaker 1: the study, but that was nineteen fifties money. You give 243 00:13:37,000 --> 00:13:39,560 Speaker 1: me a hundred bucks or something, I say, like, yeah, 244 00:13:39,760 --> 00:13:41,960 Speaker 1: you know, I lied to the next guy. I 245 00:13:41,960 --> 00:13:44,120 Speaker 1: told him it was going to be really fun.
If 246 00:13:44,160 --> 00:13:46,959 Speaker 1: you pay somebody a pittance to tell the lie, 247 00:13:47,000 --> 00:13:50,560 Speaker 1: you give them just a dollar, they are more likely 248 00:13:50,640 --> 00:13:54,840 Speaker 1: afterwards to report believing that what they said was true. 249 00:13:55,400 --> 00:13:57,760 Speaker 1: So you give somebody a hundred dollars to say 250 00:13:57,840 --> 00:14:00,360 Speaker 1: putting the pegs in the holes is really fun, 251 00:14:00,800 --> 00:14:03,040 Speaker 1: they say, yeah, I was lying, but hey, I got a 252 00:14:03,080 --> 00:14:05,880 Speaker 1: payday. You pay people a dollar to say it, 253 00:14:05,960 --> 00:14:08,079 Speaker 1: and they say, actually, putting the pegs in the holes 254 00:14:08,280 --> 00:14:12,560 Speaker 1: was pretty fun. And the reasoning here is that, in 255 00:14:12,600 --> 00:14:15,520 Speaker 1: the absence of a large sum of money to internally 256 00:14:15,679 --> 00:14:19,560 Speaker 1: justify the lie, to basically relieve the cognitive dissonance and 257 00:14:19,600 --> 00:14:21,920 Speaker 1: give you a reason in your mind for having said it, 258 00:14:22,320 --> 00:14:25,120 Speaker 1: the easiest way for people to reduce cognitive dissonance is 259 00:14:25,160 --> 00:14:28,480 Speaker 1: to change their beliefs, change what they believe about what 260 00:14:28,520 --> 00:14:30,880 Speaker 1: they were doing, so that what they were saying actually 261 00:14:30,920 --> 00:14:32,920 Speaker 1: wasn't a lie. It was true. Yeah, the 262 00:14:32,920 --> 00:14:36,640 Speaker 1: pegs and the holes. It's great. Yeah, I 263 00:14:36,680 --> 00:14:39,320 Speaker 1: agree with what you said earlier.
I think this 264 00:14:39,360 --> 00:14:41,120 Speaker 1: helps to explain a lot of what goes on in 265 00:14:41,120 --> 00:14:44,720 Speaker 1: our heads, cognitive dissonance, both specifically as it applies to 266 00:14:44,800 --> 00:14:48,360 Speaker 1: contradictory opinions and beliefs that we hold at 267 00:14:48,400 --> 00:14:51,520 Speaker 1: once in our minds, as well as just more broadly, 268 00:14:51,640 --> 00:14:54,400 Speaker 1: getting at the lack of a congruent self, you know. Yeah, 269 00:14:54,440 --> 00:14:57,160 Speaker 1: because, I mean, in human life you're just gonna be 270 00:14:57,200 --> 00:15:00,080 Speaker 1: full of contradictions. I mean, there is no way a 271 00:15:00,160 --> 00:15:03,840 Speaker 1: human can be consistent all the time. You're going 272 00:15:03,880 --> 00:15:05,920 Speaker 1: to have pressures that are acting on your mind and 273 00:15:05,960 --> 00:15:09,000 Speaker 1: going in multiple directions, and most of the time 274 00:15:09,240 --> 00:15:12,080 Speaker 1: these contradictions can exist within you without you really being 275 00:15:12,120 --> 00:15:14,560 Speaker 1: aware of them. But once you become aware of them, 276 00:15:14,600 --> 00:15:17,040 Speaker 1: I am pretty convinced that, yeah, it does manifest as 277 00:15:17,080 --> 00:15:21,600 Speaker 1: this type of stress that you've got to do something about. Yeah, 278 00:15:21,640 --> 00:15:24,120 Speaker 1: like, you often see people using sort of, like, self- 279 00:15:24,280 --> 00:15:29,160 Speaker 1: defining mantras, you know, like I am this first, this second, 280 00:15:29,200 --> 00:15:31,560 Speaker 1: this third, or, you know, I am this and this 281 00:15:31,640 --> 00:15:33,400 Speaker 1: and this. Or, you know, you define yourself and 282 00:15:33,760 --> 00:15:37,560 Speaker 1: your profile on social media as being this 283 00:15:37,720 --> 00:15:39,960 Speaker 1: or that or the other.
But you know, ultimately, if 284 00:15:39,960 --> 00:15:42,960 Speaker 1: we're being honest, a lot of times it depends 285 00:15:42,960 --> 00:15:45,720 Speaker 1: on what time of day it is, when we last 286 00:15:45,800 --> 00:15:48,800 Speaker 1: had a little boost of sugar or caffeine, you know, how 287 00:15:48,840 --> 00:15:52,160 Speaker 1: tired we are, how much sunlight we've been exposed 288 00:15:52,200 --> 00:15:53,720 Speaker 1: to during the day, that sort of thing, how much 289 00:15:53,720 --> 00:15:57,840 Speaker 1: exercise we've had. Those are just some of 290 00:15:57,880 --> 00:16:01,640 Speaker 1: the factors that can influence the ranking of those little 291 00:16:01,760 --> 00:16:05,120 Speaker 1: phrases that we use to define ourselves, 292 00:16:05,120 --> 00:16:07,960 Speaker 1: and even incorporating different phrases that we might 293 00:16:08,000 --> 00:16:11,160 Speaker 1: not normally have on the list, 294 00:16:11,240 --> 00:16:13,320 Speaker 1: certainly when we're, you know, outward-facing and dealing 295 00:16:13,320 --> 00:16:15,840 Speaker 1: with other people. Yeah, a lot of 296 00:16:15,840 --> 00:16:19,040 Speaker 1: our lives are concerned with trying to create a consistent 297 00:16:19,160 --> 00:16:23,440 Speaker 1: narrative about ourselves, and in fact, our selves are just not 298 00:16:23,480 --> 00:16:26,600 Speaker 1: that consistent. Yeah. And really, neither 299 00:16:26,680 --> 00:16:28,960 Speaker 1: is our understanding of the past or memory of the 300 00:16:29,000 --> 00:16:32,080 Speaker 1: past or anything.
I mean, it's just 301 00:16:32,120 --> 00:16:34,480 Speaker 1: so ridiculous the more you unravel it. Like, we're 302 00:16:34,520 --> 00:16:37,120 Speaker 1: so obsessed with our personal narratives and where we 303 00:16:37,240 --> 00:16:40,240 Speaker 1: fit into them, when in reality, there is no past, 304 00:16:40,360 --> 00:16:42,520 Speaker 1: you know, we are creatures of the present, 305 00:16:43,280 --> 00:16:46,360 Speaker 1: traveling into the future. And yet we end up, 306 00:16:46,880 --> 00:16:51,080 Speaker 1: you know, spending all this time fretting about things that 307 00:16:51,120 --> 00:16:54,720 Speaker 1: are essentially fiction, because all we have is just 308 00:16:54,800 --> 00:16:59,400 Speaker 1: this cobbled-together false memory of what we were. This 309 00:16:59,440 --> 00:17:01,720 Speaker 1: ties back to previous episodes that we've done on the 310 00:17:01,720 --> 00:17:05,720 Speaker 1: phenomenon of fundamental attribution error, the tendency for people to 311 00:17:06,480 --> 00:17:10,960 Speaker 1: overestimate the role of, like, internal agency and character and 312 00:17:11,160 --> 00:17:15,239 Speaker 1: underestimate the role of just external situations and forces in 313 00:17:15,280 --> 00:17:18,040 Speaker 1: guiding human behavior. It turns out people are 314 00:17:18,080 --> 00:17:21,399 Speaker 1: more malleable and more changeable based on situations than we 315 00:17:21,480 --> 00:17:23,920 Speaker 1: normally like to think. We like to think in terms 316 00:17:23,960 --> 00:17:28,200 Speaker 1: of, like, you know, consistent, solid psychological storytelling, where Jon 317 00:17:28,240 --> 00:17:31,480 Speaker 1: Snow always stands for right and he just always does 318 00:17:31,520 --> 00:17:34,919 Speaker 1: what is perfectly consistent with his character, and it's explained 319 00:17:34,920 --> 00:17:37,320 Speaker 1: by who he is.
But in fact, what we do 320 00:17:37,359 --> 00:17:39,520 Speaker 1: a lot of times is just explained by what's going 321 00:17:39,560 --> 00:17:48,800 Speaker 1: on around us. But anyway, to come back to the 322 00:17:49,400 --> 00:17:53,040 Speaker 1: cognitive dissonance question, one implication of cognitive dissonance is that, 323 00:17:53,520 --> 00:17:57,520 Speaker 1: in fact, our beliefs are quite malleable. When beliefs are 324 00:17:57,600 --> 00:18:00,600 Speaker 1: dissonant with one another, it looks like, you know, 325 00:18:00,920 --> 00:18:03,080 Speaker 1: people don't want to think this about themselves, but it 326 00:18:03,080 --> 00:18:06,400 Speaker 1: seems to be true, we quite often and quite readily 327 00:18:06,480 --> 00:18:09,399 Speaker 1: just change one of our beliefs. We just believe something 328 00:18:09,440 --> 00:18:12,240 Speaker 1: different to get them in line. So anyway, the study 329 00:18:12,280 --> 00:18:15,919 Speaker 1: by Jack Brehm looked at the question of whether cognitive 330 00:18:15,960 --> 00:18:19,560 Speaker 1: dissonance might be a motivator, even when people are evaluating 331 00:18:19,800 --> 00:18:24,959 Speaker 1: their own preferences, their own personal desires, just likes and dislikes, 332 00:18:25,040 --> 00:18:28,919 Speaker 1: even with regards to very minor things like, do you 333 00:18:29,000 --> 00:18:31,880 Speaker 1: like this appliance or not? How much do you like it? 334 00:18:32,640 --> 00:18:35,840 Speaker 1: So the basics of the study: you present people 335 00:18:36,280 --> 00:18:40,359 Speaker 1: with a selection of different household items and appliances ranging 336 00:18:40,359 --> 00:18:44,159 Speaker 1: in retail value from fifteen dollars to thirty dollars, 337 00:18:44,160 --> 00:18:48,000 Speaker 1: but those were at the time nineteen fifties dollars.
And then you ask the people to rate each of these items in terms of desirability. How much would you like to own this item, on a scale of one to ten, from extremely desirable to definitely not desirable... oh, or sorry, I think I said one to ten. It's one to eight, um, so you know you really want the eights, you don't really care about the ones. Uh, and the items included things like an automatic coffee maker, an electric sandwich grill, a silk screen reproduction, an automatic toaster, a fluorescent desk lamp, a book of art reproductions, a stopwatch, and a portable radio. And so, if I'm a subject in this experiment, I go down the list, I do my ratings. I might rate the stopwatch at a three out of eight, I don't really care about that much. Maybe the sandwich grill at a five, the coffee maker at a six, etcetera. And then after I'm finished with my ratings and they're taken away, the experimenter tells me that as part of my payment for participating, I'll get to take home my choice of one of two items from the list.
But the experimenter picks what the two are. So maybe he tells me that I can take home either the toaster or the coffee maker, which I rated as equal, giving both a six, but I have to pick one. So I pick the coffee maker and I reject the toaster. Then some other conditions take place... uh, there were various other control conditions, but the experiment ends at some point with me re-rating the original objects again for desirability, without being able to refer to the ratings I had already made. And what the researchers found was, on average, if I was forced to pick between two objects, my desirability rating for the object I picked would go up, and my rating for the object I rejected would go down. So maybe I initially rated the toaster and the coffee maker both as a six. But then, if I'm forced to pick between them and I pick the coffee maker, afterwards I might rate the coffee maker as a seven and the toaster as a four, or something like that. Now, why would that be?
Yeah, this is interesting, because one of the possible examples that came to mind when I was thinking about this was to go back to a previous episode that we recorded, thinking about how, you know, back when I was younger, you had, you know, like maybe twenty bucks to blow on a CD during the course of a month, and you made your pick, you bought it, and then even if it wasn't that great, you kind of found a reason to like that album as you listened to it over and over again; you found at least one song. But in those cases, you have sunk cost in the situation, like, I spent money on it, um, in addition to time, whereas in this scenario money is not the issue. There's, like... I guess there's sort of a sunk cost in time, but we don't have that financial aspect of the scenario. Right. So the sunk cost fallacy does seem to be real. Like, we make choice-supportive, biased judgments in favor of stuff that we've already invested time and money and all that into.
But here it's just like, well, you're gonna get one or the other, which one do you want? And it seems like once you pick one out of the two, the one you didn't pick looks like junk, and the one you did pick, oh, that's pretty great. I do find this... I think this is just, you know, me thinking back on past experiences, but I feel like this with ice creams sometimes. Like, ultimately, most of the ice creams at the ice cream place, you know, they're gonna be great. I'm gonna enjoy them. They're just gonna be varying degrees of sweetness and, uh, you know, complex flavor, I guess. But I'll often find myself thinking, like, you know, afterwards... if I'm there with my family, we'll all, you know, sample each other's ice creams, and I'll generally go, yep, I made the right choice. This is the ice cream for me. Yep. Now, there could be multiple reasons for that. One reason is extremely straightforward: you just picked the one you actually wanted most. You know what your preferences are, and you acted them out.
But there could be... Yeah, there could be other things at work too. And so the underlying explanation based on cognitive dissonance for what was observed in the study, it goes something like this. When you evaluate how much you want two potential possessions, you think in a general way about the pros and cons of each: what do you like about them, what do you dislike about them? Then, if you are forced to choose between two options for which you see roughly similar amounts of pros and cons, it creates one of these mildly stressful states of cognitive dissonance. And again, that sounds funny, because, like, how could that be stressful? But it looks like this just does manifest as stress in our brains, even though it doesn't really make a lot of sense that it would. So you didn't choose the toaster, even though there are things that you like about the toaster, etcetera. And this uncomfortable state of cognitive dissonance has a name, actually, when it's applied to expensive purchases, when you've spent money on it: it's known as buyer's remorse.
Right. Okay, I need to buy a lawnmower. But, you know, what the hell do I know about lawnmowers? I can't tell one from the other. They cost a lot of money, but I need one, and I can't really tell them apart. So I'm just gonna have to pick one of these here at the store and buy it so I can cut the grass. But after you've made a purchase like this... okay, big-dollar item, you've spent a lot on it, you just picked one... people often experience a sinking feeling, this form of stress and psychological discomfort. Did I buy the right one? And you think about what might have been good about the ones you didn't buy, and you think about what might be wrong with the one you did buy. So, to eliminate the stress of this dissonance, the theory goes that your brain simply changes your beliefs. You change your beliefs about what you prefer and what you want, emphasizing the pros and de-emphasizing the cons of the option you chose, and vice versa for the option you rejected.
And it makes sense in a weird way, right? I mean, we often think that our beliefs should be these core and just fixed things about ourselves, you know, that hold even as the wind and the raging of the world move around us. But, uh, you know, from the mind's standpoint, it's like, well, uh, this is causing a problem. Let's change this circuit here, because we're getting some feedback that is not optimal for the system. Right. Um, now, I will note that, of course, as always, these results apply on average, and it's interesting to think about other ways that some people might reduce cognitive dissonance in this kind of situation without changing their original preferences, without changing their opinions about what's desirable. I think one very common adaptive strategy is internally de-emphasizing the importance of possessions, which, in fact, in reality, you know, moment to moment, reduces the cognitive dissonance that arises from making choices about possessions. Yeah, sort of realizing, well, lawnmowers don't really matter.
It doesn't... it's just a thing, and I'm going to spend a certain amount on it. It's just, I'm gonna spend what it costs. I'm gonna get whichever one is just easiest to obtain. In the smoking example, I think this is kind of... this is equivalent to the, like, "who wants to live forever?" option, but thinking about, you know, consumer items instead of your life. Like, I'm not reckless with my health, I'm just very zen about this whole smoking thing. Right. I think it makes more sense to try to do the zen path about the lawnmower and material possessions. But hey, I mean, that's hard. I mean, we shouldn't just, like, blithely say everybody should do that. It's difficult to do that, people. You're spending your money; that is your labor. You're thinking, oh God, did I get it right? Um, and the same can even manifest when you're just making a decision about what appliance you want to take home after spending an afternoon doing an experiment.
Um, and I should also note that there have been some competing explanations for this phenomenon, but it seems like cognitive dissonance is favored by the experts and supported by a lot of other experiments. Um, so, I wanted to note that in this experiment there were a couple of interesting control conditions and additional hypotheses tested that ended up not receiving support from the data, so I'm not going to get into those, but I did want to mention one control condition, the gift condition, and this provides an interesting variation on what they found. So there is some indication that owning something makes people see that thing as more desirable. So what if it was the effect of ownership of this appliance they received that made the difference, rather than reduction of cognitive dissonance arising from your choices? Well, to control for that, in this gift condition, the subject did not get to choose which item they would receive. It was just picked for them and given as a gift by the experimenter. And what Brehm found is that this control condition did not produce effects to challenge the main finding.
So it really did look like, at least from this experiment, that people's ratings were actually changed by the choices they made, and not just by feelings associated with ownership or feelings from what you're taking home. It wasn't the fact that you have the coffee maker that makes it seem better. It was the fact that you chose it. You know, this gets more complicated, and perhaps it's just looked at more in the appropriate literature surrounding it, but I'm instantly reminded of some of the advertising mechanics that I've encountered recently on YouTube, where I'm watching a show that we regularly watch, and then, instead of just being served an ad, I get served a little choice that says, which of the following ads would you like to receive most? I wonder if that is a mechanic that's playing into some of this. Oh yes, so, like, if you choose it, maybe the idea is you'll actually be, uh, less resentful of the ad and more likely to pay attention to it and listen all the way through.
Yeah, I wonder. Yeah, maybe, because, you know, the choice is never, like, um... it's never a wonderful, like, an easy choice. It's not like, do you want to see an ad for the new Star Wars TV show, or do you want to watch an insurance commercial? No, it's always like, which insurance commercial would you like to watch? Well, it's got to be the one with that really nice lady. The one with the really nice lady who's always really nice? Oh yeah. I guess I like the weird ones. Give me the... yeah, the gecko ones, the CGI gecko, yeah. Make me not realize it was for insurance and never think twice about what the product actually was. Okay. So, just to read the top line from the conclusion of Brehm's study, quote: the results supported the prediction that choosing between alternatives would create dissonance and attempts to reduce it by making the chosen alternative more desirable and the unchosen alternative less desirable. Yeah.
This reminds me of another paper, Joe, that I think you're familiar with: Love the One You're With, by Stephen Stills et al. Yep, when you know, when you're down, when you're confused, and you don't remember how you rated the items originally. Now, uh, more seriously, though, I'm not sure if this completely sticks, but instantly, looking over all this, I started thinking about the still-relevant divide over gaming systems. So back when I was a kid, like, the first major choice I think I had to make... because I was an NES kid, and then came the choice: Super NES or Sega Genesis, and then eventually later comes the PlayStation/Xbox divide, and I think that's still very much alive today. But basically, you know, one often has to make a choice of which pricey system they're going to invest in. And this also impacts certain console exclusives, right? Like, if you're a Nintendo-ite, then you're gonna get Mario, and so you can end up on Team Mario. If you're a Segatarian, then you're gonna be a follower of Saint Sonic the Hedgehog and, you know, maybe Saint Altered Beast. Xbox.
560 00:30:38,440 --> 00:30:40,600 Speaker 1: You're gonna get Gears of War. In Halo PlayStation, you're 561 00:30:40,640 --> 00:30:43,840 Speaker 1: gonna get Gods of War and or God of War 562 00:30:44,000 --> 00:30:47,800 Speaker 1: whatever it was, and the Last Office. So um. You know, 563 00:30:47,840 --> 00:30:50,040 Speaker 1: on on one level, on like a very rational level, 564 00:30:50,080 --> 00:30:53,120 Speaker 1: you engage with in some decision making. You're like, well, 565 00:30:53,680 --> 00:30:56,320 Speaker 1: uh I know this franchise as a console exclusive, I'm 566 00:30:56,320 --> 00:31:00,560 Speaker 1: gonna go this direction. But in other cases, uh I 567 00:31:00,640 --> 00:31:03,760 Speaker 1: do looking back, I do find myself having engaged in 568 00:31:03,800 --> 00:31:06,440 Speaker 1: some of that. You know, like I didn't really have 569 00:31:06,600 --> 00:31:11,120 Speaker 1: a huge opinion on the whole Mario Sonic divide, and 570 00:31:11,240 --> 00:31:14,680 Speaker 1: yet I found at times someone like today will bring 571 00:31:14,760 --> 00:31:17,320 Speaker 1: up Mario and Sonic and be like oh, well, you know, 572 00:31:17,480 --> 00:31:19,520 Speaker 1: Mario was cool, but Sonic was a bit lame. Sonic 573 00:31:19,560 --> 00:31:21,800 Speaker 1: was a bit a bit of a pu Pucci, and 574 00:31:23,000 --> 00:31:25,200 Speaker 1: I realistically have to agree with them. But I have 575 00:31:25,320 --> 00:31:29,360 Speaker 1: this impulse to defend Sonic because I was a Saga player, 576 00:31:29,400 --> 00:31:31,280 Speaker 1: because I had the Saga Genesis, and even though I 577 00:31:31,280 --> 00:31:34,680 Speaker 1: didn't really love Sonic the Hedgehog like he was, still 578 00:31:34,720 --> 00:31:36,880 Speaker 1: I was still on that team, you know. So I'm 579 00:31:36,880 --> 00:31:39,320 Speaker 1: still reeling from Sonic being a Pucci, which I think 580 00:31:39,400 --> 00:31:42,920 Speaker 1: is highly accurate. I'm sorry it is. 
No, I agree, I rationally agree with you, but I have this irrational response, this knee-jerk reaction to defend him for some reason, even though I never completed a Sonic game and ultimately don't have a real strong opinion on Sonic versus Mario. I played both of them at some point or another, uh, and I didn't particularly... you know, I don't really rationally love one more than the other. Yeah, I mean, I think ideas like which video game console you buy, that goes in the same direction as a lot of these sort of, like, consumer options that people choose between, where I think clearly both kinds of considerations are going to be feeding in. Like, there are some just genuine preference differences; like, you can look at which games you can get on each one and have a genuine desire to play one more than the other. But then there's also probably some choice-supportive bias kicking in, in how you retrospectively think about making the choice and which one you like better.
And I guess I should say that a less favored but also possibly viable explanation for this phenomenon, um, like, observed in Brehm's study is known as self-perception theory. Basically, this is an alternative to cognitive dissonance theory that comes down to the principle that people form their internal perceptions of the self by observing external actions. So, how do you decide what your preferences are? Well, you actually decide them by observing what you choose. And so, if this were the correct interpretation, this would also explain choice-induced preference change, which is what the phenomenon would come to be known as: choice-induced preference change. You make the choice, and that changes retrospectively what you think your preferences are. And Brehm's results have been replicated many times across many studies. Uh, there are some disagreements, but it appears to me to be a pretty solid conclusion that not only do our preferences influence our choices, but our choices really do influence our preferences. And this probably happens in both positive and negative directions.
So again, just like in that first study, our preferences for options that we choose increase, and our preferences for options that we reject decrease. And I think the second condition is especially interesting. It explains something that I've often observed, and it's totally true: that so many things in life are, once discarded, despised. Almost as soon as you have committed to rejecting an option, you can suddenly think of all kinds of reasons why that option was bad anyway. The cons just boil up into your brain. Yeah, this is interesting. Um, I thought about this in terms of video games, but then I think I thought of an even better example, and that is, um, the music of Metallica. So, I always try to be a polite person about things that I like and things that other people like. So, you know, if at any point someone was to come up to me and be like, hey, I'm really excited about Metallica, or, you know, I'm listening to this old Metallica album, or I'm trying out this new Metallica album, I would probably be like, oh, yeah, yeah, Metallica is cool.
But if I'm being honest, like, there was a time in my life where I was super into Metallica. I was, like, you know, discovering those albums for the first time, you know... Ride the Lightning and so forth. For me, that was like Metallica City. Yeah, yeah, it was. I think I was maybe just starting college, or maybe it was finishing high school, when I really started getting into them. But it was like, you know, everything from the Black Album and prior, I was like, this is amazing. And, um, then at some point, uh, basically I just stopped listening to them for a very long time, and then more recently I started listening to them again. And that's, like, the realistic read on it. But on some level, I do feel like when I discarded Metallica, I was like, yeah, Metallica kind of sucks. Like, those guys are jerks. Uh, their newer stuff is not any good.
You know, all these various things you kind of heap onto the pile, which is ridiculous, because, hey, I used to really like them. And then, like, current me would have to point out to then-me: you're going to like them again. There's gonna come a time when you suddenly start streaming a bunch of old Metallica albums again, and, uh, it's not going to make sense with this current rejection of them as a musical entity. This is really funny, because just this week I started listening to their first two albums again. I wonder why that happened. Is there something in common? Did this come up in a previous talk we had? I don't think we've really talked about Metallica recently. I mean, it comes up from time to time. But, metal serendipity. I find it very interesting... I love the stupid ideology of their early albums, in which there is presumed to be some kind of great conflict over the concept of metal. And one of the things that's great about early albums within a genre is that they're often very much about the genre.
So like, you know, early rock 675 00:36:49,160 --> 00:36:52,080 Speaker 1: music is all about what rocking is and instructing you 676 00:36:52,160 --> 00:36:55,640 Speaker 1: to rock. There are early rap songs that are 677 00:36:55,680 --> 00:36:58,279 Speaker 1: about rapping and about telling people how to rap. 678 00:36:58,440 --> 00:37:01,360 Speaker 1: And early metal albums are pretty much all about metal, 679 00:37:01,760 --> 00:37:05,239 Speaker 1: and Metallica's early albums are all about the concept 680 00:37:05,280 --> 00:37:07,160 Speaker 1: of metal and what it means to fight in the 681 00:37:07,200 --> 00:37:11,200 Speaker 1: metal wars. I love this. Yeah, 682 00:37:11,520 --> 00:37:13,719 Speaker 1: I really love house music as well, and of course 683 00:37:13,760 --> 00:37:16,000 Speaker 1: there's so many different types of house music to listen to, 684 00:37:16,400 --> 00:37:19,799 Speaker 1: but I still have a very warm place for house 685 00:37:19,920 --> 00:37:22,880 Speaker 1: music that informs you that this is house music. We 686 00:37:22,920 --> 00:37:25,880 Speaker 1: have a voice telling you you are listening to house music, 687 00:37:25,920 --> 00:37:27,920 Speaker 1: and I'm like, that's great. I don't get enough music 688 00:37:28,320 --> 00:37:31,480 Speaker 1: that is very explicit about the genre that I'm listening to. 689 00:37:31,760 --> 00:37:35,920 Speaker 1: That's excellent. I wonder at what age a genre mostly stops 690 00:37:36,000 --> 00:37:39,279 Speaker 1: being about the concept of itself as a genre. 691 00:37:39,400 --> 00:37:42,520 Speaker 1: Like, metal today isn't usually very much about the concept 692 00:37:42,520 --> 00:37:45,800 Speaker 1: of metal, like early thrash metal was. Yeah, I don't know. 693 00:37:45,840 --> 00:37:49,520 Speaker 1: I guess it just evolves to a certain point.
Now, 694 00:37:49,560 --> 00:37:51,440 Speaker 1: of course, in all of this, you know, not to 695 00:37:51,480 --> 00:37:53,200 Speaker 1: get too far off the point here, I think also 696 00:37:53,440 --> 00:37:56,279 Speaker 1: you have that kind of ebb and flow of nostalgia, right? 697 00:37:56,280 --> 00:37:57,800 Speaker 1: So the thing you're into, then you get out of, 698 00:37:57,800 --> 00:37:59,399 Speaker 1: and then you can reach a point where you look 699 00:37:59,440 --> 00:38:01,440 Speaker 1: back on it fondly and get back into it to at 700 00:38:01,520 --> 00:38:04,080 Speaker 1: least some degree. But it's funny because I went through 701 00:38:04,080 --> 00:38:08,160 Speaker 1: a cycle that exactly mirrors yours. Like I liked them 702 00:38:08,160 --> 00:38:11,440 Speaker 1: when I was younger, and then at some point I stopped 703 00:38:11,480 --> 00:38:13,560 Speaker 1: listening to them. It wasn't a deliberate choice. I just kind 704 00:38:13,560 --> 00:38:15,879 Speaker 1: of moved on to other things. And then I look 705 00:38:15,920 --> 00:38:17,960 Speaker 1: back on music that I used to listen to and 706 00:38:17,960 --> 00:38:21,520 Speaker 1: don't listen to anymore, and often feel this 707 00:38:21,520 --> 00:38:24,560 Speaker 1: kind of sting, this thing like, okay, I mean, 708 00:38:24,600 --> 00:38:27,160 Speaker 1: I guess what's probably very much going on is I 709 00:38:27,200 --> 00:38:30,120 Speaker 1: don't listen to it. I'm supporting that choice to not 710 00:38:30,239 --> 00:38:33,319 Speaker 1: listen to it by changing my beliefs about it and 711 00:38:33,360 --> 00:38:43,759 Speaker 1: deciding that it's dumb anyway. Now, following up from 712 00:38:43,800 --> 00:38:46,080 Speaker 1: Brehm's original study in the fifties, like I said, there 713 00:38:46,080 --> 00:38:48,600 Speaker 1: have been a bunch of replications, but there have also 714 00:38:48,640 --> 00:38:51,680 Speaker 1: been some interesting questions.
Like one study I was looking 715 00:38:51,719 --> 00:38:55,759 Speaker 1: at investigated something about the methodology of the test, so 716 00:38:55,880 --> 00:38:57,600 Speaker 1: it was trying to see if the results 717 00:38:57,600 --> 00:39:01,960 Speaker 1: stand up to challenges to Brehm's original method. And the 718 00:39:02,000 --> 00:39:04,520 Speaker 1: paper I was looking at here was by Tali Sharot, 719 00:39:04,800 --> 00:39:09,400 Speaker 1: Cristina M. Velasquez, and Raymond J. Dolan, published in Psychological 720 00:39:09,440 --> 00:39:13,440 Speaker 1: Science in two thousand ten, called Do Decisions Shape Preference? 721 00:39:13,520 --> 00:39:17,520 Speaker 1: Evidence from Blind Choice. Now this was pretty interesting. 722 00:39:17,560 --> 00:39:20,719 Speaker 1: So the authors here begin by noting some papers, all 723 00:39:20,760 --> 00:39:23,279 Speaker 1: the ones I saw were associated with a researcher named 724 00:39:23,400 --> 00:39:27,600 Speaker 1: M. Keith Chen, that noticed a potential problem with Brehm's method, 725 00:39:28,080 --> 00:39:30,879 Speaker 1: such that it could be telling us something different than 726 00:39:30,920 --> 00:39:33,680 Speaker 1: what we think it does. And the critique goes like 727 00:39:33,719 --> 00:39:37,280 Speaker 1: this, in the authors Sharot et al.'s words 728 00:39:37,280 --> 00:39:42,399 Speaker 1: here, quote, people's preferences cannot be measured perfectly and are 729 00:39:42,400 --> 00:39:47,640 Speaker 1: subject to rating noise.
Okay, true, as participants gain experience 730 00:39:47,719 --> 00:39:51,839 Speaker 1: with the rating scale, they will provide more accurate ratings, 731 00:39:51,840 --> 00:39:56,120 Speaker 1: such that post choice shifts in ratings simply reflect the 732 00:39:56,320 --> 00:40:01,120 Speaker 1: unmasking of the participants' initial preferences, which can be predicted 733 00:40:01,200 --> 00:40:05,239 Speaker 1: by their choices, rather than reflecting any changes in preference 734 00:40:05,600 --> 00:40:09,360 Speaker 1: induced by the choice. So does that make sense? Basically, 735 00:40:09,400 --> 00:40:12,240 Speaker 1: I think what they're saying is that maybe when people 736 00:40:12,360 --> 00:40:15,840 Speaker 1: change their desirability ratings of two things that are initially 737 00:40:15,880 --> 00:40:18,319 Speaker 1: similar after being forced to pick one or the other, 738 00:40:18,960 --> 00:40:22,840 Speaker 1: what's happening is not an ex post facto reevaluation of 739 00:40:22,880 --> 00:40:26,560 Speaker 1: their preferences, but people are just getting better with successive 740 00:40:26,600 --> 00:40:31,160 Speaker 1: tries at expressing their genuine, pre existing preferences on the 741 00:40:31,239 --> 00:40:34,040 Speaker 1: rating scale used in the experiment. It seems like a 742 00:40:34,080 --> 00:40:37,359 Speaker 1: reasonable critique that would be worth looking into. Yeah, yeah, 743 00:40:37,520 --> 00:40:40,080 Speaker 1: and I think we can all see examples of that, 744 00:40:40,200 --> 00:40:42,000 Speaker 1: or find examples of that, where you're just like, well, 745 00:40:42,040 --> 00:40:44,000 Speaker 1: I was trying out this one musical genre. It turns 746 00:40:44,000 --> 00:40:46,359 Speaker 1: out that's just not my thing.
Or like I think 747 00:40:46,360 --> 00:40:48,920 Speaker 1: back on video games and I'm like, yeah, I eventually 748 00:40:48,960 --> 00:40:51,919 Speaker 1: realized I'm just not good at real time strategy games. 749 00:40:51,920 --> 00:40:54,080 Speaker 1: I just don't like them as much. It's 750 00:40:54,120 --> 00:40:56,520 Speaker 1: just not my deal, right? So it would be that 751 00:40:56,600 --> 00:41:00,680 Speaker 1: the actual preferences in the beginning were what was revealed 752 00:41:00,719 --> 00:41:03,120 Speaker 1: in the second rating, and you're just getting better at 753 00:41:03,160 --> 00:41:07,240 Speaker 1: expressing them rather than changing them. So the authors 754 00:41:07,280 --> 00:41:09,759 Speaker 1: of this two thousand ten study tried to design an 755 00:41:09,760 --> 00:41:13,399 Speaker 1: experiment that couldn't be subject to that problem, and what 756 00:41:13,440 --> 00:41:15,560 Speaker 1: they came up with was what they called a blind 757 00:41:15,800 --> 00:41:19,120 Speaker 1: choice model as opposed to a free choice model. So 758 00:41:19,160 --> 00:41:21,439 Speaker 1: what's the difference? Well, in a free choice model, again, 759 00:41:21,480 --> 00:41:24,400 Speaker 1: remember, you would rate a number of options according to 760 00:41:24,400 --> 00:41:27,600 Speaker 1: your preference. Then you'd be forced to choose between some 761 00:41:27,719 --> 00:41:31,280 Speaker 1: subset of them. Then later you rate the options again. 762 00:41:31,840 --> 00:41:34,520 Speaker 1: In this study, what was different was that people didn't 763 00:41:34,600 --> 00:41:37,640 Speaker 1: know what two options from the list they were choosing 764 00:41:37,680 --> 00:41:41,799 Speaker 1: between until they had made their choice.
So you're given 765 00:41:41,840 --> 00:41:45,400 Speaker 1: a hypothetical list of vacation destinations and you rate them 766 00:41:45,440 --> 00:41:47,399 Speaker 1: in terms of how much you'd like to go there 767 00:41:47,440 --> 00:41:52,000 Speaker 1: for a vacation, so you know, Rome, Cairo, et cetera. Then, 768 00:41:52,120 --> 00:41:55,040 Speaker 1: after the initial rating task, you are asked to choose 769 00:41:55,120 --> 00:41:59,560 Speaker 1: blindly between a binary subset for a hypothetical vacation, but 770 00:41:59,640 --> 00:42:01,560 Speaker 1: you can't see what they are. You have option 771 00:42:01,640 --> 00:42:04,720 Speaker 1: A and option B, but the actual locations are hidden, 772 00:42:05,520 --> 00:42:08,520 Speaker 1: and you choose one. Once you choose between them, the 773 00:42:08,560 --> 00:42:10,960 Speaker 1: options are then revealed, so it's like, oh, it 774 00:42:11,000 --> 00:42:15,120 Speaker 1: seems you've picked Making instead of Tuscany or whatever. 775 00:42:15,160 --> 00:42:17,200 Speaker 1: And then once it's all over, you will be 776 00:42:17,239 --> 00:42:20,560 Speaker 1: asked to rate the options again. So does that 777 00:42:20,600 --> 00:42:22,839 Speaker 1: make sense? You can't see what the options are. 778 00:42:22,880 --> 00:42:25,520 Speaker 1: You're just making a choice without any information at all, 779 00:42:25,640 --> 00:42:29,000 Speaker 1: just complete blind choice. And they also included a couple 780 00:42:29,000 --> 00:42:32,640 Speaker 1: of control conditions where a computer made the decision for people, 781 00:42:32,680 --> 00:42:34,200 Speaker 1: so you don't get to make a choice at all, 782 00:42:34,920 --> 00:42:38,400 Speaker 1: to see if the perception of personal agency was important 783 00:42:38,480 --> 00:42:41,279 Speaker 1: even though the choice was made blind.
And it's not 784 00:42:41,320 --> 00:42:44,080 Speaker 1: just a case of like picking a door and then 785 00:42:44,120 --> 00:42:46,279 Speaker 1: what's behind door number three? Because at least in that 786 00:42:46,320 --> 00:42:48,880 Speaker 1: scenario you picked three. But in this, there's like a 787 00:42:49,000 --> 00:42:51,520 Speaker 1: robot game show host that, as you walk up, 788 00:42:51,520 --> 00:42:53,960 Speaker 1: just says, you're getting a toaster, right? You 789 00:42:54,080 --> 00:42:57,520 Speaker 1: get what's behind door number three. That's the difference there. 790 00:42:57,840 --> 00:43:00,720 Speaker 1: And so what did the study find? Quote, We found 791 00:43:00,719 --> 00:43:05,000 Speaker 1: that preferences were altered after participants made a blind choice, 792 00:43:05,200 --> 00:43:09,600 Speaker 1: but not when a computer instructed the participants' decision. The 793 00:43:09,640 --> 00:43:15,280 Speaker 1: results suggest that just as preferences form choices, choices shape preferences. 794 00:43:15,320 --> 00:43:19,160 Speaker 1: So this is confirming to some degree Brehm's original results. 795 00:43:19,200 --> 00:43:21,879 Speaker 1: It looks like, yes, these studies have not merely been 796 00:43:21,880 --> 00:43:25,319 Speaker 1: tracking how people get better at assigning ratings to their 797 00:43:25,320 --> 00:43:29,359 Speaker 1: pre existing preferences. What people want and prefer really does 798 00:43:29,400 --> 00:43:32,120 Speaker 1: seem to change, so that it falls in line with 799 00:43:32,200 --> 00:43:35,480 Speaker 1: what they have already chosen.
And this study also reveals 800 00:43:35,719 --> 00:43:39,920 Speaker 1: this very interesting wrinkle: choice induced preference change can happen 801 00:43:40,120 --> 00:43:43,440 Speaker 1: even when we are not making an informed choice but 802 00:43:43,600 --> 00:43:47,960 Speaker 1: just choosing randomly between two options that are temporarily hidden, 803 00:43:48,560 --> 00:43:51,120 Speaker 1: which is very interesting. So, again, if the 804 00:43:51,200 --> 00:43:54,400 Speaker 1: cognitive dissonance interpretation 805 00:43:54,440 --> 00:43:57,640 Speaker 1: of this phenomenon is correct, there's some part of us 806 00:43:57,680 --> 00:44:00,719 Speaker 1: that feels a kind of agency that needs to be 807 00:44:00,760 --> 00:44:03,920 Speaker 1: accounted for in what you chose, even if you didn't 808 00:44:03,960 --> 00:44:06,280 Speaker 1: know what you were choosing. Even if you're just choosing, 809 00:44:06,360 --> 00:44:08,680 Speaker 1: you know, hats and you can't see what's inside them, 810 00:44:08,760 --> 00:44:11,400 Speaker 1: or, yeah, door number three, you still feel like, 811 00:44:11,480 --> 00:44:16,279 Speaker 1: I picked that and I need to justify that decision internally. Huh. Yeah, 812 00:44:16,360 --> 00:44:19,680 Speaker 1: that is interesting. I'm surprised we 813 00:44:19,760 --> 00:44:24,440 Speaker 1: don't see more of this utilized in online advertising, you know, 814 00:44:24,480 --> 00:44:27,080 Speaker 1: like maybe there's a version of that YouTube scenario described 815 00:44:27,120 --> 00:44:30,040 Speaker 1: earlier where instead of giving you a choice of specific ads, 816 00:44:30,080 --> 00:44:31,600 Speaker 1: it says, what do you want, ad number one or 817 00:44:31,600 --> 00:44:35,080 Speaker 1: ad number two? And maybe it's a completely false choice.
818 00:44:35,120 --> 00:44:36,880 Speaker 1: You know you're always going to get the same ad, 819 00:44:36,920 --> 00:44:39,480 Speaker 1: but they provide this 820 00:44:39,560 --> 00:44:42,400 Speaker 1: illusion that you had a say. Well, this is interesting 821 00:44:42,440 --> 00:44:45,560 Speaker 1: because when people do not have the illusion that they 822 00:44:45,600 --> 00:44:48,440 Speaker 1: have a say, then apparently the effect does not hold, 823 00:44:48,520 --> 00:44:52,200 Speaker 1: because again, think back to the computer condition. At least 824 00:44:52,239 --> 00:44:55,680 Speaker 1: in this study, choice induced preference change only seems to 825 00:44:55,719 --> 00:44:59,520 Speaker 1: apply if you think it's really you making the choice, 826 00:44:59,560 --> 00:45:02,439 Speaker 1: not if someone or something else chooses for you. 827 00:45:02,719 --> 00:45:05,160 Speaker 1: And this mirrors what Brehm found in the gift condition. 828 00:45:05,239 --> 00:45:08,000 Speaker 1: If you're given three options and then the computer says, okay, 829 00:45:08,000 --> 00:45:12,040 Speaker 1: of these three options, you get number three, 830 00:45:12,080 --> 00:45:16,440 Speaker 1: you don't change your evaluations afterwards. Now, to come 831 00:45:16,440 --> 00:45:20,080 Speaker 1: back to the Black Mirror Bandersnatch episode, which again is 832 00:45:20,120 --> 00:45:23,200 Speaker 1: a choose your own adventure type episode where you make 833 00:45:23,280 --> 00:45:26,320 Speaker 1: choices as you watch it on Netflix. I remember 834 00:45:26,480 --> 00:45:28,920 Speaker 1: when I rewatched it last year for our episode, I 835 00:45:29,000 --> 00:45:31,000 Speaker 1: ended up being really pleased with the way it came 836 00:45:31,040 --> 00:45:34,839 Speaker 1: together based on my choices. But why 837 00:45:34,880 --> 00:45:37,000 Speaker 1: was it?
Was it because I actually hit on a good combo 838 00:45:37,160 --> 00:45:40,320 Speaker 1: of narrative branches in this choose your own adventure world? 839 00:45:41,160 --> 00:45:43,320 Speaker 1: Or was it, you know, because to a 840 00:45:43,400 --> 00:45:47,239 Speaker 1: certain extent there are aspects of all of this, 841 00:45:47,360 --> 00:45:49,560 Speaker 1: as in the blind test, where you don't necessarily 842 00:45:49,600 --> 00:45:52,280 Speaker 1: know how the choices you make will impact the overall 843 00:45:52,320 --> 00:45:54,719 Speaker 1: shape of the narrative by the time you're done with it. 844 00:45:55,040 --> 00:46:00,120 Speaker 1: That's a really good comparison. I mean, 845 00:46:00,120 --> 00:46:03,040 Speaker 1: thinking about how I interacted with Bandersnatch or 846 00:46:03,080 --> 00:46:05,279 Speaker 1: with Choose Your Own Adventure books when I was a kid, 847 00:46:06,120 --> 00:46:09,120 Speaker 1: it's funny how we feel some amount of angst and 848 00:46:09,280 --> 00:46:12,759 Speaker 1: personal accountability, or at least I did, for how the 849 00:46:12,800 --> 00:46:15,879 Speaker 1: Bandersnatch or Choose Your Own Adventure choices turn out, 850 00:46:16,600 --> 00:46:19,200 Speaker 1: even though there's usually no way you could have predicted 851 00:46:20,200 --> 00:46:22,720 Speaker 1: the ways that they will actually play out in narrative.
852 00:46:23,200 --> 00:46:27,000 Speaker 1: Merely the suggestion that you're in control seems to be 853 00:46:27,160 --> 00:46:30,360 Speaker 1: enough to conjure the shadow of personal agency over the 854 00:46:30,360 --> 00:46:33,160 Speaker 1: direction of the narrative, and thus, I think, enough to 855 00:46:33,400 --> 00:46:36,480 Speaker 1: bring in the feeling of cognitive dissonance when you choose 856 00:46:36,520 --> 00:46:38,800 Speaker 1: a path that goes somewhere you don't like or that 857 00:46:38,920 --> 00:46:43,760 Speaker 1: feels bad or increases the tension. Yeah. So there's another study, 858 00:46:43,840 --> 00:46:46,360 Speaker 1: an older study, that I wanted to mention briefly, 859 00:46:46,560 --> 00:46:49,000 Speaker 1: and this one is from the year two thousand ten 860 00:46:49,480 --> 00:46:53,640 Speaker 1: that looks at choice induced preference change in children and 861 00:46:53,760 --> 00:46:56,319 Speaker 1: non human animals. And I thought that this was very 862 00:46:56,360 --> 00:46:59,680 Speaker 1: interesting because this seems to get at something you could wonder about: 863 00:46:59,760 --> 00:47:03,080 Speaker 1: okay, so it seems like this choice induced preference change thing, 864 00:47:03,120 --> 00:47:05,320 Speaker 1: it really does go on. But is this a function 865 00:47:05,400 --> 00:47:09,640 Speaker 1: of, like, adult cognition, you know, adult pictures of 866 00:47:09,680 --> 00:47:12,520 Speaker 1: the self? Or would this happen at a more primal 867 00:47:12,640 --> 00:47:15,240 Speaker 1: level that you would see even in, you know, 868 00:47:15,280 --> 00:47:18,520 Speaker 1: even in four year old children and in monkeys and stuff? 869 00:47:18,560 --> 00:47:21,280 Speaker 1: And it looks like the answer is basically yes, 870 00:47:21,360 --> 00:47:24,400 Speaker 1: you do see this even in four year old children 871 00:47:24,480 --> 00:47:28,040 Speaker 1: and capuchin monkeys.
Now you might wonder 872 00:47:28,400 --> 00:47:30,759 Speaker 1: how you could create the test conditions there, because you 873 00:47:30,800 --> 00:47:33,359 Speaker 1: can't, like, ask them to rate a list 874 00:47:33,360 --> 00:47:37,520 Speaker 1: of appliances or something. Right. So the study design 875 00:47:37,560 --> 00:47:40,480 Speaker 1: here for human children is kind of complicated to explain, 876 00:47:40,560 --> 00:47:42,160 Speaker 1: but once I read it, I thought it was actually 877 00:47:42,280 --> 00:47:44,600 Speaker 1: very elegant and ingenious. So if you don't mind, I 878 00:47:44,640 --> 00:47:47,279 Speaker 1: just want to read their description of their experimental set 879 00:47:47,320 --> 00:47:49,359 Speaker 1: up here. Oh, and sorry, I don't think I said 880 00:47:49,400 --> 00:47:52,560 Speaker 1: that this paper is by Louisa C. Egan, 881 00:47:52,719 --> 00:47:56,000 Speaker 1: Paul Bloom, and Laurie R. Santos in the Journal of 882 00:47:56,040 --> 00:47:59,640 Speaker 1: Experimental Social Psychology in two thousand ten. So to 883 00:47:59,719 --> 00:48:04,160 Speaker 1: read from their methodology, with the test condition involving 884 00:48:04,200 --> 00:48:08,759 Speaker 1: human children, quote, the experimenter first displayed an opaque 885 00:48:08,800 --> 00:48:13,600 Speaker 1: gray stocking to the child and sequentially extracted three toys, 886 00:48:13,680 --> 00:48:17,399 Speaker 1: described as some of the experimenter's favorite things, which were 887 00:48:17,680 --> 00:48:20,840 Speaker 1: really fun, but you have to be creative with them. 888 00:48:20,880 --> 00:48:24,319 Speaker 1: The toys distended the stockings such that the contours of 889 00:48:24,320 --> 00:48:27,560 Speaker 1: each could be seen, but the color could not be discerned.
890 00:48:28,120 --> 00:48:31,279 Speaker 1: The experimenter extracted and displayed the three toys to 891 00:48:31,320 --> 00:48:34,560 Speaker 1: the child, described them as some of her favorite things, 892 00:48:34,880 --> 00:48:38,000 Speaker 1: then shuffled them as she lifted them behind an occluder 893 00:48:38,320 --> 00:48:41,600 Speaker 1: and announced that she would hide the toys. She removed 894 00:48:41,640 --> 00:48:46,160 Speaker 1: the occluder to display two stockings, one dotted and one argyle. 895 00:48:46,800 --> 00:48:49,640 Speaker 1: The experimenter pointed out that the outlines of two 896 00:48:49,680 --> 00:48:52,600 Speaker 1: toys were visible within one of the stockings, and that 897 00:48:52,719 --> 00:48:55,239 Speaker 1: the outline of the third toy was visible in the 898 00:48:55,280 --> 00:48:58,920 Speaker 1: second stocking. In the choice condition, the experimenter held 899 00:48:59,000 --> 00:49:01,680 Speaker 1: up the stocking with two toys and asked the child 900 00:49:01,719 --> 00:49:05,239 Speaker 1: to reach in without peeking and choose a toy. In 901 00:49:05,320 --> 00:49:08,640 Speaker 1: the no choice condition, the experimenter reached into the 902 00:49:08,680 --> 00:49:11,839 Speaker 1: stocking with two toys, pulled one closer to the mouth 903 00:49:11,920 --> 00:49:14,920 Speaker 1: of the stocking, held up the stocking, and asked the 904 00:49:15,000 --> 00:49:18,440 Speaker 1: child to remove the toy on top, again without peeking. 905 00:49:19,160 --> 00:49:22,640 Speaker 1: In phase two, a second experimenter, blind to which 906 00:49:22,640 --> 00:49:26,960 Speaker 1: stocking originally contained two toys, indicated the two stockings and 907 00:49:27,080 --> 00:49:29,399 Speaker 1: asked the child to choose a toy to play with. 908 00:49:29,920 --> 00:49:33,759 Speaker 1: Children were instructed not to peek before making their selection.
909 00:49:34,560 --> 00:49:37,840 Speaker 1: So what were the results here? Well, in the choice condition, 910 00:49:38,239 --> 00:49:42,400 Speaker 1: children strongly preferred the toy in the second stocking, meaning 911 00:49:42,520 --> 00:49:44,880 Speaker 1: the toy that they had not had a chance to 912 00:49:45,000 --> 00:49:49,120 Speaker 1: reject from the first stocking. And they preferred it strongly: 913 00:49:49,360 --> 00:49:52,360 Speaker 1: sixty six point seven percent of children in the choice 914 00:49:52,400 --> 00:49:55,440 Speaker 1: condition went for the new toy in the second stocking 915 00:49:55,560 --> 00:49:58,520 Speaker 1: instead of the one that they hadn't grabbed from 916 00:49:58,560 --> 00:50:02,720 Speaker 1: the first stocking. But in the no choice condition, remember, 917 00:50:02,760 --> 00:50:04,640 Speaker 1: this is the one where the experimenter picks for 918 00:50:04,840 --> 00:50:07,319 Speaker 1: the kid, the kid doesn't get to pick themselves, the 919 00:50:07,360 --> 00:50:11,160 Speaker 1: effect vanished. In fact, in the condition where the toy 920 00:50:11,320 --> 00:50:14,560 Speaker 1: was chosen for them, kids did the opposite, with the 921 00:50:14,600 --> 00:50:18,080 Speaker 1: majority wanting to reach into the first stocking again and 922 00:50:18,160 --> 00:50:22,040 Speaker 1: get the other toy. And remember that this is despite 923 00:50:22,080 --> 00:50:25,960 Speaker 1: them fishing the toys out at random. And there 924 00:50:26,040 --> 00:50:28,520 Speaker 1: was also a similar test on capuchin monkeys. I'm not 925 00:50:28,520 --> 00:50:30,160 Speaker 1: going to go into as much detail, but that one 926 00:50:30,200 --> 00:50:33,319 Speaker 1: involved Skittles instead of toys, and it found the 927 00:50:33,400 --> 00:50:36,520 Speaker 1: same thing.
When monkeys were tricked into believing that they 928 00:50:36,520 --> 00:50:40,120 Speaker 1: had a choice between two initial candies, and then they 929 00:50:40,160 --> 00:50:44,040 Speaker 1: were given the option to choose between the previously rejected 930 00:50:44,080 --> 00:50:47,200 Speaker 1: candy of the first two and a new third alternative, 931 00:50:47,680 --> 00:50:51,680 Speaker 1: they overwhelmingly preferred the new alternative instead of the one 932 00:50:51,760 --> 00:50:54,920 Speaker 1: that they had not chosen in the previous choice. So again, 933 00:50:54,960 --> 00:50:59,279 Speaker 1: it looks kind of like once discarded, now despised. But 934 00:50:59,520 --> 00:51:01,880 Speaker 1: as with human children, this was only true if the 935 00:51:01,880 --> 00:51:04,640 Speaker 1: monkeys were made to think they had a free choice 936 00:51:04,760 --> 00:51:07,840 Speaker 1: between the first two. If the choice was clearly made 937 00:51:08,040 --> 00:51:10,560 Speaker 1: for them and they didn't get to pick, they no 938 00:51:10,600 --> 00:51:13,600 Speaker 1: longer seemed to devalue the other option from the first 939 00:51:13,680 --> 00:51:17,640 Speaker 1: pair of candies. That's very interesting to me. 940 00:51:17,680 --> 00:51:21,880 Speaker 1: And it's interesting that if this manifests in children and monkeys, 941 00:51:21,920 --> 00:51:26,520 Speaker 1: it seems like choice induced preference change obviously doesn't depend 942 00:51:26,600 --> 00:51:29,520 Speaker 1: on any sort of, like, adult sense of self image 943 00:51:29,600 --> 00:51:33,880 Speaker 1: or sophisticated logical reasoning. Based on this study, if 944 00:51:33,960 --> 00:51:36,359 Speaker 1: this holds up, it appears that our choices may 945 00:51:36,480 --> 00:51:41,000 Speaker 1: influence our preferences at a fairly primal level.
And I 946 00:51:41,000 --> 00:51:43,880 Speaker 1: want to read from a section of their conclusion 947 00:51:43,880 --> 00:51:45,600 Speaker 1: that picks up on one of the things I noted 948 00:51:45,600 --> 00:51:50,920 Speaker 1: about the children's no choice condition. So, quote, Curiously, we 949 00:51:51,000 --> 00:51:54,200 Speaker 1: observed a marginally significant effect in which children in the 950 00:51:54,280 --> 00:51:56,839 Speaker 1: no choice condition, remember, this is the one where they didn't 951 00:51:56,840 --> 00:51:59,200 Speaker 1: get to pick, the experimenter picked for them out 952 00:51:59,200 --> 00:52:02,440 Speaker 1: of the first stocking, preferred the toy that the 953 00:52:02,480 --> 00:52:06,000 Speaker 1: experimenter did not give them. Although we had originally 954 00:52:06,120 --> 00:52:09,440 Speaker 1: hypothesized that children would be at chance on this condition, 955 00:52:09,719 --> 00:52:13,160 Speaker 1: the observed pattern of performance hints that children's preferences may 956 00:52:13,239 --> 00:52:16,640 Speaker 1: change not merely because of their choices, but also because 957 00:52:16,680 --> 00:52:21,320 Speaker 1: of their lack of choices. Consistent with Brehm's nineteen sixty 958 00:52:21,400 --> 00:52:24,719 Speaker 1: six reactance theory and Brehm and Weintraub's research on 959 00:52:24,840 --> 00:52:29,280 Speaker 1: reactance in two year olds, children's preferences may reflect psychological 960 00:52:29,440 --> 00:52:34,799 Speaker 1: reactance when choice freedom is denied. So the possibility 961 00:52:35,080 --> 00:52:37,560 Speaker 1: here is that the effect is not only not 962 00:52:37,800 --> 00:52:41,480 Speaker 1: present when you perceive somebody else denying you a free choice, 963 00:52:41,520 --> 00:52:45,279 Speaker 1: there could be a reverse effect.
Once one of two 964 00:52:45,360 --> 00:52:48,960 Speaker 1: options is denied you by an outside force, the denied 965 00:52:49,000 --> 00:52:53,160 Speaker 1: option is not only not despised, it's coveted. You want 966 00:52:53,239 --> 00:52:56,239 Speaker 1: that thing that you were told you couldn't have. Yeah, 967 00:52:56,239 --> 00:52:58,040 Speaker 1: I imagine a lot of us can 968 00:52:58,320 --> 00:53:02,000 Speaker 1: remember childhood examples of this, you know, like the 969 00:53:02,840 --> 00:53:06,200 Speaker 1: toy you were not permitted to have, the book 970 00:53:06,239 --> 00:53:10,440 Speaker 1: that was denied to you, that sort of thing. Yeah. Now, 971 00:53:10,480 --> 00:53:12,440 Speaker 1: of course, there are always gonna be reasons for this 972 00:53:12,560 --> 00:53:14,399 Speaker 1: that make it make sense in your brain, like there 973 00:53:14,440 --> 00:53:17,280 Speaker 1: are intrinsic qualities to that toy or that book or something 974 00:53:17,320 --> 00:53:19,640 Speaker 1: that seem like that's why I really wanted it. But it 975 00:53:19,680 --> 00:53:23,879 Speaker 1: seems like even among toys that are identical, there 976 00:53:24,000 --> 00:53:28,000 Speaker 1: is this preference that arises. It seems like 977 00:53:28,400 --> 00:53:31,120 Speaker 1: if we had the option to pick something and 978 00:53:31,120 --> 00:53:34,080 Speaker 1: we didn't pick it, afterwards it becomes far less 979 00:53:34,120 --> 00:53:35,920 Speaker 1: interesting to us. We don't really want it at all.
980 00:53:36,200 --> 00:53:39,240 Speaker 1: But if we were presented with something as a possible 981 00:53:39,239 --> 00:53:42,600 Speaker 1: option and were not given the opportunity to get it, 982 00:53:42,680 --> 00:53:45,920 Speaker 1: then we really want it. So anyway, I was looking around 983 00:53:45,920 --> 00:53:49,920 Speaker 1: for some challenges to the choice induced preference 984 00:53:50,040 --> 00:53:52,400 Speaker 1: change phenomenon, and I was trying to find if there 985 00:53:52,400 --> 00:53:55,080 Speaker 1: are any studies that found the opposite. There are a few. 986 00:53:55,360 --> 00:53:59,120 Speaker 1: For example, I found this paper which criticizes the interpretation 987 00:53:59,160 --> 00:54:03,480 Speaker 1: of Brehm's original findings and the replications, and it 988 00:54:03,520 --> 00:54:06,400 Speaker 1: attempts a modified replication of its own. So this was 989 00:54:06,440 --> 00:54:11,720 Speaker 1: by Steinar Holden, published in the Journal of Applied Social Psychology: 990 00:54:11,880 --> 00:54:15,759 Speaker 1: Do Choices Affect Preferences? Some Doubts and New Evidence. And 991 00:54:15,800 --> 00:54:18,600 Speaker 1: the author here says, quote, I find no evidence of 992 00:54:18,680 --> 00:54:22,560 Speaker 1: choice induced changes in preferences after a choice between items 993 00:54:22,880 --> 00:54:25,600 Speaker 1: where one was viewed as more attractive than the other, 994 00:54:26,040 --> 00:54:29,799 Speaker 1: but potentially some weak evidence of changes in preferences after 995 00:54:29,880 --> 00:54:33,920 Speaker 1: a choice between items viewed as equally attractive. So that's 996 00:54:33,920 --> 00:54:37,080 Speaker 1: worth keeping in mind.
There are some challenges to this phenomenon 997 00:54:37,160 --> 00:54:39,640 Speaker 1: and to its robustness, though this does appear to be 998 00:54:39,680 --> 00:54:43,280 Speaker 1: a minority finding, and in fact it doesn't fully contradict 999 00:54:43,320 --> 00:54:46,520 Speaker 1: the other results, it only partially contradicts them. But 1000 00:54:46,560 --> 00:54:48,880 Speaker 1: then finally I wanted to get to one last study 1001 00:54:48,920 --> 00:54:50,360 Speaker 1: I was reading. This was actually the one I was 1002 00:54:50,400 --> 00:54:52,320 Speaker 1: reading about that made me want to do this episode 1003 00:54:52,320 --> 00:54:54,600 Speaker 1: in the first place. It's a very recent study on 1004 00:54:54,760 --> 00:54:59,080 Speaker 1: choice-induced preference change, this time in human babies, in 1005 00:54:59,239 --> 00:55:03,120 Speaker 1: pre-verbal human infants, published just this year. So this 1006 00:55:03,200 --> 00:55:07,440 Speaker 1: is by Alex M. Silver, Aimee E. Stahl, Rita Loiotile, 1007 00:55:07,960 --> 00:55:13,240 Speaker 1: Alexis S. Smith-Flores, and Lisa Feigenson: When Not Choosing 1008 00:55:13,320 --> 00:55:16,840 Speaker 1: Leads to Not Liking: Choice-Induced Preference in Infancy, published 1009 00:55:16,880 --> 00:55:20,080 Speaker 1: in Psychological Science this year. Some of the authors were 1010 00:55:20,120 --> 00:55:24,080 Speaker 1: affiliated with Johns Hopkins University, the University of Pittsburgh, and 1011 00:55:24,160 --> 00:55:27,200 Speaker 1: the College of New Jersey.
And again they tested for 1012 00:55:27,360 --> 00:55:31,400 Speaker 1: choice-induced preference change in pre-verbal infants across seven 1013 00:55:31,440 --> 00:55:35,920 Speaker 1: studies, with a methodology that's somewhat similar to one 1014 00:55:35,920 --> 00:55:38,200 Speaker 1: of the ones we looked at earlier, the ones 1015 00:55:38,440 --> 00:55:42,160 Speaker 1: testing with four-year-olds and capuchin monkeys. And 1016 00:55:42,280 --> 00:55:45,160 Speaker 1: from their conclusion and discussion, they say, quote, our 1017 00:55:45,200 --> 00:55:48,800 Speaker 1: findings suggest that choice-induced preference change does not require 1018 00:55:48,880 --> 00:55:52,960 Speaker 1: extensive experience making choices, nor does it rely on advanced 1019 00:55:53,040 --> 00:55:57,640 Speaker 1: metacognitive ability or a developed sense of self, because they found 1020 00:55:57,680 --> 00:56:01,399 Speaker 1: this in pre-verbal infants. If pre-verbal infants are 1021 00:56:01,840 --> 00:56:05,239 Speaker 1: changing their preferences based on what they've chosen, it 1022 00:56:05,280 --> 00:56:08,120 Speaker 1: seems like it really would not require any of those things. 1023 00:56:08,120 --> 00:56:11,960 Speaker 1: It's happening at some lower level in the brain. And 1024 00:56:12,040 --> 00:56:16,720 Speaker 1: it also raises interesting questions about how preferences get formed 1025 00:56:17,080 --> 00:56:21,040 Speaker 1: very early in life, if they might stem from choices 1026 00:56:21,120 --> 00:56:25,360 Speaker 1: made at random in some sense when you're a baby. 1027 00:56:25,400 --> 00:56:28,400 Speaker 1: Like they say, quote, our findings add to our understanding 1028 00:56:28,400 --> 00:56:31,240 Speaker 1: of the role of choice in infancy, showing that infants 1029 00:56:31,360 --> 00:56:34,439 Speaker 1: use their own choices to shape their preferences.
This work 1030 00:56:34,560 --> 00:56:37,640 Speaker 1: raises the question of whether other aspects of the psychology 1031 00:56:37,640 --> 00:56:41,879 Speaker 1: of decision making also have their roots in very early life. So, yeah, 1032 00:56:41,960 --> 00:56:44,160 Speaker 1: that does make me wonder if, like, there are things 1033 00:56:44,200 --> 00:56:47,440 Speaker 1: that adults are still carrying, trying to keep a consistent 1034 00:56:47,520 --> 00:56:51,600 Speaker 1: narrative about their preferences, their likes and dislikes, that may 1035 00:56:51,680 --> 00:56:54,640 Speaker 1: have emerged at some point when 1036 00:56:54,680 --> 00:56:58,800 Speaker 1: they made some basically random decision as a pre-verbal infant. 1037 00:56:59,040 --> 00:57:02,680 Speaker 1: Isn't that weird? That is weird. Yeah, it's like 1038 00:57:02,719 --> 00:57:04,440 Speaker 1: you don't want to dwell on the past and to 1039 00:57:04,560 --> 00:57:06,840 Speaker 1: think that choices in your past define you. 1040 00:57:06,840 --> 00:57:08,920 Speaker 1: But what if those are baby choices? What if it 1041 00:57:08,960 --> 00:57:11,879 Speaker 1: all treads down to baby choices? Right? What if things 1042 00:57:11,920 --> 00:57:14,239 Speaker 1: that you think of as fundamental to your, you know, 1043 00:57:14,320 --> 00:57:19,040 Speaker 1: your own idiosyncrasies, your view of yourself, are rooted 1044 00:57:19,080 --> 00:57:22,680 Speaker 1: in you just trying to stay consistent with something that 1045 00:57:22,800 --> 00:57:26,560 Speaker 1: happened when you were two? Yeah, or one, even. I 1046 00:57:26,560 --> 00:57:30,600 Speaker 1: mean, you know, I picked the yellow block 1047 00:57:30,640 --> 00:57:34,120 Speaker 1: instead of the red block, and ever since then, 1048 00:57:34,200 --> 00:57:40,960 Speaker 1: yellow has been my preferred color.
Interesting. I had 1049 00:57:40,960 --> 00:57:42,360 Speaker 1: a scenario in my head. 1050 00:57:42,520 --> 00:57:46,840 Speaker 1: I don't think this one necessarily applies, but perhaps you have 1051 00:57:48,040 --> 00:57:49,480 Speaker 1: an opinion on it based on what we've 1052 00:57:49,520 --> 00:57:53,880 Speaker 1: discussed so far. In the movie A Christmas Story, the 1053 00:57:53,920 --> 00:57:57,040 Speaker 1: old man receives a major award, which of course turns 1054 00:57:57,040 --> 00:58:01,080 Speaker 1: out to be a lamp that looks like a woman's leg. 1055 00:58:01,120 --> 00:58:08,720 Speaker 1: How would you interpret his attachment to the major award? Well, 1056 00:58:08,800 --> 00:58:11,320 Speaker 1: clearly he is suffering from a kind of preference 1057 00:58:11,400 --> 00:58:15,520 Speaker 1: bias about the major award. That's like a self-flattering 1058 00:58:15,560 --> 00:58:17,480 Speaker 1: bias of some kind. I'm not sure how best to 1059 00:58:17,520 --> 00:58:20,200 Speaker 1: categorize it. I don't think it would be choice-induced 1060 00:58:20,240 --> 00:58:22,640 Speaker 1: preference change, because he didn't pick the leg lamp. It 1061 00:58:22,720 --> 00:58:25,240 Speaker 1: was picked for him, and the studies have shown that 1062 00:58:25,280 --> 00:58:28,720 Speaker 1: when things are picked for you, this effect does not manifest. 1063 00:58:29,080 --> 00:58:32,240 Speaker 1: But I think he's doing a different kind of thing, 1064 00:58:32,240 --> 00:58:35,680 Speaker 1: which is: the leg lamp is a symbol of 1065 00:58:35,840 --> 00:58:38,919 Speaker 1: his intellectual prowess and victory, and thus the leg lamp 1066 00:58:39,080 --> 00:58:42,760 Speaker 1: is itself beautiful and good. Yes, all right.
And then 1067 00:58:42,760 --> 00:58:46,520 Speaker 1: of course there's the added wrinkle that his wife does 1068 00:58:46,560 --> 00:58:49,000 Speaker 1: not like the award and does not think it should 1069 00:58:49,000 --> 00:58:51,480 Speaker 1: be in the front of the house, which he regards 1070 00:58:51,480 --> 00:58:54,840 Speaker 1: as a personal insult, because he has so deeply associated 1071 00:58:54,880 --> 00:59:01,440 Speaker 1: this lamp with his personal intellectual abilities, his mind power. 1072 00:59:01,480 --> 00:59:04,200 Speaker 1: This all also made me think of another great work, 1073 00:59:04,360 --> 00:59:08,200 Speaker 1: that would be Paradise Lost by Milton. 1074 00:59:08,800 --> 00:59:11,600 Speaker 1: We have that line from Satan: the mind is its 1075 00:59:11,640 --> 00:59:14,560 Speaker 1: own place, and in itself can make a heaven of 1076 00:59:14,640 --> 00:59:19,680 Speaker 1: hell, a hell of heaven. Yeah, that's really 1077 00:59:19,680 --> 00:59:22,880 Speaker 1: good, because I've never interpreted this line in that way, 1078 00:59:22,920 --> 00:59:26,439 Speaker 1: as like a reflection of an ex post facto justification 1079 00:59:26,480 --> 00:59:29,880 Speaker 1: to reduce cognitive dissonance. But you could absolutely read it 1080 00:59:29,920 --> 00:59:32,720 Speaker 1: that way. You can totally see it like that. I mean, 1081 00:59:32,800 --> 00:59:36,920 Speaker 1: I've always interpreted it, I guess, as, you know, 1082 00:59:37,040 --> 00:59:39,240 Speaker 1: just a statement about power, like 1083 00:59:39,440 --> 00:59:42,080 Speaker 1: Satan is asserting that he can make what he 1084 00:59:42,120 --> 00:59:45,320 Speaker 1: will of any situation.
But yeah, you could interpret that 1085 00:59:45,400 --> 00:59:50,360 Speaker 1: much more in a cognitive bias way, where he's saying, like, well, 1086 00:59:50,400 --> 00:59:53,280 Speaker 1: you know, I made my decisions, and my decisions led 1087 00:59:53,320 --> 00:59:56,280 Speaker 1: me to hell, and thus I will engage in choice-supportive 1088 00:59:56,320 --> 00:59:59,520 Speaker 1: biased reasoning that makes me think, actually, hell 1089 00:59:59,640 --> 01:00:03,680 Speaker 1: is good. It's good, you know. And 1090 01:00:03,720 --> 01:00:08,680 Speaker 1: that reduces the cognitive dissonance within Satan's soul. What if 1091 01:00:08,720 --> 01:00:11,160 Speaker 1: you just had a vision of hell where 1092 01:00:11,200 --> 01:00:13,320 Speaker 1: people are just all sitting around, you know, 1093 01:00:13,760 --> 01:00:16,080 Speaker 1: being tortured or torturing each other, and it's like, this place 1094 01:00:16,160 --> 01:00:22,720 Speaker 1: is great, this is great. Yeah, yeah, 1095 01:00:23,160 --> 01:00:26,240 Speaker 1: I think that's a fantastic image to end with. Yeah, we 1096 01:00:26,320 --> 01:00:29,640 Speaker 1: ended by justifying the ways of God to man, so it's 1097 01:00:29,640 --> 01:00:33,040 Speaker 1: generally what we seek to do in this podcast. Wait, no, 1098 01:00:33,160 --> 01:00:35,439 Speaker 1: aren't we justifying the ways of Satan to man? I think 1099 01:00:35,440 --> 01:00:37,360 Speaker 1: that's what we did. Oh yeah, I guess that's 1100 01:00:37,400 --> 01:00:41,800 Speaker 1: what we're doing here. Yeah, even better. Yeah, a lesser goal, 1101 01:00:41,920 --> 01:00:44,920 Speaker 1: a lower goal. All right, well, we'll go ahead and close 1102 01:00:45,000 --> 01:00:46,120 Speaker 1: this one out.
I think this will be a fun 1103 01:00:46,200 --> 01:00:49,040 Speaker 1: one for listeners to reflect on, especially since I think 1104 01:00:49,040 --> 01:00:51,760 Speaker 1: we actually had a stocking-based experiment there. Maybe you 1105 01:00:51,760 --> 01:00:55,280 Speaker 1: can reflect on gift giving and stockings 1106 01:00:55,320 --> 01:00:59,200 Speaker 1: and so forth with the holiday season that we're 1107 01:00:59,240 --> 01:01:02,600 Speaker 1: passing through at the moment. Certainly everybody can relate 1108 01:01:02,640 --> 01:01:06,520 Speaker 1: on some level to some of the mental mechanics that 1109 01:01:06,560 --> 01:01:09,440 Speaker 1: we're discussing here in this episode. In the meantime, if 1110 01:01:09,440 --> 01:01:11,200 Speaker 1: you would like to listen to other episodes of Stuff 1111 01:01:11,200 --> 01:01:12,880 Speaker 1: to Blow Your Mind, you'll find them wherever you get 1112 01:01:12,880 --> 01:01:15,120 Speaker 1: your podcasts, and wherever that happens to be, just rate, 1113 01:01:15,160 --> 01:01:17,320 Speaker 1: review and subscribe. That helps us out. If you want 1114 01:01:17,320 --> 01:01:18,760 Speaker 1: to go to Stuff to Blow Your Mind dot com, 1115 01:01:19,040 --> 01:01:21,200 Speaker 1: that will take you over to the iHeart listing 1116 01:01:21,280 --> 01:01:23,120 Speaker 1: for this show, and there's a place you can click 1117 01:01:23,160 --> 01:01:25,160 Speaker 1: on there for our store, if you wanted to get 1118 01:01:25,200 --> 01:01:27,320 Speaker 1: a shirt or a sticker or something with our logo 1119 01:01:27,840 --> 01:01:30,160 Speaker 1: or a monster on it. I believe by the time 1120 01:01:30,160 --> 01:01:31,480 Speaker 1: you listen to this there should be a couple of 1121 01:01:31,480 --> 01:01:34,720 Speaker 1: different user-created designs that are pretty cool.
1122 01:01:34,720 --> 01:01:37,680 Speaker 1: Huge thanks, as always, to our excellent audio producer, Seth Nicholas Johnson. 1123 01:01:37,960 --> 01:01:39,480 Speaker 1: If you would like to get in touch with us 1124 01:01:39,480 --> 01:01:41,840 Speaker 1: with feedback on this episode or any other, to suggest 1125 01:01:41,840 --> 01:01:43,800 Speaker 1: a topic for the future, or just to say hi, you 1126 01:01:43,800 --> 01:01:46,520 Speaker 1: can email us at contact at Stuff to Blow Your 1127 01:01:46,520 --> 01:01:56,600 Speaker 1: Mind dot com. Stuff to Blow Your Mind is a production 1128 01:01:56,680 --> 01:01:59,040 Speaker 1: of iHeartRadio. For more podcasts from iHeartRadio, 1129 01:01:59,120 --> 01:02:02,280 Speaker 1: visit the iHeartRadio app, Apple Podcasts, or wherever 1130 01:02:02,320 --> 01:02:16,280 Speaker 1: you listen to your favorite shows.