1 00:00:15,476 --> 00:00:24,396 Speaker 1: Pushkin. The late American author and media critic Neil Postman 2 00:00:24,436 --> 00:00:29,276 Speaker 1: once famously wrote, technological change is not additive, it is ecological. 3 00:00:29,836 --> 00:00:33,876 Speaker 1: A new technology does not merely add something, it changes everything. 4 00:00:34,676 --> 00:00:37,196 Speaker 1: Postman made this observation all the way back in nineteen 5 00:00:37,236 --> 00:00:40,636 Speaker 1: ninety two, over a decade before smartphones and over a 6 00:00:40,676 --> 00:00:44,636 Speaker 1: decade before the launch of social media platforms like Facebook, Instagram, 7 00:00:44,676 --> 00:00:50,076 Speaker 1: and TikTok. Postman's quote feels particularly relevant today, especially given 8 00:00:50,076 --> 00:00:52,756 Speaker 1: what researchers around the world are learning about the negative 9 00:00:52,756 --> 00:00:56,436 Speaker 1: effects of these technologies. Researchers like today's guest. 10 00:00:56,716 --> 00:01:00,076 Speaker 2: I'm Cass Sunstein. I teach at Harvard. I work on 11 00:01:00,196 --> 00:01:04,356 Speaker 2: law and behavioral science. I've been working for about seven 12 00:01:04,436 --> 00:01:09,436 Speaker 2: years on social media and happiness and the divergence between 13 00:01:09,476 --> 00:01:13,036 Speaker 2: what people choose and what actually makes their lives better. 14 00:01:13,516 --> 00:01:16,236 Speaker 1: You might know Cass from his influential book Nudge, which 15 00:01:16,236 --> 00:01:19,396 Speaker 1: he co-authored with the Nobel Prize winning economist Richard Thaler. 16 00:01:19,956 --> 00:01:23,036 Speaker 1: Nudge explores how small changes in our environments can influence 17 00:01:23,036 --> 00:01:26,116 Speaker 1: the choices we make. Or you may know Cass from 18 00:01:26,116 --> 00:01:29,116 Speaker 1: his work in the Obama administration, where he helped bring 19 00:01:29,156 --> 00:01:31,236 Speaker 1: behavioral science into public policy. 20 00:01:31,596 --> 00:01:36,236 Speaker 2: I headed the Office of Information and Regulatory Affairs, analyzing 21 00:01:36,316 --> 00:01:39,836 Speaker 2: the effects of regulations to make sure that the benefits 22 00:01:39,876 --> 00:01:40,996 Speaker 2: are higher than the costs. 23 00:01:41,436 --> 00:01:43,596 Speaker 1: Cass is one of the scholars in behavioral science that 24 00:01:43,676 --> 00:01:46,276 Speaker 1: I really look up to. He's also one of the 25 00:01:46,356 --> 00:01:49,436 Speaker 1: most prolific academics I've ever met. I've lost count of 26 00:01:49,476 --> 00:01:51,596 Speaker 1: the number of books he's written. I think it's well 27 00:01:51,636 --> 00:01:54,476 Speaker 1: over fifty at this point, and that doesn't even include 28 00:01:54,516 --> 00:01:57,756 Speaker 1: the hundreds of academic articles he's authored, most of which 29 00:01:57,756 --> 00:02:00,916 Speaker 1: are about some strange or unexpected aspect of human behavior. 30 00:02:01,716 --> 00:02:04,396 Speaker 1: And this year, Cass adds to that long list as 31 00:02:04,436 --> 00:02:07,116 Speaker 1: one of the authors of the World Happiness Report, an 32 00:02:07,156 --> 00:02:10,476 Speaker 1: annual academic publication about the state of global well being. 33 00:02:11,276 --> 00:02:14,196 Speaker 1: Each year the World Happiness Report centers on a different theme.
34 00:02:14,556 --> 00:02:17,556 Speaker 1: The twenty twenty six report is all about how technology 35 00:02:17,676 --> 00:02:21,716 Speaker 1: affects human happiness, and in true Cass Sunstein fashion, his 36 00:02:21,916 --> 00:02:25,276 Speaker 1: chapter in this year's report introduces an important new concept, 37 00:02:25,716 --> 00:02:28,556 Speaker 1: one that I find super helpful for making sense of 38 00:02:28,596 --> 00:02:31,836 Speaker 1: all the irrational ways we get stuck online and behaviors 39 00:02:31,836 --> 00:02:34,236 Speaker 1: that tend to decrease our health and our happiness. 40 00:02:34,476 --> 00:02:37,516 Speaker 2: People are thinking, given the fact that there is this 41 00:02:37,596 --> 00:02:40,876 Speaker 2: platform and my people are on it, I'm going to 42 00:02:40,916 --> 00:02:43,396 Speaker 2: stay on and I'm going to get off it kicking 43 00:02:43,436 --> 00:02:46,996 Speaker 2: and screaming. But do I like this status quo? I 44 00:02:47,036 --> 00:02:48,916 Speaker 2: do not like the status quo at all. 45 00:02:49,716 --> 00:02:52,356 Speaker 1: So if you're feeling trapped by your relationship with technology 46 00:02:52,356 --> 00:02:55,476 Speaker 1: and social media, stay tuned, because Cass will share this 47 00:02:55,516 --> 00:02:58,596 Speaker 1: exciting new concept from his chapter, why this new concept 48 00:02:58,676 --> 00:03:01,596 Speaker 1: is so important, and what understanding it means for escaping 49 00:03:01,596 --> 00:03:05,316 Speaker 1: the trap of social media platforms. All that when the 50 00:03:05,316 --> 00:03:08,676 Speaker 1: Happiness Lab returns right after some quick ads from our sponsors. 51 00:03:20,356 --> 00:03:23,716 Speaker 1: Behavioral scientist and legal scholar Cass Sunstein spends a lot 52 00:03:23,756 --> 00:03:25,796 Speaker 1: of time thinking about ways that we can shape our 53 00:03:25,836 --> 00:03:29,796 Speaker 1: behavior to feel happier and healthier, and he's particularly interested 54 00:03:29,836 --> 00:03:32,556 Speaker 1: in cases in which people behave in ways that decrease 55 00:03:32,596 --> 00:03:36,316 Speaker 1: what economists call their utility. Since I'm guessing that many 56 00:03:36,316 --> 00:03:39,316 Speaker 1: of my listeners aren't trained economists, I asked Cass for 57 00:03:39,356 --> 00:03:41,156 Speaker 1: a quick definition of utility. 58 00:03:41,396 --> 00:03:43,196 Speaker 2: Well, it basically means well being. 59 00:03:43,636 --> 00:03:46,996 Speaker 3: So if you have a day where you're really enjoying 60 00:03:47,036 --> 00:03:51,196 Speaker 3: it and maybe life is very meaningful, and you think 61 00:03:51,236 --> 00:03:53,716 Speaker 3: at the end that was such a great day, that 62 00:03:53,836 --> 00:03:55,756 Speaker 3: had a lot of utility. If 63 00:03:55,636 --> 00:03:57,836 Speaker 2: you had a day where you were in pain or 64 00:03:57,876 --> 00:04:01,756 Speaker 2: struggling or sad or scared or worried or depressed, that 65 00:04:01,796 --> 00:04:04,516 Speaker 2: would be a low utility day. So to think of 66 00:04:04,636 --> 00:04:08,396 Speaker 2: utility as pretty close to synonymous with well being is 67 00:04:08,596 --> 00:04:09,636 Speaker 2: fundamentally right. 68 00:04:09,956 --> 00:04:13,196 Speaker 1: Economists tend to assume that people are rational, that is, 69 00:04:13,276 --> 00:04:16,636 Speaker 1: they should consistently behave in ways that maximize their utility.
70 00:04:17,196 --> 00:04:20,716 Speaker 1: But of course we do irrational stuff all the time, like, 71 00:04:20,756 --> 00:04:24,276 Speaker 1: for example, spending hour after hour scrolling on Instagram or 72 00:04:24,276 --> 00:04:28,196 Speaker 1: TikTok when that behavior makes us feel unproductive and pretty gross. 73 00:04:28,436 --> 00:04:30,916 Speaker 1: So why on earth do so-called rational creatures 74 00:04:30,956 --> 00:04:33,436 Speaker 1: like us waste so much time on platforms that don't 75 00:04:33,436 --> 00:04:36,396 Speaker 1: even feel good? That was what Cass set out to 76 00:04:36,436 --> 00:04:39,356 Speaker 1: explain in his chapter in this year's World Happiness Report, 77 00:04:39,996 --> 00:04:43,876 Speaker 1: and his explanation involves recognizing something new: that social media 78 00:04:44,116 --> 00:04:47,476 Speaker 1: isn't just a typical kind of product. Instead, it falls 79 00:04:47,516 --> 00:04:50,556 Speaker 1: into the category of what Cass calls a product trap. 80 00:04:50,756 --> 00:04:54,796 Speaker 2: A product trap is something where people buy it because 81 00:04:54,836 --> 00:04:57,636 Speaker 2: there's some negative thing that happens if they're not the 82 00:04:57,636 --> 00:05:01,836 Speaker 2: one who's buying it. So people sometimes buy goods whose 83 00:05:01,836 --> 00:05:06,756 Speaker 2: existence they deplore. That phenomenon is, I think, keenly interesting 84 00:05:06,836 --> 00:05:10,636 Speaker 2: and pervasive: goods that people consume but they wish they 85 00:05:10,636 --> 00:05:14,796 Speaker 2: weren't around. And social media has that form. So people 86 00:05:14,836 --> 00:05:18,756 Speaker 2: are trapped. They are kind of forced, so to speak, 87 00:05:18,796 --> 00:05:22,436 Speaker 2: into a situation where they're on social media, even though 88 00:05:22,436 --> 00:05:25,756 Speaker 2: they would be happier if social media didn't exist. 89 00:05:26,036 --> 00:05:28,956 Speaker 1: To better understand this idea of a product trap, let's 90 00:05:28,956 --> 00:05:31,276 Speaker 1: turn to the way that the usual sorts of products, 91 00:05:31,316 --> 00:05:34,756 Speaker 1: the non-trappy kind, tend to affect utility. Let's say 92 00:05:34,756 --> 00:05:37,476 Speaker 1: I decide to buy a new blender. There are lots 93 00:05:37,516 --> 00:05:39,676 Speaker 1: of things about my new blender that might affect how 94 00:05:39,756 --> 00:05:42,116 Speaker 1: much I like it, things like how well it blends 95 00:05:42,156 --> 00:05:44,476 Speaker 1: through big chunks of ice, or how easy it is 96 00:05:44,516 --> 00:05:47,596 Speaker 1: to clean. All stuff related to how well that blender 97 00:05:47,636 --> 00:05:50,476 Speaker 1: works in my own kitchen. But one thing that won't 98 00:05:50,516 --> 00:05:53,196 Speaker 1: affect my utility is whether or not lots of other 99 00:05:53,236 --> 00:05:56,116 Speaker 1: people have bought the same blender. But for a small 100 00:05:56,116 --> 00:05:58,796 Speaker 1: subset of products, it matters whether other people buy the 101 00:05:58,796 --> 00:06:01,996 Speaker 1: same thing, either because keeping up with the Joneses is 102 00:06:02,036 --> 00:06:04,316 Speaker 1: the main point of buying that product in the first place, 103 00:06:04,676 --> 00:06:08,316 Speaker 1: think luxury watches or designer handbags, or because the products 104 00:06:08,356 --> 00:06:11,516 Speaker 1: themselves get better simply because more people are using them.
105 00:06:11,956 --> 00:06:16,116 Speaker 1: Think social media platforms like Facebook or Instagram. Products like 106 00:06:16,156 --> 00:06:18,356 Speaker 1: these are what Cass calls product traps. 107 00:06:19,076 --> 00:06:21,916 Speaker 2: And the trap is that the company is able to 108 00:06:21,956 --> 00:06:25,596 Speaker 2: maneuver you into a situation in which you get the 109 00:06:25,636 --> 00:06:29,156 Speaker 2: thing because you would incur some sort of social cost 110 00:06:29,276 --> 00:06:30,556 Speaker 2: if you weren't engaged. 111 00:06:30,836 --> 00:06:33,316 Speaker 1: So that's a product trap. And now that you have 112 00:06:33,396 --> 00:06:35,436 Speaker 1: a word for this concept, I bet you're going to 113 00:06:35,476 --> 00:06:40,076 Speaker 1: start noticing product traps everywhere. This is a vocabulary term 114 00:06:40,116 --> 00:06:42,516 Speaker 1: that I introduce in my Yale happiness class, and it 115 00:06:42,596 --> 00:06:45,196 Speaker 1: is the one that the Yale students resonate with the most. 116 00:06:45,236 --> 00:06:47,396 Speaker 1: They are just like, I'm so happy that there is 117 00:06:47,436 --> 00:06:49,356 Speaker 1: a word for this thing that I have been in 118 00:06:49,436 --> 00:06:52,036 Speaker 1: for a long time, because I think that so many 119 00:06:52,116 --> 00:06:54,436 Speaker 1: of the goods that they're supposed to get in life 120 00:06:54,476 --> 00:06:56,596 Speaker 1: wind up being product traps, or just the things that 121 00:06:56,596 --> 00:06:59,516 Speaker 1: they use all the time. Students brought up filters on 122 00:06:59,596 --> 00:07:01,796 Speaker 1: the photos that they use. They know it makes them 123 00:07:01,796 --> 00:07:03,876 Speaker 1: look kind of weird and that it's not great for 124 00:07:03,916 --> 00:07:05,636 Speaker 1: our body image that we're all using them, but they 125 00:07:05,636 --> 00:07:07,956 Speaker 1: don't want to be the one person who's not filtering 126 00:07:07,956 --> 00:07:10,436 Speaker 1: their photos. For people in middle age like me, it would be 127 00:07:10,436 --> 00:07:13,236 Speaker 1: things like Botox or, you know, supplements, or these things 128 00:07:13,276 --> 00:07:15,516 Speaker 1: where it's like I just wish the world didn't have 129 00:07:15,596 --> 00:07:18,076 Speaker 1: these things, but in fact, given that the world does 130 00:07:18,116 --> 00:07:19,876 Speaker 1: have them, I feel like I have to use them too. 131 00:07:19,876 --> 00:07:21,756 Speaker 1: And when I was talking with my production team, they 132 00:07:21,756 --> 00:07:24,356 Speaker 1: brought up Elf on the Shelf, which is a holiday 133 00:07:24,396 --> 00:07:26,756 Speaker 1: example of this, where it's like, if the kid down 134 00:07:26,756 --> 00:07:28,876 Speaker 1: the hall's parents are doing Elf on the Shelf, 135 00:07:28,916 --> 00:07:30,556 Speaker 1: you feel like you have to do it too, but 136 00:07:30,636 --> 00:07:32,116 Speaker 1: it just kind of makes you miserable. 137 00:07:32,396 --> 00:07:35,316 Speaker 2: I'll give you two examples, if I may. My sister 138 00:07:35,436 --> 00:07:38,756 Speaker 2: decided one Christmas that the adults would not give each 139 00:07:38,796 --> 00:07:43,156 Speaker 2: other presents anymore. The presents would only go to children.
140 00:07:43,596 --> 00:07:47,276 Speaker 2: And everyone thought that was an extremely great thing because 141 00:07:47,276 --> 00:07:50,596 Speaker 2: for years we've been giving each other presents where there 142 00:07:50,676 --> 00:07:54,676 Speaker 2: was no benefit, mostly because people would struggle to find 143 00:07:54,716 --> 00:07:58,036 Speaker 2: something people would like and people didn't really need another 144 00:07:58,516 --> 00:08:03,116 Speaker 2: tie or whatever. Another example is, you know, I go 145 00:08:03,236 --> 00:08:07,556 Speaker 2: to Ireland because my wife is Irish and I drink 146 00:08:07,796 --> 00:08:11,556 Speaker 2: a little bit of alcohol, even though I don't drink 147 00:08:11,596 --> 00:08:15,116 Speaker 2: alcohol anywhere but Ireland, and the reason is, among my 148 00:08:15,196 --> 00:08:19,396 Speaker 2: beloved Irish relatives, if I say, sorry, I'm not drinking, the 149 00:08:19,836 --> 00:08:26,036 Speaker 2: reaction is maybe he's an alcoholic, or maybe he's very 150 00:08:26,076 --> 00:08:29,916 Speaker 2: negative about drinkers. Now, I'm not negative 151 00:08:29,916 --> 00:08:32,756 Speaker 2: about drinkers, but I give that kind of signal to 152 00:08:32,836 --> 00:08:36,316 Speaker 2: some Irish relatives. And the idea is that this is 153 00:08:36,356 --> 00:08:38,116 Speaker 2: a pervasive thing. 154 00:08:38,676 --> 00:08:41,916 Speaker 1: Right, It's not just about the other people using these products. 155 00:08:41,956 --> 00:08:44,156 Speaker 1: It's about what happens to you if you're the one 156 00:08:44,196 --> 00:08:46,956 Speaker 1: person who chooses not to use this product. And this 157 00:08:47,076 --> 00:08:49,996 Speaker 1: is what you've called a consumption spillover or a negative 158 00:08:50,036 --> 00:08:52,956 Speaker 1: non-user externality. Walk me through how that works. 159 00:08:53,236 --> 00:08:56,516 Speaker 2: The non-user externality. That's a little fussier term that 160 00:08:56,556 --> 00:08:59,876 Speaker 2: we're using, where if you're not using the thing, you suffer. 161 00:09:00,156 --> 00:09:04,076 Speaker 2: So let's suppose there's a party on a Monday night. 162 00:09:04,396 --> 00:09:08,236 Speaker 2: Everyone's going, and people might think I'm certainly going to 163 00:09:08,276 --> 00:09:10,836 Speaker 2: go to the party, because if I don't go, I'll 164 00:09:10,836 --> 00:09:13,836 Speaker 2: be giving a signal to people that I don't like parties, 165 00:09:13,876 --> 00:09:16,996 Speaker 2: that I don't like the hosts, that I'm an antisocial person, 166 00:09:17,036 --> 00:09:19,316 Speaker 2: that I'm a workaholic, or that I really like this 167 00:09:19,396 --> 00:09:23,916 Speaker 2: show on Netflix. So not going imposes a cost on you, 168 00:09:24,356 --> 00:09:26,876 Speaker 2: and the cost might be, you know, self-perception, or 169 00:09:26,916 --> 00:09:28,916 Speaker 2: you might think that the other people are going to 170 00:09:28,996 --> 00:09:31,356 Speaker 2: say, what's wrong with that guy? But it might be 171 00:09:31,396 --> 00:09:33,916 Speaker 2: there are some social events that you go to because 172 00:09:33,916 --> 00:09:36,316 Speaker 2: you kind of have to, but if they were canceled, 173 00:09:36,356 --> 00:09:38,996 Speaker 2: you'd think life is a little bit better.
174 00:09:39,516 --> 00:09:41,596 Speaker 1: And a lot of these product traps, where the consumption 175 00:09:41,636 --> 00:09:44,276 Speaker 1: spillover occurs, is because of a specific emotion, which 176 00:09:44,316 --> 00:09:46,916 Speaker 1: is this emotion of FOMO. Right. This is what my 177 00:09:46,996 --> 00:09:49,076 Speaker 1: Yale students talk about and why I think these product 178 00:09:49,076 --> 00:09:51,956 Speaker 1: traps are so powerful in college students: it's 179 00:09:52,036 --> 00:09:53,956 Speaker 1: literally affecting your sense of belonging. 180 00:09:54,476 --> 00:09:58,276 Speaker 2: So there's a connection between the product trap phenomenon and the 181 00:09:58,316 --> 00:10:02,556 Speaker 2: behavioral phenomenon of loss aversion. People dislike losses, or at least tend 182 00:10:02,636 --> 00:10:06,516 Speaker 2: to anticipate disliking losses, about twice as much 183 00:10:06,596 --> 00:10:10,276 Speaker 2: as they like equivalent gains. If you think that I'm 184 00:10:10,316 --> 00:10:14,356 Speaker 2: going to miss out, let's say on an Instagram something 185 00:10:14,876 --> 00:10:18,836 Speaker 2: or on a TikTok something, that's a loss, and loss 186 00:10:18,876 --> 00:10:23,116 Speaker 2: makes people feel very nervous. It's a distinctive kind of 187 00:10:23,156 --> 00:10:26,196 Speaker 2: fear of missing out, which is accompanied by a thought 188 00:10:26,236 --> 00:10:29,196 Speaker 2: that the thing that you're missing out on you wish 189 00:10:29,236 --> 00:10:32,836 Speaker 2: weren't occurring, and the fact that people get trapped in 190 00:10:32,916 --> 00:10:36,676 Speaker 2: this because of social norms or because of, let's say, 191 00:10:36,716 --> 00:10:41,956 Speaker 2: agile company behavior, that's really concerning. It might be that anyone 192 00:10:42,356 --> 00:10:48,036 Speaker 2: can entrap almost anyone by triggering fear of missing out. 193 00:10:48,476 --> 00:10:50,756 Speaker 1: One of the domains where my students instantly saw that 194 00:10:50,836 --> 00:10:54,316 Speaker 1: they're dealing with a product trap, interestingly, was with AI, 195 00:10:54,636 --> 00:10:57,956 Speaker 1: and specifically the use of AI to do their schoolwork, 196 00:10:58,276 --> 00:11:00,436 Speaker 1: the cheat-bot use of AI, as they call it, 197 00:11:00,756 --> 00:11:02,596 Speaker 1: because I think all of them want to learn the 198 00:11:02,676 --> 00:11:05,156 Speaker 1: material and do the essay on their own and get 199 00:11:05,196 --> 00:11:07,396 Speaker 1: the sense of purpose that comes from that. But if 200 00:11:07,396 --> 00:11:09,996 Speaker 1: they know that everyone in the class is using LLMs, 201 00:11:10,036 --> 00:11:12,036 Speaker 1: they feel like, well, I'm a chump if I'm putting 202 00:11:12,076 --> 00:11:14,116 Speaker 1: my time into this. I should just use these same 203 00:11:14,196 --> 00:11:16,956 Speaker 1: cheat-bot tools that everyone else is. It seems like many 204 00:11:16,956 --> 00:11:19,116 Speaker 1: of them aren't using it because of their own individual 205 00:11:19,116 --> 00:11:21,116 Speaker 1: benefit from cheating. Many of them just kind of hated 206 00:11:21,116 --> 00:11:22,836 Speaker 1: the idea that everybody else is using it, but they 207 00:11:22,836 --> 00:11:24,356 Speaker 1: feel like, well, now I got to use it too. 208 00:11:24,636 --> 00:11:28,516 Speaker 2: That's a great example.
So AI for many students is 209 00:11:28,796 --> 00:11:32,756 Speaker 2: a product trap where you wish it didn't exist, but 210 00:11:32,836 --> 00:11:37,076 Speaker 2: contingent on its existence, you have to use it. That's 211 00:11:37,076 --> 00:11:40,476 Speaker 2: a different mechanism, kind of, from fear of missing out. 212 00:11:40,676 --> 00:11:44,036 Speaker 2: It's that you would be performing less well. 213 00:11:44,356 --> 00:11:46,036 Speaker 1: So product traps are bad for the people who get 214 00:11:46,036 --> 00:11:48,876 Speaker 1: trapped using these products. But what about for the companies 215 00:11:48,876 --> 00:11:51,116 Speaker 1: that make these products? It kind of seems like a 216 00:11:51,116 --> 00:11:51,916 Speaker 1: good deal for them. 217 00:11:52,116 --> 00:11:57,596 Speaker 2: Oh yeah, it's fantastic. So in business schools, this relatively 218 00:11:57,676 --> 00:12:02,556 Speaker 2: new stuff should be taught as a technique for attracting customers. 219 00:12:03,036 --> 00:12:06,116 Speaker 2: So if you're trying to sell a product, to say 220 00:12:06,196 --> 00:12:08,636 Speaker 2: that you don't want to be one of the few 221 00:12:08,716 --> 00:12:13,476 Speaker 2: who don't have it, that can be very effective, especially 222 00:12:13,556 --> 00:12:17,876 Speaker 2: if it is visible. So if you're visibly not someone 223 00:12:18,276 --> 00:12:21,436 Speaker 2: who's using, let's say, a social media platform that everyone 224 00:12:21,516 --> 00:12:24,556 Speaker 2: in your group is using, that triggers something in the 225 00:12:24,596 --> 00:12:27,756 Speaker 2: human brain. And if there's some product that's visible, it 226 00:12:27,876 --> 00:12:31,716 Speaker 2: might be that the exclusion and the cost that is 227 00:12:31,716 --> 00:12:35,756 Speaker 2: imposed on people who aren't included is the principal determinant 228 00:12:35,756 --> 00:12:39,836 Speaker 2: of consumption behavior. So we know that a company would 229 00:12:39,916 --> 00:12:44,036 Speaker 2: do very well in a very cheerful way to emphasize 230 00:12:44,076 --> 00:12:47,596 Speaker 2: the wonderfulness of being part of a large community of 231 00:12:47,636 --> 00:12:51,516 Speaker 2: people who are increasingly visibly buying or engaged in this. 232 00:12:51,876 --> 00:12:54,076 Speaker 1: It also seems like companies are doing everything in their 233 00:12:54,076 --> 00:12:56,356 Speaker 1: power to do this more and more. You can't just 234 00:12:56,436 --> 00:12:58,796 Speaker 1: have a, like, internet game; it has to be an internet 235 00:12:58,796 --> 00:13:01,236 Speaker 1: game where you share your stats with other people. You're 236 00:13:01,356 --> 00:13:04,836 Speaker 1: always constantly showing off whether or not you're using the product, 237 00:13:04,876 --> 00:13:07,516 Speaker 1: which of course contributes to the product trappiness of some 238 00:13:07,556 --> 00:13:08,196 Speaker 1: of these goods. 239 00:13:08,556 --> 00:13:12,916 Speaker 2: Yeah, if you don't post your number, you are giving 240 00:13:12,956 --> 00:13:17,676 Speaker 2: a signal of some maybe embarrassing sort that you're not participating, 241 00:13:17,716 --> 00:13:20,516 Speaker 2: that you're not good at the thing, that you're not playful. 242 00:13:20,956 --> 00:13:25,876 Speaker 2: And all of these things can be profoundly motivating, and 243 00:13:26,156 --> 00:13:29,116 Speaker 2: they can be exploited in a way that makes people 244 00:13:29,156 --> 00:13:29,676 Speaker 2: worse off.
245 00:13:30,076 --> 00:13:32,196 Speaker 1: So now you know what a product trap is. And 246 00:13:32,236 --> 00:13:34,676 Speaker 1: I got to admit it kind of looks like social 247 00:13:34,716 --> 00:13:39,116 Speaker 1: media platforms qualify. But not so fast, because Cass and 248 00:13:39,156 --> 00:13:42,756 Speaker 1: other behavioral scientists have a strict empirical test for determining 249 00:13:42,836 --> 00:13:45,996 Speaker 1: whether a product truly counts as a product trap, one 250 00:13:46,036 --> 00:13:48,876 Speaker 1: that involves a curiously irrational thing that we tend to 251 00:13:48,876 --> 00:13:51,796 Speaker 1: do with our money. We'll hear all about that curious 252 00:13:51,836 --> 00:14:06,116 Speaker 1: monetary behavior right after this quick break. Like many behavioral 253 00:14:06,156 --> 00:14:09,676 Speaker 1: economists, Harvard legal scholar Cass Sunstein spends a lot of 254 00:14:09,676 --> 00:14:13,396 Speaker 1: time thinking about how people spend their money, and more specifically, 255 00:14:13,676 --> 00:14:16,556 Speaker 1: how much people are willing to pay for different products. 256 00:14:16,996 --> 00:14:21,036 Speaker 2: Suppose the question is would you benefit from having a 257 00:14:21,076 --> 00:14:24,476 Speaker 2: book or a dinner? How do we know? Well, we 258 00:14:24,516 --> 00:14:26,756 Speaker 2: can ask how much you're willing to pay for it. It's 259 00:14:26,836 --> 00:14:29,356 Speaker 2: kind of the best measure we have. So if people 260 00:14:29,356 --> 00:14:31,836 Speaker 2: are willing to pay, let's say, fifteen dollars for a 261 00:14:31,836 --> 00:14:35,676 Speaker 2: book and not fifty, then we have some clue of 262 00:14:35,716 --> 00:14:37,836 Speaker 2: what the book is worth to them in terms of 263 00:14:37,876 --> 00:14:41,076 Speaker 2: well being. So willingness to pay is the best real 264 00:14:41,116 --> 00:14:45,116 Speaker 2: world measure we have offhand of what makes people 265 00:14:45,436 --> 00:14:46,076 Speaker 2: better off. 266 00:14:46,716 --> 00:14:49,676 Speaker 1: Economists also have lots of theories about how a rational 267 00:14:49,716 --> 00:14:53,476 Speaker 1: actor's willingness to pay should work. For example, the idea 268 00:14:53,516 --> 00:14:55,676 Speaker 1: that people will have some dollar amount in their heads 269 00:14:55,676 --> 00:14:58,796 Speaker 1: that represents a product's utility, how much it's worth to you. 270 00:14:59,196 --> 00:15:01,516 Speaker 1: So if someone is willing to pay fifty bucks for, 271 00:15:01,596 --> 00:15:04,436 Speaker 1: say, a blender, that dollar amount should be about the 272 00:15:04,436 --> 00:15:07,436 Speaker 1: same whether you're thinking about buying the product or selling it. 273 00:15:07,756 --> 00:15:11,236 Speaker 1: But blenders and books are regular, non-trappy kinds of products. 274 00:15:11,596 --> 00:15:14,436 Speaker 1: Ones that aren't affected all that much by whether other 275 00:15:14,476 --> 00:15:17,676 Speaker 1: people are buying the same product. Cass was interested in 276 00:15:17,676 --> 00:15:21,036 Speaker 1: whether social media platforms work differently than books and blenders, 277 00:15:21,156 --> 00:15:23,436 Speaker 1: and whether they qualified as product traps. 278 00:15:23,876 --> 00:15:26,676 Speaker 2: So I asked people how much they're willing to pay 279 00:15:26,716 --> 00:15:29,716 Speaker 2: for a month of social media platforms.
And then I 280 00:15:29,756 --> 00:15:33,956 Speaker 2: asked a different population, how much money would you demand 281 00:15:34,116 --> 00:15:37,796 Speaker 2: to be off a platform for a month? So the 282 00:15:37,836 --> 00:15:40,556 Speaker 2: setup is people are asked how much would you pay 283 00:15:40,596 --> 00:15:43,276 Speaker 2: to use YouTube, or pay to use Facebook, or pay 284 00:15:43,316 --> 00:15:46,396 Speaker 2: to use Twitter, or how much would you demand to 285 00:15:46,436 --> 00:15:50,076 Speaker 2: be off? And there's a Nobel Prize winning theorem that 286 00:15:50,156 --> 00:15:53,156 Speaker 2: says the number has to be the same, that if 287 00:15:53,156 --> 00:15:56,116 Speaker 2: people are willing to pay, let's say, six dollars for 288 00:15:56,276 --> 00:16:00,076 Speaker 2: a movie ticket, they would demand six dollars to give up 289 00:16:00,156 --> 00:16:03,676 Speaker 2: the movie ticket. Value is value. And I was testing 290 00:16:03,716 --> 00:16:08,116 Speaker 2: whether this idea would be reflected in people's valuation of 291 00:16:08,196 --> 00:16:12,396 Speaker 2: social media platforms. But I got a staggering result. A 292 00:16:12,436 --> 00:16:15,996 Speaker 2: substantial number of people said I'll pay nothing to use 293 00:16:16,076 --> 00:16:19,996 Speaker 2: social media platforms, and the average answer was pretty low, 294 00:16:20,036 --> 00:16:23,356 Speaker 2: like five or ten dollars. So people are saying nothing, 295 00:16:23,676 --> 00:16:26,756 Speaker 2: or they're saying kind of a pittance, to use social 296 00:16:26,756 --> 00:16:29,836 Speaker 2: media platforms. And then, to give up use, I got 297 00:16:29,836 --> 00:16:32,636 Speaker 2: a really big number, like people wanted one hundred 298 00:16:32,676 --> 00:16:36,596 Speaker 2: dollars on average. So the disparity between how much people 299 00:16:36,636 --> 00:16:39,116 Speaker 2: would demand to give up use and how much people 300 00:16:39,156 --> 00:16:43,196 Speaker 2: are willing to pay to use Facebook is twenty to one. 301 00:16:44,036 --> 00:16:46,756 Speaker 2: There's a Nobel Prize winning theorem that says it has 302 00:16:46,796 --> 00:16:47,956 Speaker 2: to be one to one. 303 00:16:48,076 --> 00:16:50,036 Speaker 1: It was really surprising to me when you showed that 304 00:16:50,076 --> 00:16:52,276 Speaker 1: people are just not willing to pay anything to be 305 00:16:52,316 --> 00:16:54,916 Speaker 1: on social media, because if you look at just time use, 306 00:16:55,236 --> 00:16:57,596 Speaker 1: you might have predicted something very different. Right? Our young 307 00:16:57,636 --> 00:16:59,836 Speaker 1: people today, my Yale college students, are on average in 308 00:16:59,836 --> 00:17:03,196 Speaker 1: some studies using these platforms for like four hours a day, 309 00:17:03,276 --> 00:17:05,316 Speaker 1: up to eight hours a day. But you ask them, 310 00:17:05,356 --> 00:17:06,836 Speaker 1: how much is it worth to you if you have 311 00:17:06,876 --> 00:17:09,276 Speaker 1: to pay money for it? And people are like, nothing, I 312 00:17:09,276 --> 00:17:10,956 Speaker 1: would never pay to go on this stuff. What does 313 00:17:10,996 --> 00:17:11,516 Speaker 1: that tell you? 314 00:17:12,196 --> 00:17:15,076 Speaker 2: So there are a couple different possibilities. One is people 315 00:17:15,156 --> 00:17:18,716 Speaker 2: are anchoring on the current price, which is zero.
So 316 00:17:18,836 --> 00:17:20,596 Speaker 2: if the current price is zero and they're asked how 317 00:17:20,636 --> 00:17:24,516 Speaker 2: much will you pay, they'll say zero, or they'll maybe 318 00:17:24,636 --> 00:17:27,476 Speaker 2: adjust a little bit up from zero to five or ten. 319 00:17:27,916 --> 00:17:32,716 Speaker 2: Another explanation is that people think that they're wasting their time. 320 00:17:33,236 --> 00:17:36,676 Speaker 2: So I bet for a certain percentage of my population, 321 00:17:36,836 --> 00:17:39,076 Speaker 2: people thought, yeah, I spend a lot of time on it, 322 00:17:39,116 --> 00:17:41,316 Speaker 2: but it's dumb and I'm not going to buy that 323 00:17:41,756 --> 00:17:44,756 Speaker 2: terrible waste of time. I'll pay you nothing. And then 324 00:17:44,796 --> 00:17:48,116 Speaker 2: there's a third explanation, which is that a number of people 325 00:17:48,236 --> 00:17:51,876 Speaker 2: might have just been mad. So, having enjoyed, so to speak, 326 00:17:51,916 --> 00:17:54,596 Speaker 2: a good for free, they're then asked how much would 327 00:17:54,636 --> 00:17:56,956 Speaker 2: you pay for it? They say, what are you talking about? 328 00:17:56,996 --> 00:17:59,316 Speaker 2: This is free. Like if people are asked how much 329 00:17:59,356 --> 00:18:01,996 Speaker 2: would you pay for clean air or the opportunity to 330 00:18:02,036 --> 00:18:06,516 Speaker 2: breathe oxygen? They might say zero in a survey because 331 00:18:06,556 --> 00:18:10,036 Speaker 2: they're rebelling against the very idea they'd have to pay. 332 00:18:10,356 --> 00:18:13,516 Speaker 2: So those are three explanations. I think the most fun 333 00:18:13,596 --> 00:18:17,276 Speaker 2: explanation is that people think they're wasting time. So we 334 00:18:17,356 --> 00:18:21,036 Speaker 2: need a category. Let's call them wasting-time goods, where 335 00:18:21,076 --> 00:18:24,516 Speaker 2: people devote a lot of minutes and maybe even hours 336 00:18:24,556 --> 00:18:27,876 Speaker 2: to a thing, but they know on reflection that it's 337 00:18:27,916 --> 00:18:29,996 Speaker 2: not doing them any good, so they're not going to 338 00:18:29,996 --> 00:18:30,556 Speaker 2: pay for them. 339 00:18:30,876 --> 00:18:34,236 Speaker 1: And so researchers did a study which actually paid people 340 00:18:34,276 --> 00:18:36,716 Speaker 1: to get off Facebook for a month. Tell me about 341 00:18:36,756 --> 00:18:37,236 Speaker 1: that study. 342 00:18:37,596 --> 00:18:41,116 Speaker 2: So there's this study by Allcott and others that has one 343 00:18:41,196 --> 00:18:44,356 Speaker 2: thing very much in common with mine. What was elicited 344 00:18:44,636 --> 00:18:47,316 Speaker 2: was how much would you demand to be off? The 345 00:18:47,396 --> 00:18:51,076 Speaker 2: difference is people were actually paid to be off, and 346 00:18:51,116 --> 00:18:53,196 Speaker 2: then it turns out they have a good month, so 347 00:18:53,236 --> 00:18:56,316 Speaker 2: they're more satisfied with their life, they're less depressed, they're 348 00:18:56,356 --> 00:18:59,836 Speaker 2: less anxious, and by every measure that's thrown at them, it's 349 00:18:59,876 --> 00:19:03,756 Speaker 2: a good month. And then they're asked after that good month, 350 00:19:04,156 --> 00:19:06,996 Speaker 2: how much would you, the relevant person who had a 351 00:19:07,036 --> 00:19:10,916 Speaker 2: good month, demand to be off Facebook?
And on average 352 00:19:10,916 --> 00:19:14,236 Speaker 2: people give approximately the same number they gave before they 353 00:19:14,276 --> 00:19:17,436 Speaker 2: experienced the good month. Having said one hundred dollars the 354 00:19:17,476 --> 00:19:20,716 Speaker 2: first round, the average answer the second round is eighty 355 00:19:20,756 --> 00:19:24,556 Speaker 2: six dollars. Now, the part that's kind of intuitive about 356 00:19:24,556 --> 00:19:27,436 Speaker 2: that is eighty six is lower than one hundred. So 357 00:19:27,516 --> 00:19:29,556 Speaker 2: people learn that it's kind of good to be off, 358 00:19:29,636 --> 00:19:33,076 Speaker 2: so they don't demand as much. But the wild part is 359 00:19:33,116 --> 00:19:36,156 Speaker 2: that having had a good month, they should say, you 360 00:19:36,156 --> 00:19:38,836 Speaker 2: don't have to pay me anything. I'm getting off. This 361 00:19:38,876 --> 00:19:41,796 Speaker 2: is not a good thing for me. I just learned 362 00:19:41,796 --> 00:19:43,956 Speaker 2: I had a great month. And we found none of 363 00:19:43,956 --> 00:19:47,356 Speaker 2: that. They asked for eighty six. That's weird. The 364 00:19:47,596 --> 00:19:50,116 Speaker 2: authors of the study don't know how to explain it. 365 00:19:50,076 --> 00:19:54,036 Speaker 1: But Cass did have an explanation. Cass reasoned 366 00:19:54,076 --> 00:19:57,036 Speaker 1: that the participants may totally recognize that they don't enjoy 367 00:19:57,116 --> 00:19:59,836 Speaker 1: being on TikTok or Instagram, but they feel like they 368 00:19:59,836 --> 00:20:03,276 Speaker 1: have to be because everyone else is. They were suffering 369 00:20:03,316 --> 00:20:06,676 Speaker 1: from that negative non-user externality that Cass mentioned before 370 00:20:06,716 --> 00:20:10,156 Speaker 1: the break. Their own utility was worse because so many 371 00:20:10,196 --> 00:20:14,476 Speaker 1: other people were using social media platforms. They were product-trapped. 372 00:20:14,876 --> 00:20:18,436 Speaker 1: But was Cass's hypothesis right? Well, researchers recruited a new 373 00:20:18,436 --> 00:20:21,356 Speaker 1: group of participants and asked them a different willingness to 374 00:20:21,396 --> 00:20:22,196 Speaker 1: pay question. 375 00:20:22,276 --> 00:20:24,836 Speaker 2: How much would you demand to be off contingent on 376 00:20:24,916 --> 00:20:28,956 Speaker 2: everyone in your community being off? That's the product trap question. 377 00:20:29,556 --> 00:20:31,876 Speaker 2: And what they found was, if you ask people how 378 00:20:31,956 --> 00:20:33,996 Speaker 2: much they would demand to be off, you get the 379 00:20:33,996 --> 00:20:38,196 Speaker 2: standard answers, as in my study. So people say, I 380 00:20:38,276 --> 00:20:40,916 Speaker 2: will require you to pay a lot to be off 381 00:20:41,036 --> 00:20:45,676 Speaker 2: these platforms, in this case TikTok and Instagram: fifty dollars, 382 00:20:45,756 --> 00:20:48,276 Speaker 2: sixty dollars, seventy dollars, one hundred dollars. People are going 383 00:20:48,316 --> 00:20:50,716 Speaker 2: to demand real money to be off. But then if 384 00:20:50,756 --> 00:20:53,316 Speaker 2: people, and these are college students, are asked how much 385 00:20:53,356 --> 00:20:56,796 Speaker 2: would you demand to be off contingent on everyone in 386 00:20:56,836 --> 00:21:00,156 Speaker 2: your community being off?
Then they say, if everyone in 387 00:21:00,156 --> 00:21:03,356 Speaker 2: the community is off, then I will pay you. You 388 00:21:03,356 --> 00:21:06,076 Speaker 2: don't have to pay me a nickel. That's the dominant 389 00:21:06,156 --> 00:21:10,516 Speaker 2: sentiment, and the explanation there is a little more intuitive, 390 00:21:10,556 --> 00:21:14,036 Speaker 2: I think, which is people are thinking, given the fact 391 00:21:14,076 --> 00:21:17,556 Speaker 2: that there is this platform and my people are on it, 392 00:21:17,956 --> 00:21:20,156 Speaker 2: I'm going to stay on and I'm going to get 393 00:21:20,156 --> 00:21:23,476 Speaker 2: off it kicking and screaming. But do I like this 394 00:21:23,596 --> 00:21:26,716 Speaker 2: status quo? I do not like the status quo at all. 395 00:21:27,196 --> 00:21:29,116 Speaker 2: If you ask me whether I want to live in 396 00:21:29,156 --> 00:21:32,956 Speaker 2: a world that didn't have TikTok or Instagram, large numbers 397 00:21:32,996 --> 00:21:36,796 Speaker 2: of people say absolutely, I'll pay you real money to 398 00:21:36,876 --> 00:21:39,956 Speaker 2: produce it. This is a very profound finding, and we're 399 00:21:39,996 --> 00:21:44,676 Speaker 2: now studying it in multiple domains with respect to Starbucks 400 00:21:44,756 --> 00:21:49,436 Speaker 2: and iPhones and multiple products. But there's an assortment of 401 00:21:49,516 --> 00:21:53,996 Speaker 2: goods which people enjoy in the sense that they devote 402 00:21:53,996 --> 00:21:57,516 Speaker 2: time or money to them only because other people are 403 00:21:57,636 --> 00:22:00,796 Speaker 2: enjoying them in that sense. But they're not really enjoying 404 00:22:00,796 --> 00:22:01,436 Speaker 2: them at all. 405 00:22:01,796 --> 00:22:03,876 Speaker 1: I mean, this is pretty fascinating just as a happiness 406 00:22:03,876 --> 00:22:06,236 Speaker 1: researcher in general, because I think we're so locked into 407 00:22:06,276 --> 00:22:10,676 Speaker 1: people's individual utility that we tend not to think about collective utility. 408 00:22:11,156 --> 00:22:13,236 Speaker 1: But I think this is a really special case of 409 00:22:13,276 --> 00:22:16,676 Speaker 1: collective utility where it's like, the collective utility is itself 410 00:22:16,716 --> 00:22:20,116 Speaker 1: making us choose things that are bad for us, right? 411 00:22:20,196 --> 00:22:24,276 Speaker 2: Completely. And it's a really tough collective action problem to get 412 00:22:24,316 --> 00:22:25,956 Speaker 2: yourself out of a product trap. 413 00:22:26,676 --> 00:22:29,796 Speaker 1: It is a really tough collective action problem, but it's 414 00:22:29,796 --> 00:22:33,276 Speaker 1: also not impossible. We'll explore what we can do to 415 00:22:33,396 --> 00:22:37,076 Speaker 1: escape product traps when the Happiness Lab returns from the break. 416 00:22:49,196 --> 00:22:52,436 Speaker 1: I'm speaking with Harvard law professor and behavioral scientist Cass 417 00:22:52,516 --> 00:22:55,996 Speaker 1: Sunstein about his chapter in this year's World Happiness Report. 418 00:22:56,356 --> 00:22:58,956 Speaker 1: The research Cass has shared so far paints a pretty 419 00:22:58,956 --> 00:23:02,716 Speaker 1: bleak picture. Even when people recognize that social media hurts 420 00:23:02,756 --> 00:23:05,836 Speaker 1: their well being, many still feel compelled to keep using 421 00:23:05,876 --> 00:23:08,316 Speaker 1: it so long as other people are using it too.
422 00:23:09,116 --> 00:23:12,156 Speaker 1: Social media, in other words, is a classic product trap, 423 00:23:12,716 --> 00:23:16,476 Speaker 1: which frankly kind of sucks. So, given that most people 424 00:23:16,516 --> 00:23:20,036 Speaker 1: aren't leaving these platforms anytime soon, how do we break free? 425 00:23:20,316 --> 00:23:23,836 Speaker 1: Cass says that behavioral science suggests at least three different paths forward. 426 00:23:24,356 --> 00:23:29,716 Speaker 2: First, there are individuals, communities of individuals. There are companies, 427 00:23:30,036 --> 00:23:35,516 Speaker 2: and there are regulators. So, first line of defense: communities. So 428 00:23:35,596 --> 00:23:39,276 Speaker 2: people can band together like my sister and my family 429 00:23:39,316 --> 00:23:42,436 Speaker 2: did to say adults aren't going to get Christmas presents. 430 00:23:42,876 --> 00:23:45,796 Speaker 2: You can have a community of we're not going to 431 00:23:45,796 --> 00:23:50,916 Speaker 2: give our kids cell phones until eighth grade. You're exiting 432 00:23:50,956 --> 00:23:54,276 Speaker 2: from the product trap by virtue of some kind of 433 00:23:54,316 --> 00:23:59,156 Speaker 2: collective agreement. This is happening all over the country. Schools 434 00:23:59,236 --> 00:24:03,116 Speaker 2: can help by saying no cell phones in schools, and 435 00:24:03,196 --> 00:24:07,076 Speaker 2: that can be supplemented by parental efforts. Or there can 436 00:24:07,116 --> 00:24:10,836 Speaker 2: be agreements among people that just say we're going to 437 00:24:10,876 --> 00:24:14,156 Speaker 2: limit our time on social media. And this is just 438 00:24:14,396 --> 00:24:18,036 Speaker 2: a self-help remedy on the part of groups who 439 00:24:18,036 --> 00:24:20,956 Speaker 2: are alert to the existence of the product trap, who 440 00:24:20,996 --> 00:24:26,036 Speaker 2: can publicize its existence and make things better. And better 441 00:24:26,396 --> 00:24:30,116 Speaker 2: is good, even if it doesn't lead to perfection. Then 442 00:24:30,156 --> 00:24:33,116 Speaker 2: there are the companies, and they're right now in a 443 00:24:33,316 --> 00:24:37,316 Speaker 2: bind where it appears that some of their economic interests 444 00:24:37,396 --> 00:24:43,036 Speaker 2: are competing with their values and also their desire not 445 00:24:43,156 --> 00:24:46,396 Speaker 2: to get a regulatory hammer. Instagram has done a bunch 446 00:24:46,436 --> 00:24:49,396 Speaker 2: of things to try to discourage young people from being 447 00:24:49,436 --> 00:24:53,076 Speaker 2: on their platform and to get more sleep. So there's a 448 00:24:53,076 --> 00:24:56,516 Speaker 2: lot that the companies could do. On the regulatory side, 449 00:24:56,556 --> 00:24:59,796 Speaker 2: I'd be very cautious, just because I'm that kind of 450 00:24:59,836 --> 00:25:02,796 Speaker 2: guy and because we're talking about speech. But you could 451 00:25:02,836 --> 00:25:08,196 Speaker 2: imagine disclosure requirements, for example. Or, when I was in 452 00:25:08,236 --> 00:25:11,516 Speaker 2: the government, we would issue guidance documents, and if the 453 00:25:11,556 --> 00:25:14,996 Speaker 2: government has a best practice for, let's say, social media 454 00:25:15,476 --> 00:25:18,956 Speaker 2: platforms with respect to product traps, that can do a 455 00:25:18,996 --> 00:25:22,876 Speaker 2: lot of good.
I mean, I'm excited that you're using 456 00:25:22,876 --> 00:25:25,716 Speaker 2: the term product trap and that this has taken off 457 00:25:25,796 --> 00:25:29,396 Speaker 2: among students because the term is actually quite new. It 458 00:25:29,516 --> 00:25:34,236 Speaker 2: wasn't a thing. The very existence of a phrase can 459 00:25:34,676 --> 00:25:38,636 Speaker 2: provide the ingredients for a solution. We're seeing this starting, 460 00:25:38,716 --> 00:25:41,436 Speaker 2: early days, but I'm hopeful that we're going to see 461 00:25:41,436 --> 00:25:43,036 Speaker 2: a lot more in the next six months. 462 00:25:43,476 --> 00:25:45,356 Speaker 1: I love this idea that having the phrase can 463 00:25:45,396 --> 00:25:47,516 Speaker 1: be so, so helpful. One of the things I find 464 00:25:47,636 --> 00:25:50,516 Speaker 1: most compelling about product traps was how my students reacted 465 00:25:50,516 --> 00:25:53,196 Speaker 1: when I first presented this idea in the happiness class. 466 00:25:53,276 --> 00:25:55,436 Speaker 1: These light bulbs went off where they just felt like 467 00:25:55,996 --> 00:25:58,276 Speaker 1: so much of the stuff that they spend time on 468 00:25:59,036 --> 00:26:01,476 Speaker 1: is one of these things, these things that they feel 469 00:26:01,516 --> 00:26:04,876 Speaker 1: like has little value for themselves, maybe is time wasting, 470 00:26:05,036 --> 00:26:07,636 Speaker 1: maybe is actively harmful, but they have to do it 471 00:26:07,676 --> 00:26:10,396 Speaker 1: because everybody else is doing it. And I feel like 472 00:26:10,476 --> 00:26:13,356 Speaker 1: with social media, there was an interesting transition talking to 473 00:26:13,356 --> 00:26:16,316 Speaker 1: my students. When it was just the Facebook days of 474 00:26:16,356 --> 00:26:18,956 Speaker 1: social media, I think people didn't realize it was as 475 00:26:18,996 --> 00:26:21,236 Speaker 1: much of a product trap as it was, in part 476 00:26:21,236 --> 00:26:24,516 Speaker 1: because it wasn't so negatively affecting students' utility just kind 477 00:26:24,516 --> 00:26:26,036 Speaker 1: of being on social media. But I think in the 478 00:26:26,076 --> 00:26:28,996 Speaker 1: age of TikToks and Reels, when students feel like they're 479 00:26:29,076 --> 00:26:31,076 Speaker 1: really sucked in and they can kind of feel it 480 00:26:31,116 --> 00:26:33,436 Speaker 1: in the moment of the product use, I think they 481 00:26:33,436 --> 00:26:35,516 Speaker 1: all get this really, really clearly. 482 00:26:36,156 --> 00:26:40,396 Speaker 2: Yes. So people know that they're trapped with respect to 483 00:26:40,436 --> 00:26:44,156 Speaker 2: social media platforms; there are other domains where people are 484 00:26:44,196 --> 00:26:47,716 Speaker 2: trapped and they don't know. And this is a little, 485 00:26:47,836 --> 00:26:52,756 Speaker 2: like, clue about the non-isolated nature of this problem. 486 00:26:53,276 --> 00:26:57,356 Speaker 1: So these product traps are everywhere. Are there past examples 487 00:26:57,436 --> 00:26:59,916 Speaker 1: of people solving them well? What has worked in the 488 00:26:59,956 --> 00:27:00,756 Speaker 1: past to solve them?
489 00:27:00,916 --> 00:27:06,716 Speaker 2: I think liquor and cigarettes are two clear examples where 490 00:27:06,836 --> 00:27:10,116 Speaker 2: young people would smoke and drink and some number of 491 00:27:10,116 --> 00:27:13,796 Speaker 2: them wished people weren't doing that. And there are 492 00:27:13,876 --> 00:27:19,116 Speaker 2: three strategies, which is individuals doing things to prevent the 493 00:27:19,196 --> 00:27:24,556 Speaker 2: product trap from damaging them, or companies trying to do things. 494 00:27:24,996 --> 00:27:27,156 Speaker 2: You know, it might be a ban on smoking in public 495 00:27:27,196 --> 00:27:30,436 Speaker 2: places or something, which has an expressive effect and ripples over, 496 00:27:30,956 --> 00:27:33,596 Speaker 2: or it might be government doing something. So here's a 497 00:27:33,716 --> 00:27:36,196 Speaker 2: very provocative idea which my co-authors and I are 498 00:27:36,236 --> 00:27:39,036 Speaker 2: playing with, which is: we're dealing with a non-user 499 00:27:39,116 --> 00:27:43,796 Speaker 2: externality here, and the way you handle externalities typically is 500 00:27:43,836 --> 00:27:49,396 Speaker 2: to impose a financial thing, so taxes as a response, and 501 00:27:49,476 --> 00:27:52,916 Speaker 2: for cigarettes that's actually been done. There are stiff cigarette taxes, 502 00:27:52,956 --> 00:27:58,076 Speaker 2: which have contributed to the extraordinary reduction in smoking rates 503 00:27:58,076 --> 00:28:01,996 Speaker 2: in the United States. And for many young smokers, it 504 00:28:02,076 --> 00:28:04,596 Speaker 2: took exactly the same form as what we're talking about 505 00:28:04,636 --> 00:28:07,236 Speaker 2: for social media platforms, where young people say, I'm going 506 00:28:07,276 --> 00:28:11,996 Speaker 2: to smoke, sure, but I wish there was no smoking. And if 507 00:28:12,036 --> 00:28:14,996 Speaker 2: we'd run an experiment like the ones we've talked about, 508 00:28:15,516 --> 00:28:17,876 Speaker 2: we would have found that people would demand a lot 509 00:28:17,956 --> 00:28:20,676 Speaker 2: to give up smoking, but would probably pay for a 510 00:28:20,676 --> 00:28:23,156 Speaker 2: world in which no one in their group was smoking. 511 00:28:24,116 --> 00:28:26,156 Speaker 1: And this gets to an idea for which you are 512 00:28:26,236 --> 00:28:28,516 Speaker 1: very well known, this idea that we could find freedom 513 00:28:28,556 --> 00:28:31,636 Speaker 1: in what's known as libertarian paternalism. What is this idea? 514 00:28:32,476 --> 00:28:35,996 Speaker 2: The idea is there are ways of intervening that completely 515 00:28:36,036 --> 00:28:40,716 Speaker 2: respect people's freedom, so they're libertarian, but that also steer 516 00:28:40,836 --> 00:28:44,316 Speaker 2: people in a direction that they wouldn't otherwise go, that 517 00:28:44,436 --> 00:28:50,476 Speaker 2: makes their lives go better. So, some people are libertarians, 518 00:28:50,516 --> 00:28:53,876 Speaker 2: some people are paternalists. They typically don't agree with each other. 519 00:28:54,316 --> 00:28:57,716 Speaker 2: But if you think of a GPS device or a warning, 520 00:28:57,916 --> 00:29:01,796 Speaker 2: for example, that certain foods have allergens, both of those 521 00:29:02,036 --> 00:29:05,796 Speaker 2: are liberty preserving. You can eat the food with the allergens.
522 00:29:05,916 --> 00:29:07,916 Speaker 2: You can say, I see it has shrimp and peanuts 523 00:29:07,916 --> 00:29:10,076 Speaker 2: and I'm allergic to neither of them, and so I'm 524 00:29:10,116 --> 00:29:12,436 Speaker 2: going to go for it. Or you can say, I 525 00:29:12,476 --> 00:29:15,316 Speaker 2: see the GPS device says here's the way to go 526 00:29:15,356 --> 00:29:17,956 Speaker 2: to New Haven. I have my own way, or I 527 00:29:17,956 --> 00:29:20,356 Speaker 2: have a scenic way and I'm going to use that. 528 00:29:20,836 --> 00:29:25,196 Speaker 2: So it's liberty preserving, hence libertarian. But a GPS device 529 00:29:25,316 --> 00:29:28,076 Speaker 2: is paternalistic in that it tells you, here's how you ought 530 00:29:28,076 --> 00:29:32,956 Speaker 2: to go. So nudges are typically a form of libertarian paternalism. 531 00:29:33,396 --> 00:29:37,436 Speaker 2: That is, they preserve your freedom, as when you see 532 00:29:37,516 --> 00:29:41,476 Speaker 2: a nutrition facts panel at the grocery store, it kind 533 00:29:41,516 --> 00:29:44,556 Speaker 2: of pushes you a little bit, but it doesn't take 534 00:29:44,596 --> 00:29:46,076 Speaker 2: your freedom of choice away. 535 00:29:46,276 --> 00:29:48,596 Speaker 1: And so what would libertarian paternalism look like in the 536 00:29:48,636 --> 00:29:49,556 Speaker 1: social media case? 537 00:29:49,796 --> 00:29:52,356 Speaker 2: I'd like to see a lot of libertarian paternalism in 538 00:29:52,396 --> 00:29:55,876 Speaker 2: the social media case, in the first instance, from the companies. 539 00:29:56,236 --> 00:29:59,756 Speaker 2: So if a company says you've been on for five hours, 540 00:30:00,436 --> 00:30:05,916 Speaker 2: consider getting off, that's libertarian paternalism, a little like cars 541 00:30:05,956 --> 00:30:08,316 Speaker 2: that will say, if you've been driving for a long time, 542 00:30:09,196 --> 00:30:12,996 Speaker 2: do you want to take a break? That's libertarian paternalism. The company 543 00:30:13,036 --> 00:30:16,516 Speaker 2: could nudge people to take breaks, to be off their 544 00:30:16,556 --> 00:30:20,996 Speaker 2: platform at certain hours, to join the growing number of people, 545 00:30:21,076 --> 00:30:24,676 Speaker 2: let's say, who aren't using social media late at night. 546 00:30:25,076 --> 00:30:28,436 Speaker 2: There are any number of nudges that companies could use. 547 00:30:28,796 --> 00:30:32,756 Speaker 2: We can also imagine the government requiring disclosure of the 548 00:30:32,796 --> 00:30:39,316 Speaker 2: policies companies adopt to hook users. So sunlight, just as 549 00:30:39,556 --> 00:30:42,676 Speaker 2: Brandeis said, is the best of disinfectants. We could have 550 00:30:42,836 --> 00:30:46,556 Speaker 2: very light-touch regulation, and that would be designed to 551 00:30:46,596 --> 00:30:48,716 Speaker 2: liberate people, maybe, from product traps. 552 00:30:50,076 --> 00:30:52,036 Speaker 1: The good news is it's sounding like, even though these things 553 00:30:52,076 --> 00:30:54,276 Speaker 1: are traps, if we can team up with our communities, 554 00:30:54,316 --> 00:30:55,756 Speaker 1: there are ways we can get out of them.
555 00:30:56,116 --> 00:30:59,316 Speaker 2: Sure. And if we look at the arc of human history, 556 00:30:59,396 --> 00:31:02,316 Speaker 2: even in the last twenty years, there's been an implicit 557 00:31:02,436 --> 00:31:05,956 Speaker 2: understanding that certain things aren't good for us, and 558 00:31:06,036 --> 00:31:10,796 Speaker 2: people have found their way out, again with massive success stories. 559 00:31:11,956 --> 00:31:14,916 Speaker 1: Social media may feel like something we're stuck with, but 560 00:31:15,036 --> 00:31:17,716 Speaker 1: history shows that when we start to recognize patterns that 561 00:31:17,716 --> 00:31:20,876 Speaker 1: make us worse off, we often find ways to change them. 562 00:31:21,196 --> 00:31:23,836 Speaker 1: Step one is to name the problem. The next time 563 00:31:23,836 --> 00:31:26,516 Speaker 1: you're feeling stuck on social media, or with some other 564 00:31:26,556 --> 00:31:29,676 Speaker 1: product you're using just because everyone else is, name what's 565 00:31:29,716 --> 00:31:32,556 Speaker 1: going on. You can even say to yourself, there's a 566 00:31:32,596 --> 00:31:35,396 Speaker 1: reason I'm feeling this way: because I'm dealing with a 567 00:31:35,436 --> 00:31:39,716 Speaker 1: product trap. Step number two: take action. Sometimes that means 568 00:31:39,756 --> 00:31:42,836 Speaker 1: working with your community to set new norms. Sometimes it 569 00:31:42,876 --> 00:31:46,876 Speaker 1: means advocating that the companies involved redesign their products, and 570 00:31:46,916 --> 00:31:50,916 Speaker 1: sometimes it means government stepping in with new rules. Recognizing 571 00:31:50,956 --> 00:31:53,556 Speaker 1: product traps for what they are can help us reshape 572 00:31:53,596 --> 00:31:56,916 Speaker 1: the environment around these technologies so that they serve our 573 00:31:56,956 --> 00:32:01,436 Speaker 1: well being rather than quietly undermining it. That's all for today, 574 00:32:01,636 --> 00:32:03,476 Speaker 1: but if you'd like to learn more about how social 575 00:32:03,556 --> 00:32:06,276 Speaker 1: media functions as a product trap, check out this year's 576 00:32:06,276 --> 00:32:09,396 Speaker 1: World Happiness Report, which you can download for free at 577 00:32:09,436 --> 00:32:13,756 Speaker 1: Worldhappiness dot report. And if you have thoughts about today's episode, 578 00:32:13,796 --> 00:32:16,116 Speaker 1: we'd love to hear them. You can email us at 579 00:32:16,116 --> 00:32:18,956 Speaker 1: Happiness Lab at Pushkin dot fm, or leave us a 580 00:32:18,956 --> 00:32:21,796 Speaker 1: review to tell us what resonated. You can also sign 581 00:32:21,876 --> 00:32:23,996 Speaker 1: up to learn more about the science of happiness and 582 00:32:24,156 --> 00:32:27,876 Speaker 1: join my free newsletter on my website, Doctor Laurie Santos 583 00:32:27,956 --> 00:32:30,716 Speaker 1: dot com. That's d r l a u r i 584 00:32:30,796 --> 00:32:33,916 Speaker 1: e s a n t o s dot com.
We'll 585 00:32:33,956 --> 00:32:35,916 Speaker 1: be back in two weeks with a brand new season 586 00:32:35,996 --> 00:32:38,916 Speaker 1: about how to spring clean your well being, and we'll 587 00:32:38,916 --> 00:32:41,356 Speaker 1: be doing our own in-house spring cleaning as we 588 00:32:41,396 --> 00:32:44,196 Speaker 1: head back into the Happiness Lab archive to dig up 589 00:32:44,196 --> 00:32:46,836 Speaker 1: some of our favorite tips. So be sure to come 590 00:32:46,876 --> 00:32:49,556 Speaker 1: back soon for the next episode of the Happiness Lab 591 00:32:49,636 --> 00:32:51,316 Speaker 1: with me, Doctor Laurie Santos.