1 00:00:00,120 --> 00:00:05,600 Speaker 1: A day in the life of Alice. Alice works at 2 00:00:05,640 --> 00:00:09,119 Speaker 1: the plant. Alice has a strange pay system at her 3 00:00:09,200 --> 00:00:13,080 Speaker 1: job at the plant. Instead of getting weekly paychecks, every day, 4 00:00:13,119 --> 00:00:16,599 Speaker 1: her employer deposits one seventh of her weekly income into 5 00:00:16,600 --> 00:00:19,880 Speaker 1: her bank account. She makes what passes for an average 6 00:00:19,920 --> 00:00:22,560 Speaker 1: median wage in the New US Territories, so this 7 00:00:22,600 --> 00:00:24,960 Speaker 1: means about a hundred and twenty one credits shows up 8 00:00:24,960 --> 00:00:28,479 Speaker 1: in her bank account every day. It's Saturday, she's 9 00:00:28,520 --> 00:00:31,040 Speaker 1: off work, and she's out of food at her apartment, 10 00:00:31,520 --> 00:00:33,920 Speaker 1: so Alice decides to go to the grocery store to 11 00:00:33,960 --> 00:00:36,240 Speaker 1: buy a few things to make herself some lunch and dinner. 12 00:00:37,200 --> 00:00:39,560 Speaker 1: On her way walking to the store, she sees another 13 00:00:39,600 --> 00:00:43,640 Speaker 1: woman walking an extremely cute puppy. Alice goes up to 14 00:00:43,680 --> 00:00:46,280 Speaker 1: ask the woman if she can pet the puppy. The 15 00:00:46,320 --> 00:00:49,680 Speaker 1: woman says, okay, but it will cost you three credits. 16 00:00:50,440 --> 00:00:52,320 Speaker 1: Alice thinks about it for a second, then makes a 17 00:00:52,360 --> 00:00:55,720 Speaker 1: few swipes on her identipad and sends this woman the money. 18 00:00:56,040 --> 00:00:59,400 Speaker 1: She pets the puppy. Of course, she enjoys the experience 19 00:00:59,440 --> 00:01:03,000 Speaker 1: because hey, puppies are great. On the way to the store, 20 00:01:03,040 --> 00:01:06,160 Speaker 1: she stops four other dog walkers and pays them each 21 00:01:06,240 --> 00:01:09,199 Speaker 1: three credits to pet their dogs. Now, dogs are still 22 00:01:09,240 --> 00:01:12,000 Speaker 1: great when you can find them. Petting dogs is still wonderful, 23 00:01:12,280 --> 00:01:15,000 Speaker 1: but Alice notices that each time she stops to pet 24 00:01:15,000 --> 00:01:17,920 Speaker 1: a dog, she gets less happiness out of it than 25 00:01:18,000 --> 00:01:21,319 Speaker 1: she did the last time. Having spent fifteen credits on dogs, 26 00:01:21,319 --> 00:01:23,399 Speaker 1: she's now down to a hundred and six credits for 27 00:01:23,400 --> 00:01:26,679 Speaker 1: the day. Halfway to the store, she walks past the 28 00:01:26,720 --> 00:01:31,280 Speaker 1: scene of an autocar accident: ambulances, a peacekeeper detail. 29 00:01:31,880 --> 00:01:34,720 Speaker 1: She doesn't really want to look, and she feels guilty, 30 00:01:34,880 --> 00:01:38,320 Speaker 1: but she can't help it. The feeling of curiosity is 31 00:01:38,400 --> 00:01:42,319 Speaker 1: so overwhelming she has the irresistible urge to peek and 32 00:01:42,360 --> 00:01:45,400 Speaker 1: see if anybody got hurt, but she gets caught. An 33 00:01:45,480 --> 00:01:48,640 Speaker 1: armored peacekeeper walks up and says, ma'am, I noticed you 34 00:01:48,680 --> 00:01:51,640 Speaker 1: watching the scene. There's a surcharge of twenty six credits 35 00:01:51,680 --> 00:01:55,280 Speaker 1: if you want to do that. Reluctantly, Alice hands over 36 00:01:55,320 --> 00:01:57,760 Speaker 1: the payment and then spends a while watching the scene.
37 00:01:58,360 --> 00:02:00,680 Speaker 1: Now she's down to eighty credits for the day, and 38 00:02:00,720 --> 00:02:03,360 Speaker 1: on top of that, she feels pretty guilty about snooping 39 00:02:03,360 --> 00:02:05,520 Speaker 1: on the wreck in the first place. On the way 40 00:02:05,560 --> 00:02:07,840 Speaker 1: to the store, she has more encounters like this. She 41 00:02:07,920 --> 00:02:10,400 Speaker 1: stops to watch a couple having a heated argument in a 42 00:02:10,520 --> 00:02:13,560 Speaker 1: parking lot. They charge her forty credits to watch, and 43 00:02:13,600 --> 00:02:16,560 Speaker 1: all it does is make her feel bad. Then outside 44 00:02:16,560 --> 00:02:20,240 Speaker 1: the grocery store, a vendor is selling crystal lattes. She 45 00:02:20,360 --> 00:02:23,440 Speaker 1: buys one for five credits and stands there drinking it. 46 00:02:23,440 --> 00:02:26,840 Speaker 1: It is a truly delicious crystal latte, so she buys 47 00:02:26,840 --> 00:02:29,800 Speaker 1: another one, and then another one, and then three more. 48 00:02:30,600 --> 00:02:34,040 Speaker 1: By the time she's done, she is highly overcaffeinated, feels 49 00:02:34,040 --> 00:02:37,400 Speaker 1: twitchy and anxious, and has a horrible stomach ache, and 50 00:02:37,480 --> 00:02:40,160 Speaker 1: she only has fifteen credits left to buy groceries with. 51 00:02:41,280 --> 00:02:43,400 Speaker 1: This is not enough for her to make what she 52 00:02:43,520 --> 00:02:45,840 Speaker 1: was planning for lunch and dinner, let alone enough to 53 00:02:45,880 --> 00:02:47,919 Speaker 1: save what she needs each day to pay for rent, 54 00:02:48,200 --> 00:02:51,760 Speaker 1: her virtual reality bill, or save for that Ganymede Mech holiday 55 00:02:51,800 --> 00:02:57,160 Speaker 1: she's been dreaming of. What happened to Alice in this story? 56 00:02:58,080 --> 00:03:00,640 Speaker 1: At every stage of every part of the day, she was 57 00:03:00,720 --> 00:03:04,120 Speaker 1: doing what she wanted to do in the moment. Nobody 58 00:03:04,160 --> 00:03:07,080 Speaker 1: coerced her at all, nobody forced her to do any 59 00:03:07,080 --> 00:03:10,240 Speaker 1: of that stuff, and yet everything has gone wrong. Her 60 00:03:10,280 --> 00:03:14,760 Speaker 1: financial plans for the day have been completely destroyed. What happened? 61 00:03:18,600 --> 00:03:21,440 Speaker 1: Welcome to Stuff to Blow Your Mind from HowStuffWorks 62 00:03:21,720 --> 00:03:30,480 Speaker 1: dot com. Hey, welcome to Stuff to Blow Your Mind. 63 00:03:30,560 --> 00:03:33,320 Speaker 1: My name is Robert Lamb and I'm Joe McCormick. And 64 00:03:33,560 --> 00:03:37,480 Speaker 1: what we just read was an attempt to sketch out 65 00:03:38,040 --> 00:03:40,800 Speaker 1: an illustration of an idea that we're going to be 66 00:03:40,840 --> 00:03:43,120 Speaker 1: talking about for a couple of episodes here. This is 67 00:03:43,160 --> 00:03:45,080 Speaker 1: going to be the first of a two part episode 68 00:03:45,120 --> 00:03:49,640 Speaker 1: about the attention economy, the technosphere, and the war for 69 00:03:49,720 --> 00:03:52,840 Speaker 1: your eyeballs.
And one of the core ideas that we're 70 00:03:52,840 --> 00:03:56,280 Speaker 1: gonna be talking about today is the idea that we 71 00:03:56,320 --> 00:04:01,080 Speaker 1: should start thinking about our attention, what we pay attention to, 72 00:04:01,800 --> 00:04:05,160 Speaker 1: as a finite resource that has to be invested in 73 00:04:05,240 --> 00:04:08,760 Speaker 1: things with deliberation and care in order for us to 74 00:04:08,840 --> 00:04:11,600 Speaker 1: get what we want out of life. If you think 75 00:04:11,600 --> 00:04:14,360 Speaker 1: back over the story we told about Alice and simply 76 00:04:14,520 --> 00:04:17,560 Speaker 1: take out the money, remove the credits from the scenario, 77 00:04:18,040 --> 00:04:20,480 Speaker 1: is it a day well spent? Is the story okay 78 00:04:20,520 --> 00:04:23,039 Speaker 1: if you take out all the money she spent staring 79 00:04:23,160 --> 00:04:26,080 Speaker 1: at the wreck, at the argument, at the puppies, 80 00:04:26,120 --> 00:04:31,480 Speaker 1: at the lattes? Um, it's still not quite right, is it? Yeah. 81 00:04:31,920 --> 00:04:34,640 Speaker 1: The great thing about this little story is that we 82 00:04:34,920 --> 00:04:38,240 Speaker 1: have these sort of fun dystopian sci fi elements, but 83 00:04:38,560 --> 00:04:43,040 Speaker 1: at heart we have a very dystopian idea that matches 84 00:04:43,160 --> 00:04:48,719 Speaker 1: up with our daily life, our daily struggles over attention. Yeah, exactly, 85 00:04:48,760 --> 00:04:51,120 Speaker 1: because even if you take all of the money out 86 00:04:51,160 --> 00:04:54,240 Speaker 1: of the story, she wasn't spending money, but she's spending 87 00:04:54,360 --> 00:04:57,640 Speaker 1: something that cannot be replaced, which is her time and 88 00:04:57,680 --> 00:05:02,640 Speaker 1: what she devotes her mind to. Yeah, thinking about that 89 00:05:02,680 --> 00:05:06,000 Speaker 1: purchase that she wants to make, thinking about the situation 90 00:05:06,080 --> 00:05:09,120 Speaker 1: that she's a mere bystander to. Right, it's a distraction 91 00:05:09,279 --> 00:05:12,360 Speaker 1: from whatever else it is you want to do with 92 00:05:12,440 --> 00:05:16,240 Speaker 1: your life. Because our attention is truly finite. It might 93 00:05:16,440 --> 00:05:20,800 Speaker 1: feel like an infinite and value-uncoupled resource to us, right, 94 00:05:20,800 --> 00:05:23,080 Speaker 1: because there's just so much of it for us to give, 95 00:05:23,400 --> 00:05:26,080 Speaker 1: and we squander it so much of the time, and 96 00:05:26,080 --> 00:05:28,960 Speaker 1: there's no explicit dollar value attached to it. But think 97 00:05:28,960 --> 00:05:32,120 Speaker 1: about it like this. You do work so that you 98 00:05:32,120 --> 00:05:34,680 Speaker 1: can get a paycheck that will, number one, buy you 99 00:05:34,800 --> 00:05:38,000 Speaker 1: the basic necessities of survival. But then what do you 100 00:05:38,080 --> 00:05:41,520 Speaker 1: work for on top of that? It's because you want 101 00:05:41,560 --> 00:05:44,200 Speaker 1: to have money in your paycheck that will allow you 102 00:05:44,240 --> 00:05:48,120 Speaker 1: to spend your free time devoting your attention to things 103 00:05:48,160 --> 00:05:51,920 Speaker 1: you want.
You like to spend your attention on family, 104 00:05:52,440 --> 00:05:56,600 Speaker 1: on friends, on books and movies and enjoyable meals, and 105 00:05:56,720 --> 00:06:00,479 Speaker 1: sports and the outdoors and meditation and hobbies and 106 00:06:00,560 --> 00:06:05,120 Speaker 1: personal projects, and all of that is attentional desires. 107 00:06:05,400 --> 00:06:08,200 Speaker 1: The reason we want to have free time is because 108 00:06:08,240 --> 00:06:10,720 Speaker 1: we want to be able to spend our attention on 109 00:06:10,880 --> 00:06:14,440 Speaker 1: these things. And we at least subconsciously realize that we 110 00:06:14,480 --> 00:06:18,400 Speaker 1: only have a limited amount of attention to spend. Our attention, 111 00:06:18,720 --> 00:06:22,320 Speaker 1: quite literally, is our lives. And over the next couple 112 00:06:22,360 --> 00:06:25,480 Speaker 1: of episodes, we're gonna try to offer some evidence and 113 00:06:25,520 --> 00:06:28,880 Speaker 1: make an argument that we increasingly live in a world 114 00:06:29,400 --> 00:06:35,200 Speaker 1: of incredibly powerful, highly evolved attention vampires that we have 115 00:06:35,360 --> 00:06:39,800 Speaker 1: thoroughly integrated into our daily lives and allowed to just 116 00:06:40,000 --> 00:06:43,960 Speaker 1: suck away our attention at a massive scale without us 117 00:06:44,000 --> 00:06:48,119 Speaker 1: even noticing it. Because do you ever realize that you've 118 00:06:48,160 --> 00:06:51,920 Speaker 1: been sitting there scrolling through your Facebook news feed mindlessly 119 00:06:52,040 --> 00:06:54,479 Speaker 1: for half an hour when you really meant to be 120 00:06:54,560 --> 00:06:57,920 Speaker 1: doing something else? Or do you ever suddenly snap, as 121 00:06:57,960 --> 00:07:00,479 Speaker 1: if from a trance, from watching a long string of 122 00:07:00,600 --> 00:07:04,680 Speaker 1: random YouTube videos, not videos that you went to YouTube 123 00:07:04,760 --> 00:07:08,000 Speaker 1: specifically to watch, but videos that have been served to 124 00:07:08,080 --> 00:07:12,400 Speaker 1: you as related content, compulsively clicking on the next video 125 00:07:12,800 --> 00:07:17,080 Speaker 1: or allowing it to auto play, like you're just hypnotized? Yeah. Like, 126 00:07:17,160 --> 00:07:20,240 Speaker 1: I often have this situation where I feel like if 127 00:07:20,240 --> 00:07:24,200 Speaker 1: I were a primitive human, I would be seated in, 128 00:07:24,240 --> 00:07:27,960 Speaker 1: say, a clearing, um, perhaps, you know, next to a tree, 129 00:07:28,120 --> 00:07:31,280 Speaker 1: and I would be working on some craft. Perhaps 130 00:07:31,360 --> 00:07:34,800 Speaker 1: I'm chipping away at what will be an arrowhead or 131 00:07:35,160 --> 00:07:37,720 Speaker 1: the head of a spear. And then occasionally I'll look 132 00:07:37,800 --> 00:07:40,320 Speaker 1: up from my craft and I'll look around at the 133 00:07:40,320 --> 00:07:42,480 Speaker 1: surrounding territory, you know, just to make sure that the 134 00:07:43,160 --> 00:07:46,320 Speaker 1: rival tribe is not attacking, that there are no predators 135 00:07:46,360 --> 00:07:48,239 Speaker 1: creeping up on me, and then continue with my work. 136 00:07:48,840 --> 00:07:52,400 Speaker 1: But in my day to day environment, it's not merely 137 00:07:52,640 --> 00:07:56,160 Speaker 1: the physical surroundings that I gaze up 138 00:07:56,200 --> 00:07:58,600 Speaker 1: from and look at.
It's all the social media, 138 00:07:59,400 --> 00:08:03,480 Speaker 1: um, aspects of my environment, like gazing over at Facebook, 139 00:08:03,480 --> 00:08:06,160 Speaker 1: at Twitter, at Reddit, at what have you, many 140 00:08:06,200 --> 00:08:09,000 Speaker 1: of which are tied to my job. I'm looking at 141 00:08:09,080 --> 00:08:12,680 Speaker 1: feeds for the podcast, feeds for work, and yet it 142 00:08:12,760 --> 00:08:14,760 Speaker 1: is getting in the way of my 143 00:08:14,840 --> 00:08:17,920 Speaker 1: actual job. But you often get derailed, I bet, don't 144 00:08:17,960 --> 00:08:20,480 Speaker 1: you? Like you go to Twitter or you go to 145 00:08:20,560 --> 00:08:23,440 Speaker 1: Facebook because it's something you have to do for your job, 146 00:08:23,800 --> 00:08:26,200 Speaker 1: and say, I need to check the page for X, 147 00:08:26,280 --> 00:08:29,080 Speaker 1: or I need to upload a link to Y. But 148 00:08:29,240 --> 00:08:32,560 Speaker 1: then that news feed's there, isn't it? Or some things 149 00:08:32,600 --> 00:08:34,760 Speaker 1: are scrolling in and you get kind of sucked in, 150 00:08:35,000 --> 00:08:38,400 Speaker 1: or sometimes you're not even in there for any other reason, 151 00:08:38,480 --> 00:08:42,000 Speaker 1: but you feel this kind of inexplicable tug. Do you 152 00:08:42,000 --> 00:08:46,559 Speaker 1: ever notice yourself getting derailed from important or interesting projects 153 00:08:46,880 --> 00:08:52,080 Speaker 1: by the frequent, unnecessary compulsion to check Twitter and see what 154 00:08:52,280 --> 00:08:54,400 Speaker 1: happened on it? Or do you ever have that feeling 155 00:08:54,400 --> 00:08:57,280 Speaker 1: that suddenly a whole evening is gone by and you 156 00:08:57,320 --> 00:09:01,120 Speaker 1: didn't really do anything productive or enjoyable? You just had 157 00:09:01,160 --> 00:09:06,240 Speaker 1: a long sustained session of consuming digital content without making 158 00:09:06,280 --> 00:09:10,040 Speaker 1: any conscious choices or decisions, and all you're left with 159 00:09:10,160 --> 00:09:14,319 Speaker 1: is this hollow feeling of regret. What happened to my night? Yeah. 160 00:09:14,720 --> 00:09:17,520 Speaker 1: Sometimes it's something that is nonsensical too, just a 161 00:09:17,559 --> 00:09:22,160 Speaker 1: sudden need to explore the filmography of Rod Steiger or 162 00:09:22,200 --> 00:09:23,960 Speaker 1: something like that. You know, it's just 163 00:09:24,000 --> 00:09:27,080 Speaker 1: like, I wonder what his big film was, and then 164 00:09:27,320 --> 00:09:32,160 Speaker 1: going out from there. Yeah, like the encyclopedic rabbit holes, 165 00:09:32,200 --> 00:09:35,840 Speaker 1: going from one Wikipedia link to another. That can happen too. 166 00:09:36,160 --> 00:09:37,920 Speaker 1: And this is what we're gonna be talking about for 167 00:09:37,960 --> 00:09:40,280 Speaker 1: these next couple of episodes. So I came across a 168 00:09:40,320 --> 00:09:44,000 Speaker 1: startling fact that's often cited by a technology ethicist that 169 00:09:44,040 --> 00:09:46,160 Speaker 1: we're gonna be talking about in these couple of episodes. 170 00:09:46,200 --> 00:09:48,800 Speaker 1: A guy named Tristan Harris. You might have seen him 171 00:09:48,840 --> 00:09:50,880 Speaker 1: in the news over the past couple of years because 172 00:09:50,880 --> 00:09:53,079 Speaker 1: he's been leading this new movement.
We'll be discussing him a 173 00:09:53,120 --> 00:09:56,120 Speaker 1: little bit. But Harris often cites the fact that the 174 00:09:56,200 --> 00:10:00,719 Speaker 1: average person with a smartphone checks their phone a hundred and 175 00:10:00,760 --> 00:10:03,920 Speaker 1: fifty times a day. Yeah, well, that doesn't 176 00:10:03,920 --> 00:10:07,680 Speaker 1: sound too far off. I shudder to think that 177 00:10:07,720 --> 00:10:10,120 Speaker 1: could be true. But that might be true about me. 178 00:10:10,200 --> 00:10:13,160 Speaker 1: I don't know, but I would be scared to see 179 00:10:13,200 --> 00:10:15,520 Speaker 1: the number if somebody kept track of how often I 180 00:10:15,640 --> 00:10:18,800 Speaker 1: checked my phone, because it begins to feel like coming 181 00:10:18,920 --> 00:10:21,840 Speaker 1: up for air. It's just something that you almost do 182 00:10:22,000 --> 00:10:25,240 Speaker 1: as a reflex. You know, I'm walking down the hallway, well, 183 00:10:25,360 --> 00:10:29,120 Speaker 1: nothing to do but check my social media. Yeah, if 184 00:10:29,160 --> 00:10:31,840 Speaker 1: that number is true, and if you're in that average 185 00:10:31,880 --> 00:10:34,120 Speaker 1: band and you check your phone about a hundred and 186 00:10:34,120 --> 00:10:36,840 Speaker 1: fifty times a day, think about what that does to 187 00:10:36,880 --> 00:10:39,360 Speaker 1: your day. Even if you only spend an average of 188 00:10:39,440 --> 00:10:43,480 Speaker 1: thirty seconds each time looking at your phone, that's seventy 189 00:10:43,480 --> 00:10:46,720 Speaker 1: five minutes of every day staring at your phone. But 190 00:10:46,840 --> 00:10:50,160 Speaker 1: we all know it's not just thirty seconds, right? Sometimes 191 00:10:50,200 --> 00:10:53,240 Speaker 1: it's thirty seconds, sometimes maybe a little bit less. Sometimes 192 00:10:53,320 --> 00:10:56,080 Speaker 1: you open the Facebook app or the Instagram app and 193 00:10:56,120 --> 00:10:59,400 Speaker 1: it turns into ten minutes, twenty minutes, or an hour. 194 00:11:00,240 --> 00:11:03,640 Speaker 1: Why is this happening and how is this affecting us? 195 00:11:04,160 --> 00:11:06,440 Speaker 1: One of the funny things about this is the fact 196 00:11:06,520 --> 00:11:10,200 Speaker 1: that we're using in this conversation the word phone. I 197 00:11:10,240 --> 00:11:13,360 Speaker 1: know we've had this talk before about how like one time, Robert, 198 00:11:13,400 --> 00:11:16,400 Speaker 1: you were like, it's not really a phone, is it? No, 199 00:11:16,600 --> 00:11:19,240 Speaker 1: yeah, I have gotten in the habit 200 00:11:19,240 --> 00:11:21,480 Speaker 1: of trying to refer to it as a tiny pocket 201 00:11:21,520 --> 00:11:24,520 Speaker 1: computer because it feels far more accurate than saying phone, 202 00:11:24,520 --> 00:11:28,080 Speaker 1: because phone sounds utilitarian. I have my phone in case 203 00:11:28,120 --> 00:11:30,920 Speaker 1: an important call comes through. It's a tool for getting 204 00:11:30,920 --> 00:11:33,960 Speaker 1: in touch with the hospital or something like that. But it's not.
206 00:11:34,040 --> 00:11:38,040 Speaker 1: It's a tiny pocket computer full of data streams and 207 00:11:38,240 --> 00:11:42,160 Speaker 1: games and various social media bells and whistles, and 208 00:11:42,280 --> 00:11:45,920 Speaker 1: ultimately also it's just this portal for anxiety, you know, 209 00:11:46,080 --> 00:11:49,520 Speaker 1: like all sorts of bad news can 210 00:11:49,559 --> 00:11:52,960 Speaker 1: come flying through this black little screen. And yet we 211 00:11:52,960 --> 00:11:54,840 Speaker 1: want to be next to it at all times, 212 00:11:55,280 --> 00:11:57,600 Speaker 1: you know, we need to be there when 213 00:11:57,600 --> 00:12:00,920 Speaker 1: the bad news comes calling. We have talked about 214 00:12:00,920 --> 00:12:02,920 Speaker 1: how it should be called a tiny pocket computer, and 215 00:12:02,920 --> 00:12:05,720 Speaker 1: that would be much more accurate. But I also think 216 00:12:05,720 --> 00:12:09,840 Speaker 1: people should run this experiment. Try this exercise. Anytime somebody 217 00:12:09,840 --> 00:12:12,680 Speaker 1: says a sentence about their smartphone and they're not actually 218 00:12:12,760 --> 00:12:16,000 Speaker 1: talking about like a landline telephone, if they're talking about 219 00:12:16,000 --> 00:12:19,560 Speaker 1: a smartphone, replace the word phone with the word god 220 00:12:20,080 --> 00:12:23,000 Speaker 1: and see how the sentence sounds. I actually just searched Twitter 221 00:12:23,280 --> 00:12:25,880 Speaker 1: for some "my phone" statements and turned up things like 222 00:12:26,280 --> 00:12:29,160 Speaker 1: I hate going out when my God isn't fully charged, 223 00:12:29,720 --> 00:12:33,559 Speaker 1: my God died mid Insta live, it's been a big day. 224 00:12:33,760 --> 00:12:36,200 Speaker 1: High school really is a waste of time. I just 225 00:12:36,240 --> 00:12:38,680 Speaker 1: sit here on my God all day. Now, I wonder 226 00:12:38,679 --> 00:12:40,720 Speaker 1: if we could give the god a proper name like 227 00:12:40,840 --> 00:12:43,440 Speaker 1: Phone-hovah or something. That would be kind of fun, 228 00:12:44,040 --> 00:12:48,040 Speaker 1: but no, it would more accurately represent, I mean, 229 00:12:48,080 --> 00:12:50,720 Speaker 1: the value in time and attention that we give 230 00:12:50,800 --> 00:12:54,040 Speaker 1: this device. I think it is not exaggerating at all 231 00:12:54,080 --> 00:12:56,920 Speaker 1: to say that we now have a religious level of 232 00:12:56,960 --> 00:13:00,520 Speaker 1: devotion to these personal devices that are Internet connected. We 233 00:13:00,760 --> 00:13:04,160 Speaker 1: consult them in a way that, if we 234 00:13:04,160 --> 00:13:07,439 Speaker 1: were suddenly exposed to this world, if we hadn't been 235 00:13:07,520 --> 00:13:10,160 Speaker 1: eased into it over the past, you know, ten years 236 00:13:10,240 --> 00:13:13,000 Speaker 1: or whatever, if we just popped into this future and 237 00:13:13,000 --> 00:13:15,839 Speaker 1: saw everybody checking their phones and staring at their phones 238 00:13:15,880 --> 00:13:18,000 Speaker 1: all the time, we would think this was absurd. 239 00:13:18,559 --> 00:13:21,520 Speaker 1: We would think this was a ridiculous way to live.
Yeah, 240 00:13:21,559 --> 00:13:23,640 Speaker 1: and I mean I have moments where I do snap 241 00:13:23,679 --> 00:13:26,200 Speaker 1: out of it, for a few minutes anyway, and 242 00:13:26,240 --> 00:13:29,760 Speaker 1: realize this is crazy, that we're all pouring 243 00:13:29,800 --> 00:13:33,600 Speaker 1: ourselves into these devices all the time, and 244 00:13:33,679 --> 00:13:37,040 Speaker 1: it's beyond mere convenience. And yet when people bring this 245 00:13:37,120 --> 00:13:39,680 Speaker 1: up in a satirical context, I often see people making 246 00:13:39,720 --> 00:13:42,200 Speaker 1: this point. Everybody's staring at their phones all the time. 247 00:13:42,640 --> 00:13:46,160 Speaker 1: It is treated as an indictment of the behavior of 248 00:13:46,240 --> 00:13:50,920 Speaker 1: people and not an indictment of the design of the technology. 249 00:13:50,960 --> 00:13:53,319 Speaker 1: If you notice that, it's like, oh, all these stupid 250 00:13:53,360 --> 00:13:56,679 Speaker 1: people staring at their phones all the time? Yeah, wake up, 251 00:13:56,720 --> 00:13:59,200 Speaker 1: people. And often, at times, there's sort of 252 00:13:59,240 --> 00:14:03,520 Speaker 1: this anti-youth vibe to it as well, like, yeah, 253 00:14:03,559 --> 00:14:06,679 Speaker 1: these millennials won't stop looking at their feeds. But I 254 00:14:06,679 --> 00:14:08,760 Speaker 1: think we all have seen enough examples to know that 255 00:14:09,760 --> 00:14:13,880 Speaker 1: humans of all ages are susceptible to the 256 00:14:13,960 --> 00:14:16,960 Speaker 1: siren song of the smartphone. Right. Later in this pair 257 00:14:17,000 --> 00:14:18,960 Speaker 1: of episodes, we're going to be discussing a lot of 258 00:14:18,960 --> 00:14:23,560 Speaker 1: the specific ways that our technologies, our personal devices 259 00:14:23,600 --> 00:14:29,040 Speaker 1: and the apps within them, are absolutely totally designed to 260 00:14:29,240 --> 00:14:32,840 Speaker 1: hijack our attention and make us behave this way. I 261 00:14:32,880 --> 00:14:35,680 Speaker 1: think it's not because people are stupid, it's not because 262 00:14:35,720 --> 00:14:39,800 Speaker 1: they're weak. It's how we are. We have certain vulnerabilities 263 00:14:39,880 --> 00:14:43,080 Speaker 1: in our brains, and these devices and the software within 264 00:14:43,200 --> 00:14:48,040 Speaker 1: them exploit those vulnerabilities brilliantly. Yeah, we 265 00:14:48,120 --> 00:14:54,040 Speaker 1: have crafted an ever evolving neural parasite that we all 266 00:14:54,080 --> 00:14:56,360 Speaker 1: pay top dollar for. I want to discuss a couple 267 00:14:56,360 --> 00:14:59,960 Speaker 1: of findings about the effects of, like, phones 268 00:15:00,240 --> 00:15:03,400 Speaker 1: and the interruptions they provide on our attention, and 269 00:15:03,440 --> 00:15:05,320 Speaker 1: the way this happens in our lives, before we dig 270 00:15:05,360 --> 00:15:08,520 Speaker 1: more into the meat of the topic. Robert, 271 00:15:08,560 --> 00:15:09,960 Speaker 1: I think you might have seen the study. I think 272 00:15:10,000 --> 00:15:12,320 Speaker 1: maybe we talked about it. There was research out of 273 00:15:12,400 --> 00:15:15,280 Speaker 1: UT Austin just last year that found something 274 00:15:15,320 --> 00:15:20,600 Speaker 1: pretty weird.
People who had their smartphone within reach did 275 00:15:20,760 --> 00:15:24,440 Speaker 1: worse on tests of cognitive performance than people who had 276 00:15:24,480 --> 00:15:28,560 Speaker 1: it in another room. This is even if the smartphone 277 00:15:28,920 --> 00:15:33,040 Speaker 1: was turned off. If it's visible or within an arm's length, 278 00:15:33,320 --> 00:15:36,480 Speaker 1: it was distracting enough to deduct from our effective 279 00:15:36,600 --> 00:15:41,000 Speaker 1: IQ. And I have anecdotally noticed this in my own life. 280 00:15:41,360 --> 00:15:44,240 Speaker 1: I'll be working and having trouble focusing on work. Even 281 00:15:44,320 --> 00:15:47,120 Speaker 1: if I'm not actually checking my phone, I will be 282 00:15:47,160 --> 00:15:49,720 Speaker 1: in a more distracted state of mind if I've got 283 00:15:49,760 --> 00:15:52,280 Speaker 1: my phone on me. And if I merely take my 284 00:15:52,320 --> 00:15:55,040 Speaker 1: phone to a different room of the house and shut 285 00:15:55,080 --> 00:15:57,680 Speaker 1: the door and then go back to my work, I 286 00:15:57,720 --> 00:16:01,680 Speaker 1: can focus more. I'm serious. I have found this as well, 287 00:16:01,680 --> 00:16:03,640 Speaker 1: if I can stick it in a drawer. Now, as 288 00:16:03,800 --> 00:16:05,800 Speaker 1: I am saying this, as I am speaking into 289 00:16:05,800 --> 00:16:09,240 Speaker 1: the microphone, my phone is situated over here to the side. 290 00:16:09,280 --> 00:16:11,280 Speaker 1: I have it in airplane mode so that it won't 291 00:16:11,280 --> 00:16:15,400 Speaker 1: interrupt me. But still it's there in my peripheral vision, 292 00:16:15,880 --> 00:16:18,000 Speaker 1: like kind of staring at me with its 293 00:16:18,240 --> 00:16:21,520 Speaker 1: dark screen, and there's still this sense, this 294 00:16:21,680 --> 00:16:24,440 Speaker 1: idea that it might light up at any moment, 295 00:16:24,480 --> 00:16:26,160 Speaker 1: I might get some sort of an alert, even though 296 00:16:26,200 --> 00:16:27,680 Speaker 1: it should not be able to give it to me. 297 00:16:27,760 --> 00:16:30,720 Speaker 1: Right, yeah, even though it's not doing anything to you, 298 00:16:31,560 --> 00:16:34,280 Speaker 1: it's in your consciousness at some level. 299 00:16:34,360 --> 00:16:37,920 Speaker 1: Right, in the same way that somebody's god belief becomes 300 00:16:37,920 --> 00:16:41,600 Speaker 1: a part of their conscience, the presence of your phone 301 00:16:41,720 --> 00:16:45,840 Speaker 1: becomes a part of your situational awareness. It's always lingering there. 302 00:16:45,840 --> 00:16:50,080 Speaker 1: It's always a potential, right? Yeah. And sometimes the potential 303 00:16:50,120 --> 00:16:53,000 Speaker 1: is just for mundane stuff to happen, like, oh, the 304 00:16:53,080 --> 00:16:55,800 Speaker 1: doorbell will ring at my house. That means a package 305 00:16:55,800 --> 00:16:57,800 Speaker 1: has been delivered, you know, and it's just, I don't 306 00:16:57,880 --> 00:17:01,360 Speaker 1: need that information. It's just, you know, a box 307 00:17:01,360 --> 00:17:03,960 Speaker 1: of cat food. I don't need to be there, and 308 00:17:04,040 --> 00:17:06,720 Speaker 1: no additional plans need to be made. But yet the 309 00:17:06,760 --> 00:17:10,960 Speaker 1: delivery of cat food has interrupted my daily workflow. Yeah.
310 00:17:11,000 --> 00:17:14,480 Speaker 1: The English psychologist Glenn Daniel Wilson, I've read, has shown 311 00:17:14,560 --> 00:17:18,679 Speaker 1: that the distraction threat of things in the digital realm, 312 00:17:18,800 --> 00:17:21,760 Speaker 1: so like, the example would be an unread message sitting 313 00:17:21,800 --> 00:17:24,160 Speaker 1: in your inbox, you're aware of that, and you've got 314 00:17:24,200 --> 00:17:27,199 Speaker 1: that distraction threat on your mind, it can lower our 315 00:17:27,240 --> 00:17:30,680 Speaker 1: effective IQ by about ten points. But then think 316 00:17:30,720 --> 00:17:33,880 Speaker 1: about what happens when you actually have the distractions coming in, 317 00:17:34,000 --> 00:17:37,000 Speaker 1: like if you've got notifications turned on on your phone. 318 00:17:37,640 --> 00:17:41,479 Speaker 1: Interruptions matter a lot to your ability to focus on 319 00:17:41,520 --> 00:17:45,520 Speaker 1: projects and activities you care about. When a notification interrupts you, 320 00:17:45,960 --> 00:17:49,040 Speaker 1: it's not just the time you spend looking at the 321 00:17:49,080 --> 00:17:53,400 Speaker 1: interruption that you lose. Interruptions degrade the quality of your 322 00:17:53,440 --> 00:17:57,000 Speaker 1: focus on your primary activity. Some psychologists call this, quote, 323 00:17:57,000 --> 00:18:01,359 Speaker 1: a task shifting penalty. Robert, I'm sure you've felt this before, right? 324 00:18:01,400 --> 00:18:03,640 Speaker 1: I feel this all the time. Yeah. You get into 325 00:18:03,920 --> 00:18:06,920 Speaker 1: a certain workflow and then something pops you out of it, right, 326 00:18:07,320 --> 00:18:10,600 Speaker 1: and that's when, or if, you actually get 327 00:18:10,640 --> 00:18:13,240 Speaker 1: back into the task at all. According to a two 328 00:18:13,240 --> 00:18:16,880 Speaker 1: thousand five University of California, Irvine study by Gloria Mark 329 00:18:16,920 --> 00:18:20,520 Speaker 1: and colleagues, regaining your original focus on a project or 330 00:18:20,560 --> 00:18:24,960 Speaker 1: task after an interruption takes people, on average, over twenty 331 00:18:25,000 --> 00:18:28,600 Speaker 1: five minutes. And that's if they actually resume the task 332 00:18:28,720 --> 00:18:31,520 Speaker 1: on the same day they were interrupted. Sometimes they don't 333 00:18:31,560 --> 00:18:34,520 Speaker 1: resume it for the whole day. So what happens when 334 00:18:34,560 --> 00:18:36,680 Speaker 1: you're working on something or you're focusing 335 00:18:36,720 --> 00:18:40,320 Speaker 1: on something is, if you get interrupted, it takes 336 00:18:40,359 --> 00:18:42,760 Speaker 1: you longer to get back into the zone. And that's 337 00:18:43,000 --> 00:18:45,800 Speaker 1: if you even pick the task back up, and sometimes 338 00:18:45,800 --> 00:18:48,720 Speaker 1: it will take you a while to do that. Interruptions 339 00:18:48,720 --> 00:18:52,000 Speaker 1: are more than just the content of the interruption. They 340 00:18:52,040 --> 00:18:55,680 Speaker 1: totally degrade the quality of our engagement with every goal 341 00:18:55,800 --> 00:18:58,159 Speaker 1: pursuit we care about.
You know, this reminds me of 342 00:18:58,320 --> 00:19:00,879 Speaker 1: some research I was putting in on another topic 343 00:19:01,280 --> 00:19:05,280 Speaker 1: where an individual touched on the nature of torture, and 344 00:19:05,320 --> 00:19:09,480 Speaker 1: one of the psychological aspects of torture is making the 345 00:19:09,640 --> 00:19:13,959 Speaker 1: torture recipient unsure about what is going to happen at 346 00:19:13,960 --> 00:19:17,280 Speaker 1: any given time. And this is a power that we 347 00:19:17,520 --> 00:19:20,400 Speaker 1: have handed over not only to the torturer but also 348 00:19:20,480 --> 00:19:24,600 Speaker 1: to the smartphone and various other devices in our life. Well, yeah, 349 00:19:24,640 --> 00:19:27,080 Speaker 1: we were just recently reading some stuff about pain and 350 00:19:27,119 --> 00:19:30,800 Speaker 1: the role that uncertainty and anxiety play in the 351 00:19:30,880 --> 00:19:34,680 Speaker 1: amplification of the sensation of pain. So anyway, given all 352 00:19:34,720 --> 00:19:37,840 Speaker 1: of this, I have to start wondering, in a quite 353 00:19:37,960 --> 00:19:41,000 Speaker 1: literal sense, I don't mean this as an exaggeration, 354 00:19:41,440 --> 00:19:45,679 Speaker 1: are we currently living in an attention dystopia? All right, 355 00:19:45,880 --> 00:19:48,399 Speaker 1: you think about that while we take a quick break, 356 00:19:48,800 --> 00:19:53,760 Speaker 1: and then we'll come back and try to regain focus. 357 00:19:52,119 --> 00:19:55,840 Speaker 1: Alright, we're back. Now, I think we should think 358 00:19:55,840 --> 00:19:59,080 Speaker 1: about the idea of attention. We've already discussed that attention 359 00:19:59,520 --> 00:20:03,080 Speaker 1: essentially is our lives. Attention is what we value in 360 00:20:03,119 --> 00:20:05,640 Speaker 1: our lives. When you think about the life you want 361 00:20:05,680 --> 00:20:08,159 Speaker 1: to live, what you really mean is you want to 362 00:20:08,160 --> 00:20:12,680 Speaker 1: be able to spend your attention on certain things, right? Yeah, 363 00:20:12,720 --> 00:20:15,639 Speaker 1: because I mean when you start thinking about what attention is, 364 00:20:15,680 --> 00:20:18,760 Speaker 1: it's more than just looking at something or thinking about 365 00:20:18,840 --> 00:20:22,000 Speaker 1: something momentarily. It is, uh, crucial to what 366 00:20:22,040 --> 00:20:25,000 Speaker 1: I've seen referred to as the selective directedness of our 367 00:20:25,040 --> 00:20:28,480 Speaker 1: mental lives, right, having a will for how 368 00:20:28,520 --> 00:20:30,879 Speaker 1: you use your mind. Yeah, I mean, some models 369 00:20:30,880 --> 00:20:34,000 Speaker 1: of consciousness that we've discussed on the show even center 370 00:20:34,200 --> 00:20:37,320 Speaker 1: on the manner in which humans utilize their attentive powers. 371 00:20:37,880 --> 00:20:41,600 Speaker 1: As such, there are multiple philosophical arguments for the nature 372 00:20:41,640 --> 00:20:44,640 Speaker 1: of human attention and the reasons for our limited processing 373 00:20:44,680 --> 00:20:47,720 Speaker 1: capability. Yeah, but ultimately it's kind of like the 374 00:20:47,760 --> 00:20:50,000 Speaker 1: primitive example I gave earlier.
You know, there's no doubt 375 00:20:50,040 --> 00:20:52,920 Speaker 1: that we evolved to handle various streams of stimuli at 376 00:20:52,960 --> 00:20:56,919 Speaker 1: once and to apply selective attention based on their importance. 377 00:20:57,440 --> 00:20:59,560 Speaker 1: This is why we can go to a party and 378 00:20:59,600 --> 00:21:02,280 Speaker 1: you can tune out all the other voices in the 379 00:21:02,320 --> 00:21:05,080 Speaker 1: party except for the person you're talking to, or that 380 00:21:05,160 --> 00:21:07,480 Speaker 1: you can do the exact opposite, tune out the person 381 00:21:07,720 --> 00:21:09,879 Speaker 1: you're supposed to be talking to and listen to a 382 00:21:09,920 --> 00:21:12,840 Speaker 1: more interesting conversation in the vicinity. But one thing you'll 383 00:21:12,840 --> 00:21:15,880 Speaker 1: notice if you do that is it really degrades what 384 00:21:15,920 --> 00:21:19,399 Speaker 1: you take away from any of those experiences. That's right. 385 00:21:19,440 --> 00:21:21,840 Speaker 1: So if you were trying to talk to one person 386 00:21:21,960 --> 00:21:24,480 Speaker 1: in a crowded room, you can focus on them and 387 00:21:24,520 --> 00:21:27,040 Speaker 1: pay attention to them. But if you're trying to switch 388 00:21:27,160 --> 00:21:31,200 Speaker 1: between paying attention to them and eavesdropping on somebody nearby, 389 00:21:31,320 --> 00:21:33,520 Speaker 1: you're not really going to get a very good sense 390 00:21:33,560 --> 00:21:36,320 Speaker 1: of either one, right? But still there are limits to 391 00:21:36,400 --> 00:21:38,560 Speaker 1: what you can encounter in, like, a physical real 392 00:21:38,600 --> 00:21:43,080 Speaker 1: world scenario. You know, even if the tribal meeting 393 00:21:43,080 --> 00:21:45,600 Speaker 1: is rather crowded, there's only so much you're gonna be 394 00:21:45,600 --> 00:21:47,320 Speaker 1: able to hear over the shouting and, of course, the 395 00:21:47,520 --> 00:21:52,040 Speaker 1: sacrifices to Ug, uh, you know. I mean, 396 00:21:52,480 --> 00:21:54,919 Speaker 1: there are going to be limits in place. There's the 397 00:21:54,960 --> 00:21:58,360 Speaker 1: world of multiple stimuli streams and fixed and moving objects 398 00:21:58,359 --> 00:22:00,760 Speaker 1: that we evolved to thrive in, and then there's this 399 00:22:00,840 --> 00:22:04,520 Speaker 1: world we've made, a place of overstimulation, overchoice, 400 00:22:04,800 --> 00:22:07,520 Speaker 1: and myriad voices to command our psyche. This is a 401 00:22:07,560 --> 00:22:11,199 Speaker 1: really good point because we could easily think of the 402 00:22:11,280 --> 00:22:15,600 Speaker 1: virtual world we've created, the world within our phone operating systems, 403 00:22:15,680 --> 00:22:20,200 Speaker 1: or computer operating systems, or the web, especially the modern web, 404 00:22:20,240 --> 00:22:24,480 Speaker 1: which is so highly design focused, or our social media 405 00:22:24,520 --> 00:22:25,919 Speaker 1: apps and all that. We could think of that as 406 00:22:25,960 --> 00:22:29,000 Speaker 1: an extension of our world. It's just like the world 407 00:22:29,080 --> 00:22:32,520 Speaker 1: is going into this virtual space. But I don't think 408 00:22:32,520 --> 00:22:35,640 Speaker 1: it's like that. This is not an extension of our 409 00:22:35,640 --> 00:22:38,520 Speaker 1: physical lives in the physical space.
It is a different 410 00:22:38,600 --> 00:22:42,800 Speaker 1: world with different properties, and it operates on different rules. 411 00:22:43,320 --> 00:22:45,320 Speaker 1: And the rules it operates on, as we're about to 412 00:22:45,359 --> 00:22:49,080 Speaker 1: get into, are the monopolization of your attention, primarily. That 413 00:22:49,320 --> 00:22:53,400 Speaker 1: is the ultimate goal of most of this stuff we 414 00:22:53,440 --> 00:22:57,960 Speaker 1: interact with in the information age. The attention devouring technosphere, 415 00:22:57,960 --> 00:23:01,240 Speaker 1: I would say, is not an accident. It's by design. 416 00:23:01,359 --> 00:23:06,160 Speaker 1: These apps and devices are specifically engineered to suck out 417 00:23:06,200 --> 00:23:08,439 Speaker 1: as much of our attention as they can get. And 418 00:23:08,480 --> 00:23:10,720 Speaker 1: you know this, right? You feel it. You can see that 419 00:23:10,720 --> 00:23:13,560 Speaker 1: in every clickbait headline, right, you know, or 420 00:23:13,880 --> 00:23:17,080 Speaker 1: anything like that. It's trying to get your eyeballs. People 421 00:23:17,119 --> 00:23:19,640 Speaker 1: in the industry talk about the idea of eyeballs. They 422 00:23:19,720 --> 00:23:23,320 Speaker 1: want eyeballs on things. Yeah, there is kind of 423 00:23:23,200 --> 00:23:26,320 Speaker 1: the content product, and then there is the 424 00:23:26,359 --> 00:23:30,879 Speaker 1: app product, the interface product, and all of them are saying, hey, 425 00:23:30,920 --> 00:23:33,159 Speaker 1: look at me, look at me, do this, interact with me, 426 00:23:33,320 --> 00:23:37,000 Speaker 1: use me more, read me more, consume me more, and ultimately 427 00:23:37,280 --> 00:23:40,359 Speaker 1: probably purchase more. Right, exactly. So we should ask the 428 00:23:40,440 --> 00:23:45,120 Speaker 1: question why, like why are so many highly calibrated technological 429 00:23:45,160 --> 00:23:48,239 Speaker 1: monsters vying for our attention in the first place? And 430 00:23:48,280 --> 00:23:51,440 Speaker 1: this is an interesting question. One paper we might want 431 00:23:51,480 --> 00:23:53,320 Speaker 1: to mention is that back in two thousand twelve, the 432 00:23:53,440 --> 00:23:56,800 Speaker 1: MIT economists Erik Brynjolfsson and JooHee Oh 433 00:23:57,359 --> 00:24:00,639 Speaker 1: wrote a paper called The Attention Economy. It was essentially 434 00:24:00,640 --> 00:24:04,960 Speaker 1: geared toward trying to quantify the economic value, the money 435 00:24:05,080 --> 00:24:09,280 Speaker 1: value, provided by free content and services on the Internet, 436 00:24:09,640 --> 00:24:12,800 Speaker 1: because obviously the abundance of all these free services, you know, 437 00:24:12,880 --> 00:24:15,359 Speaker 1: websites that you go to for free, apps that you 438 00:24:15,440 --> 00:24:18,520 Speaker 1: use for free, all that stuff that's offered online, is 439 00:24:18,600 --> 00:24:22,040 Speaker 1: providing value to people because they're choosing to spend their 440 00:24:22,040 --> 00:24:25,800 Speaker 1: attention on it, and that value isn't getting directly measured 441 00:24:25,800 --> 00:24:28,240 Speaker 1: in the economy the way the value of many other 442 00:24:28,280 --> 00:24:31,439 Speaker 1: consumer services would be.
Like if you want to measure 443 00:24:31,520 --> 00:24:35,320 Speaker 1: the value of a bottle of Drano, the 444 00:24:35,440 --> 00:24:37,800 Speaker 1: value is what people go to the store to spend 445 00:24:37,840 --> 00:24:41,720 Speaker 1: on it. Right. But the consumer is not spending money 446 00:24:42,280 --> 00:24:44,920 Speaker 1: on these apps and devices and stuff, but they are 447 00:24:44,960 --> 00:24:49,560 Speaker 1: spending something, something that has some kind of value that 448 00:24:49,640 --> 00:24:52,520 Speaker 1: we need a way to measure. Well, it sounds an 449 00:24:52,520 --> 00:24:55,159 Speaker 1: awful lot like engagement, to get into sort of the 450 00:24:55,160 --> 00:24:59,320 Speaker 1: business terms of websites and even podcasting, right? The 451 00:24:59,359 --> 00:25:01,880 Speaker 1: idea that you want user engagement. Right, 452 00:25:02,200 --> 00:25:04,600 Speaker 1: and the user, or the consumer. One of 453 00:25:04,640 --> 00:25:06,639 Speaker 1: the things we've got to understand is that the user, 454 00:25:06,760 --> 00:25:11,080 Speaker 1: the consumer of these information services, these free internet services 455 00:25:11,119 --> 00:25:15,400 Speaker 1: like Facebook or any digital media, even what we do, 456 00:25:15,920 --> 00:25:19,400 Speaker 1: the user is not really the customer because they're not 457 00:25:19,720 --> 00:25:24,600 Speaker 1: paying money. The customer is going to be the advertiser, 458 00:25:25,200 --> 00:25:28,960 Speaker 1: and the user is the product. Essentially, you cultivate a 459 00:25:29,000 --> 00:25:33,280 Speaker 1: product by getting eyeballs, and then you sell access to 460 00:25:33,320 --> 00:25:37,480 Speaker 1: those eyeballs that you gather to an advertiser. Anyway, 461 00:25:37,520 --> 00:25:40,680 Speaker 1: that paper I mentioned earlier, those two economists estimated that 462 00:25:40,960 --> 00:25:43,879 Speaker 1: at the time, this was back in two thousand twelve, quote, 463 00:25:43,880 --> 00:25:47,960 Speaker 1: the increase in consumer surplus created by free internet services 464 00:25:48,080 --> 00:25:51,840 Speaker 1: was, quote, over one hundred billion dollars per year in 465 00:25:51,880 --> 00:25:55,760 Speaker 1: the US alone. There's an enormous amount of perceived value 466 00:25:55,840 --> 00:26:00,199 Speaker 1: or wealth being offered in these free internet services. But 467 00:26:00,560 --> 00:26:04,240 Speaker 1: they exist mainly because we live at a 468 00:26:04,280 --> 00:26:07,760 Speaker 1: time when your attention alone is worth money to somebody else. 469 00:26:08,080 --> 00:26:10,720 Speaker 1: So you mentioned the idea of ad supported media. Most 470 00:26:10,800 --> 00:26:13,639 Speaker 1: of the internet and online services that 471 00:26:13,720 --> 00:26:16,399 Speaker 1: you use are offered to you for free, and it 472 00:26:16,440 --> 00:26:19,119 Speaker 1: doesn't have to be this way, right? We could have 473 00:26:19,160 --> 00:26:22,760 Speaker 1: a tech economy where everything that you got over the 474 00:26:22,800 --> 00:26:27,240 Speaker 1: Internet or through your devices had a fee to access, right? 475 00:26:27,640 --> 00:26:30,239 Speaker 1: Do you ever think about why it's not that way? Well, 476 00:26:30,280 --> 00:26:32,199 Speaker 1: I mean, both.
Part of it is that we have 477 00:26:32,359 --> 00:26:34,800 Speaker 1: grown accustomed to getting things for free on 478 00:26:34,840 --> 00:26:38,320 Speaker 1: the Internet, and that applies to everything from newspaper articles 479 00:26:38,400 --> 00:26:41,040 Speaker 1: to music, and that's been 480 00:26:41,040 --> 00:26:44,320 Speaker 1: one of the hurdles for online business, to try 481 00:26:44,320 --> 00:26:47,320 Speaker 1: and figure out how to get people to pay for things again. 482 00:26:47,400 --> 00:26:51,320 Speaker 1: Do you think free music sharing through piracy and Napster 483 00:26:51,440 --> 00:26:53,320 Speaker 1: and all that in the early days of the Internet 484 00:26:54,280 --> 00:26:58,959 Speaker 1: helped shape the free ad supported information content of the 485 00:26:59,000 --> 00:27:01,800 Speaker 1: Internet today? Like people got trained on the idea that 486 00:27:02,440 --> 00:27:05,560 Speaker 1: it wasn't really stealing, because, you know, I can't remember 487 00:27:05,560 --> 00:27:07,439 Speaker 1: who this is, but there was once a comedian I 488 00:27:07,520 --> 00:27:10,840 Speaker 1: heard doing a comedy act about those old piracy commercials 489 00:27:10,840 --> 00:27:13,840 Speaker 1: that are like, you wouldn't steal a car, so 490 00:27:13,960 --> 00:27:16,919 Speaker 1: why would you steal music? But the comedian said, you know, 491 00:27:17,720 --> 00:27:20,080 Speaker 1: I would steal a car if all I had to 492 00:27:20,119 --> 00:27:23,439 Speaker 1: do was touch the car, and then an instant copy 493 00:27:23,520 --> 00:27:26,000 Speaker 1: of the car would be created and I could have 494 00:27:26,080 --> 00:27:29,679 Speaker 1: it and the original person could keep their car. And 495 00:27:29,880 --> 00:27:32,280 Speaker 1: you know, it created the sense, because it was digital 496 00:27:32,320 --> 00:27:35,840 Speaker 1: information that could be copied without consequence, that you weren't 497 00:27:35,840 --> 00:27:39,520 Speaker 1: really taking from anybody. You're just getting a copy of 498 00:27:39,560 --> 00:27:41,920 Speaker 1: a thing. Yeah, it's out there, and all I am 499 00:27:41,960 --> 00:27:44,560 Speaker 1: doing is breathing it in as if it were air. Yeah. 500 00:27:44,760 --> 00:27:47,800 Speaker 1: And so it, like, trained people on this idea that, 501 00:27:48,040 --> 00:27:51,320 Speaker 1: you know, information should be free, and as has been said, 502 00:27:51,359 --> 00:27:54,760 Speaker 1: information wants to be free. But it's led to this 503 00:27:54,800 --> 00:27:59,240 Speaker 1: world where, okay, to produce information, you have to spend money, right? 504 00:27:59,560 --> 00:28:02,199 Speaker 1: Like anything that is made on the internet. If you 505 00:28:02,200 --> 00:28:04,000 Speaker 1: want to run a website, if you want to write 506 00:28:04,000 --> 00:28:06,320 Speaker 1: an article, if you want to create a podcast, if 507 00:28:06,320 --> 00:28:08,480 Speaker 1: you want to create a video, almost none of this 508 00:28:08,560 --> 00:28:11,040 Speaker 1: can be done for free. You have to invest in it. 509 00:28:11,320 --> 00:28:13,560 Speaker 1: And so that's got to be paid for somehow. And 510 00:28:13,600 --> 00:28:16,840 Speaker 1: so the way everything works, including us here, is that 511 00:28:16,880 --> 00:28:19,760 Speaker 1: you pay for things by running ads.
That's right. I mean, 512 00:28:19,800 --> 00:28:22,240 Speaker 1: and that's not even getting into the idea that people 513 00:28:22,240 --> 00:28:25,040 Speaker 1: are putting their time into creating things, and that time 514 00:28:25,040 --> 00:28:27,800 Speaker 1: has to come out of your life, and 515 00:28:27,880 --> 00:28:31,760 Speaker 1: the individual creator has bills to pay and 516 00:28:32,080 --> 00:28:35,600 Speaker 1: a roof to maintain, and needs to have food to eat, right? 517 00:28:35,640 --> 00:28:39,280 Speaker 1: And so the way technology companies and digital media companies 518 00:28:39,320 --> 00:28:42,160 Speaker 1: can pay for themselves is to run advertisements. And the 519 00:28:42,200 --> 00:28:44,880 Speaker 1: way they can increase the amount of money they're making, 520 00:28:45,440 --> 00:28:48,400 Speaker 1: uh, and that they can charge advertisers, is basically twofold, 521 00:28:48,480 --> 00:28:52,640 Speaker 1: I think. One is simply by selling more exposure to 522 00:28:52,640 --> 00:28:56,239 Speaker 1: your eyeballs, to the user's eyeballs. The more attention you 523 00:28:56,240 --> 00:28:59,720 Speaker 1: spend on a platform, the more advertising that platform can 524 00:28:59,720 --> 00:29:01,840 Speaker 1: show you. So if you go to Facebook and they 525 00:29:01,880 --> 00:29:04,520 Speaker 1: get you to stay on Facebook twice as long, they 526 00:29:04,520 --> 00:29:07,520 Speaker 1: can show you twice as many ads, so they can 527 00:29:07,600 --> 00:29:11,840 Speaker 1: charge the advertisers that much more. You increase the supply 528 00:29:11,960 --> 00:29:14,320 Speaker 1: of attention that can be sold to the customer, which 529 00:29:14,360 --> 00:29:17,680 Speaker 1: is the advertiser. Or the other main way media companies 530 00:29:17,720 --> 00:29:20,720 Speaker 1: can make money off your attention is by gathering data 531 00:29:20,840 --> 00:29:24,760 Speaker 1: about you. The more you use a social media site 532 00:29:24,800 --> 00:29:27,440 Speaker 1: like Facebook or whatever, the more 533 00:29:27,520 --> 00:29:30,280 Speaker 1: that social media site knows about who you are and 534 00:29:30,320 --> 00:29:32,680 Speaker 1: where you live, and what you're interested in and what 535 00:29:32,800 --> 00:29:35,280 Speaker 1: you'll spend money on and so forth. And the more 536 00:29:35,320 --> 00:29:37,640 Speaker 1: it knows about who you are, the better they know 537 00:29:37,720 --> 00:29:40,000 Speaker 1: how to sell you stuff. You might have noticed that 538 00:29:40,320 --> 00:29:42,239 Speaker 1: in the early days of the Internet when you were 539 00:29:42,240 --> 00:29:46,040 Speaker 1: getting targeted ads, they were bad, you know. I remember the 540 00:29:46,120 --> 00:29:48,440 Speaker 1: days when I was getting targeted ads on Facebook that 541 00:29:48,440 --> 00:29:52,000 Speaker 1: were for ridiculous things I would never want. But over time, 542 00:29:52,080 --> 00:29:55,040 Speaker 1: targeted advertising has gotten a lot better, hasn't it? Oh, 543 00:29:55,080 --> 00:29:57,280 Speaker 1: it's gotten really really good. It's like to the point 544 00:29:57,280 --> 00:30:00,080 Speaker 1: where you look up an item on Amazon, 545 00:30:00,200 --> 00:30:03,000 Speaker 1: you're thinking about getting it.
For instance, recently 546 00:30:03,040 --> 00:30:06,360 Speaker 1: I looked at the blu ray of the movie Screamers, 547 00:30:06,400 --> 00:30:08,920 Speaker 1: a wonderful fish man movie. Which one, the one from the nineties 548 00:30:09,000 --> 00:30:11,280 Speaker 1: with Peter Weller, or the... No, that's a great film too, 549 00:30:11,280 --> 00:30:15,040 Speaker 1: but this is the fish man movie, the Italian one. Yeah, 550 00:30:15,080 --> 00:30:17,560 Speaker 1: the Italian one. And I wasn't actually gonna buy it, you know, 551 00:30:17,560 --> 00:30:18,920 Speaker 1: but I looked it up. I was curious to see 552 00:30:18,920 --> 00:30:21,400 Speaker 1: if the blu ray was available. And 553 00:30:21,440 --> 00:30:24,520 Speaker 1: then it seems like every day for the following week, 554 00:30:24,840 --> 00:30:26,880 Speaker 1: when I first went onto Facebook, I'd be hit with 555 00:30:26,880 --> 00:30:30,360 Speaker 1: that Amazon targeted ad for that very blu ray disc. 556 00:30:30,560 --> 00:30:33,160 Speaker 1: Now that's the easy part. You've already looked that up. 557 00:30:33,200 --> 00:30:35,920 Speaker 1: They've sold that data to somebody else, or they're using 558 00:30:35,920 --> 00:30:39,080 Speaker 1: that data to run targeted ads for you on another platform. 559 00:30:39,120 --> 00:30:42,400 Speaker 1: That's pretty straightforward. It gets creepier though. What 560 00:30:42,600 --> 00:30:45,120 Speaker 1: about when they show you ads for things that you've 561 00:30:45,160 --> 00:30:47,480 Speaker 1: never looked up as far as you can recall, but 562 00:30:47,520 --> 00:30:50,560 Speaker 1: they really are in your wheelhouse? Yeah, yeah, I'm into that. 563 00:30:50,760 --> 00:30:53,440 Speaker 1: Like it's another fish man movie that I'm not familiar with, 564 00:30:53,600 --> 00:30:56,640 Speaker 1: or it's a fish man comic, or it's an 565 00:30:56,720 --> 00:31:00,920 Speaker 1: all Gill-man, you know, doom band, and I'm like, oh, 566 00:31:00,960 --> 00:31:02,840 Speaker 1: I had no idea, but that sounds like the kind 567 00:31:02,840 --> 00:31:04,480 Speaker 1: of thing I would be into, right. So this is another 568 00:31:04,520 --> 00:31:07,680 Speaker 1: reason the providers of these technology platforms want your attention. 569 00:31:07,840 --> 00:31:11,280 Speaker 1: It's because the data is valuable to them directly. The 570 00:31:11,320 --> 00:31:14,200 Speaker 1: more you use the platform, the more attention you give them, 571 00:31:14,240 --> 00:31:17,480 Speaker 1: the more data about you you give them, and the 572 00:31:17,520 --> 00:31:20,000 Speaker 1: more they can target you in the future, the more 573 00:31:20,000 --> 00:31:23,040 Speaker 1: they learn about how to laser in on exactly what 574 00:31:23,080 --> 00:31:26,160 Speaker 1: you'll click on and buy. So it's valuable 575 00:31:26,200 --> 00:31:28,760 Speaker 1: for that reason. The data is also 576 00:31:28,880 --> 00:31:31,000 Speaker 1: valuable as a commodity that can be sold to third 577 00:31:31,040 --> 00:31:33,600 Speaker 1: parties that want to know things about you in order 578 00:31:33,640 --> 00:31:36,959 Speaker 1: to target sales or advertising. And actually, I'd put in 579 00:31:37,120 --> 00:31:41,040 Speaker 1: a third motivation for capturing more attention, and Robert, I 580 00:31:41,080 --> 00:31:43,040 Speaker 1: wonder what you think about this.
This doesn't get cited 581 00:31:43,080 --> 00:31:46,440 Speaker 1: as much, but I would argue a third major motivation 582 00:31:46,520 --> 00:31:50,320 Speaker 1: for commanding your attention on a platform is habit formation. 583 00:31:51,000 --> 00:31:55,320 Speaker 1: What do we usually reflexively do with our attention when 584 00:31:55,320 --> 00:31:58,400 Speaker 1: we've got free time? We do what we've done before, right? 585 00:31:58,480 --> 00:32:00,600 Speaker 1: That's right. We fall into the habit of going 586 00:32:00,640 --> 00:32:03,880 Speaker 1: on Facebook, or for me, one of the worst is 587 00:32:03,920 --> 00:32:07,160 Speaker 1: just going into my email, not to actually read 588 00:32:07,200 --> 00:32:10,760 Speaker 1: meaningful emails, but to clear out garbage, to clear out 589 00:32:10,960 --> 00:32:15,160 Speaker 1: not even necessarily spam, but the sort of promotional emails 590 00:32:15,200 --> 00:32:17,320 Speaker 1: that at some point I signed 591 00:32:17,360 --> 00:32:19,480 Speaker 1: up for, and that for some reason I'm still like, 592 00:32:19,520 --> 00:32:22,520 Speaker 1: I don't want to actually unsubscribe, but I'm also not 593 00:32:22,560 --> 00:32:24,960 Speaker 1: going to read them, but they just keep coming. I'm 594 00:32:25,000 --> 00:32:28,840 Speaker 1: just sort of maintaining this weed garden of crap, 595 00:32:29,800 --> 00:32:32,200 Speaker 1: and it's taking away from my day. And that's a 596 00:32:32,280 --> 00:32:35,680 Speaker 1: habit you formed. We are creatures of habit. We mostly 597 00:32:35,760 --> 00:32:38,600 Speaker 1: do what we've trained ourselves to do in the past. 598 00:32:38,720 --> 00:32:42,520 Speaker 1: Most of our behaviors are not novel. Most of the time, 599 00:32:42,600 --> 00:32:45,080 Speaker 1: we tend to shop at the stores we've shopped at before. 600 00:32:45,400 --> 00:32:47,200 Speaker 1: Most of the time, we go to the restaurants we've 601 00:32:47,240 --> 00:32:49,720 Speaker 1: gone to before, and we do the same thing in 602 00:32:49,720 --> 00:32:53,440 Speaker 1: our digital spaces. Every time you make a decision about 603 00:32:53,480 --> 00:32:56,880 Speaker 1: what to do with your attention, you are training your 604 00:32:56,960 --> 00:33:00,640 Speaker 1: brain to make similar decisions or the same decision in 605 00:33:00,680 --> 00:33:03,440 Speaker 1: the future, which is kind of a daunting thought, isn't it? 606 00:33:03,680 --> 00:33:06,760 Speaker 1: I think that's absolutely true. But you need to think 607 00:33:06,760 --> 00:33:08,760 Speaker 1: about the fact that every time you act, even in 608 00:33:08,840 --> 00:33:12,760 Speaker 1: trivial activities throughout the day, you are altering the source 609 00:33:12,840 --> 00:33:17,120 Speaker 1: code of your own brain to encourage your future self 610 00:33:17,160 --> 00:33:21,600 Speaker 1: to behave more like you're behaving right now. And of 611 00:33:21,640 --> 00:33:24,240 Speaker 1: course all this is even more nefarious when you realize 612 00:33:24,240 --> 00:33:26,400 Speaker 1: that not everyone is selling a product. A lot of 613 00:33:26,400 --> 00:33:30,160 Speaker 1: people are selling an idea, or 614 00:33:30,240 --> 00:33:33,640 Speaker 1: some sort of vision of reality, a political ideal, a 615 00:33:33,800 --> 00:33:38,400 Speaker 1: social ideal. They are attempting to tell you how to 616 00:33:38,480 --> 00:33:41,280 Speaker 1: feel and how to think.
But that all depends, of course, 617 00:33:41,440 --> 00:33:45,000 Speaker 1: on continued access to that consumer of information. Right. So, 618 00:33:45,120 --> 00:33:48,200 Speaker 1: if you are a platform like Facebook, Twitter, Instagram, any 619 00:33:48,240 --> 00:33:51,560 Speaker 1: of that, and you want people's attention, it's important 620 00:33:51,600 --> 00:33:54,920 Speaker 1: to help people form patterns of habit in which they 621 00:33:55,080 --> 00:33:57,960 Speaker 1: learn to give that attention to you. They'll be more 622 00:33:58,000 --> 00:33:59,880 Speaker 1: likely to keep giving it to you in the future 623 00:33:59,880 --> 00:34:03,640 Speaker 1: because it's become familiar and easy and habitual. It's what 624 00:34:03,680 --> 00:34:06,560 Speaker 1: they've done before, so it's what they'll do again. Now, 625 00:34:06,600 --> 00:34:08,560 Speaker 1: I want to come back and say that we want 626 00:34:08,560 --> 00:34:11,239 Speaker 1: to be clear. I think that not every bid for 627 00:34:11,440 --> 00:34:15,040 Speaker 1: your attention is bad or evil. Right. There are lots 628 00:34:15,040 --> 00:34:17,320 Speaker 1: of things we do with our attention that we consider 629 00:34:17,440 --> 00:34:21,120 Speaker 1: good uses of our attention. And there's lots of, for example, 630 00:34:21,200 --> 00:34:25,120 Speaker 1: ad-supported media that obviously we think is worthwhile and 631 00:34:25,120 --> 00:34:28,439 Speaker 1: worth having. I mean, I listen to podcasts that run 632 00:34:28,520 --> 00:34:30,920 Speaker 1: ads in order to pay the bills and keep producing 633 00:34:30,960 --> 00:34:33,439 Speaker 1: the podcast. I'm fine with that if I care about 634 00:34:33,480 --> 00:34:36,719 Speaker 1: the podcast and want to make that decision deliberately with 635 00:34:36,760 --> 00:34:40,200 Speaker 1: my time. Right, yeah. There's nothing evil about that, right? 636 00:34:40,239 --> 00:34:42,160 Speaker 1: I mean, we've recognized that's part of the 637 00:34:42,200 --> 00:34:45,560 Speaker 1: economics of how the content we like and care about 638 00:34:45,640 --> 00:34:49,319 Speaker 1: is produced. What becomes more nefarious is when we find 639 00:34:49,320 --> 00:34:56,080 Speaker 1: ourselves using technological services and consuming content compulsively or impulsively, 640 00:34:56,280 --> 00:35:00,560 Speaker 1: reflexively, without really making a decision about something we want 641 00:35:00,600 --> 00:35:03,839 Speaker 1: to consume or that we care about. Right, yeah. Yeah, 642 00:35:03,880 --> 00:35:07,319 Speaker 1: where it's just this click, click, click, 643 00:35:07,800 --> 00:35:10,600 Speaker 1: and it becomes a situation where you're not 644 00:35:10,640 --> 00:35:16,480 Speaker 1: even really absorbing information. You're just dipping into the information 645 00:35:16,600 --> 00:35:20,680 Speaker 1: stream. Exactly. Yeah, you are becoming a passive receiver of 646 00:35:20,760 --> 00:35:23,360 Speaker 1: what is being fed to you by a platform that 647 00:35:23,440 --> 00:35:28,399 Speaker 1: you have habitually formed an information consumption relationship with. You're 648 00:35:28,480 --> 00:35:32,319 Speaker 1: on Facebook, and it's just streaming into you. Yeah, and 649 00:35:32,320 --> 00:35:34,920 Speaker 1: there's this informational context to it, 650 00:35:34,960 --> 00:35:37,640 Speaker 1: but there's also a tactile context to it as well.
651 00:35:38,000 --> 00:35:40,920 Speaker 1: You know, the swipe of fingers over the screen, 652 00:35:41,040 --> 00:35:44,600 Speaker 1: and then the acknowledgement that you have clicked something, 653 00:35:44,920 --> 00:35:47,239 Speaker 1: you've checked something off of this list, in a way. 654 00:35:47,440 --> 00:35:49,560 Speaker 1: I think we should take a quick break and pay 655 00:35:49,560 --> 00:35:52,200 Speaker 1: the bills with some advertising, and then we should come 656 00:35:52,200 --> 00:35:54,719 Speaker 1: back and look at some actual data on the way 657 00:35:54,760 --> 00:35:57,759 Speaker 1: people use their attention on digital devices and how they 658 00:35:57,760 --> 00:36:02,680 Speaker 1: feel about that. Alright, we're back. Now, you might 659 00:36:02,719 --> 00:36:05,360 Speaker 1: have seen this idea of like the distractions and the 660 00:36:05,400 --> 00:36:07,920 Speaker 1: attention war come up in the news over 661 00:36:07,960 --> 00:36:11,160 Speaker 1: the past year or two with respect to technology, apps 662 00:36:11,160 --> 00:36:13,359 Speaker 1: and social media. And one reason you might have seen 663 00:36:13,440 --> 00:36:16,120 Speaker 1: something about this is that word's been spreading about a 664 00:36:16,120 --> 00:36:21,160 Speaker 1: particular movement and nonprofit organization called Time Well Spent, which 665 00:36:21,239 --> 00:36:23,760 Speaker 1: was founded by a guy I mentioned earlier, a former 666 00:36:23,800 --> 00:36:27,240 Speaker 1: Google employee named Tristan Harris, and a number of other people. 667 00:36:27,800 --> 00:36:30,560 Speaker 1: And these people were not the first to point out 668 00:36:30,600 --> 00:36:33,120 Speaker 1: that the media are controlling the economy of our attention 669 00:36:33,120 --> 00:36:36,680 Speaker 1: in negative ways, but having worked in the modern tech sector, 670 00:36:37,040 --> 00:36:40,480 Speaker 1: they've got, I think, a more specific message about the way, 671 00:36:40,520 --> 00:36:43,560 Speaker 1: for example, your phone and the apps on it are 672 00:36:43,560 --> 00:36:48,160 Speaker 1: not just suboptimal, but possibly really driving your life 673 00:36:48,239 --> 00:36:52,160 Speaker 1: and the culture in general into an extremely unhealthy space 674 00:36:52,520 --> 00:36:55,319 Speaker 1: that runs contrary to our goals and what we want 675 00:36:55,400 --> 00:36:57,400 Speaker 1: our lives to be. And so they've been doing a 676 00:36:57,440 --> 00:37:00,239 Speaker 1: lot of media interviews recently, in the past year or so, 677 00:37:00,880 --> 00:37:02,560 Speaker 1: and so I think maybe we should take a look 678 00:37:02,600 --> 00:37:06,040 Speaker 1: at some Time Well Spent information and data. 679 00:37:06,440 --> 00:37:09,720 Speaker 1: That sounds good, Joe. This sounds like time well spent. Alright, 680 00:37:09,760 --> 00:37:12,720 Speaker 1: so how good are we really at spending our attention 681 00:37:12,760 --> 00:37:14,480 Speaker 1: in ways that make us happy and give us a 682 00:37:14,520 --> 00:37:17,520 Speaker 1: sense of fulfillment? Maybe you're out there listening and you're like, nah, 683 00:37:17,680 --> 00:37:20,000 Speaker 1: I don't buy it. I use my attention exactly how 684 00:37:20,040 --> 00:37:22,719 Speaker 1: I want. You know, when I use Facebook, it's just 685 00:37:23,120 --> 00:37:25,160 Speaker 1: because I want to do that. I'm living my 686 00:37:25,200 --> 00:37:27,440 Speaker 1: own life.
Yeah, or, you know, you sort 687 00:37:27,480 --> 00:37:29,360 Speaker 1: of qualify the kind of content: well, I use it 688 00:37:29,360 --> 00:37:31,320 Speaker 1: to keep in touch with my friends, my real friends 689 00:37:31,320 --> 00:37:34,920 Speaker 1: and my real family, and it's about maintaining that social network. 690 00:37:34,920 --> 00:37:37,439 Speaker 1: And there's some truth to an argument like that. Oh sure, yeah, 691 00:37:37,480 --> 00:37:40,520 Speaker 1: we don't want to totally demonize any particular app. I mean, 692 00:37:40,560 --> 00:37:42,960 Speaker 1: we've been saying a lot of negative stuff about Facebook 693 00:37:42,960 --> 00:37:46,600 Speaker 1: and related apps, but there's no reason to totally demonize 694 00:37:46,640 --> 00:37:49,600 Speaker 1: an app to its core. But we should be aware 695 00:37:49,800 --> 00:37:52,680 Speaker 1: of what its capabilities are and how we're interacting with 696 00:37:52,719 --> 00:37:55,680 Speaker 1: it on average. So there's an iOS app called Moment, 697 00:37:55,920 --> 00:37:59,680 Speaker 1: which is a device usage tracker, and what it does 698 00:37:59,760 --> 00:38:02,040 Speaker 1: is it logs how much time you spend using your 699 00:38:02,040 --> 00:38:05,080 Speaker 1: iPhone or your iPad and which apps you spend the 700 00:38:05,160 --> 00:38:07,920 Speaker 1: most of your time on. And so the Moment app 701 00:38:07,960 --> 00:38:10,919 Speaker 1: has partnered with Time Well Spent, which I just mentioned 702 00:38:11,000 --> 00:38:14,120 Speaker 1: before, and the associated nonprofit, the Center for Humane 703 00:38:14,160 --> 00:38:18,080 Speaker 1: Technology, to gather data about how much time people spend 704 00:38:18,120 --> 00:38:21,200 Speaker 1: on various apps and how they feel about spending their 705 00:38:21,200 --> 00:38:24,120 Speaker 1: time that way. So to be clear, this is something 706 00:38:24,200 --> 00:38:27,359 Speaker 1: where the user can see how they're spending their time, right? 707 00:38:27,600 --> 00:38:29,959 Speaker 1: So the app says, hey, did you know you spent 708 00:38:30,239 --> 00:38:32,920 Speaker 1: X number of minutes on this app, this 709 00:38:33,120 --> 00:38:36,960 Speaker 1: many times a day, and it shows you your usage. 710 00:38:37,120 --> 00:38:39,160 Speaker 1: I like this idea because it reminds me of how, 711 00:38:39,200 --> 00:38:41,000 Speaker 1: with a lot of the modern video games we have, 712 00:38:41,120 --> 00:38:44,000 Speaker 1: they'll tell you how many hours you've sunk into the game, 713 00:38:44,120 --> 00:38:47,040 Speaker 1: like, you've spent twenty-seven days playing. Yeah. It's 714 00:38:47,080 --> 00:38:50,600 Speaker 1: generally horrifying to read, because it's the sobering moment 715 00:38:50,640 --> 00:38:53,200 Speaker 1: where you're like, oh my god, I spent that much 716 00:38:53,239 --> 00:38:56,600 Speaker 1: time playing this video game, or any video game, or 717 00:38:56,640 --> 00:39:00,000 Speaker 1: anything that isn't, you know, of really 718 00:39:00,000 --> 00:39:03,200 Speaker 1: intrinsic value in my life. Yeah. So, there was an 719 00:39:03,239 --> 00:39:06,080 Speaker 1: initial study conducted in a pool of two hundred thousand 720 00:39:06,080 --> 00:39:09,799 Speaker 1: iPhone users through this Moment and Time Well 721 00:39:09,840 --> 00:39:12,239 Speaker 1: Spent partnership.
And when you look at the results, there 722 00:39:12,320 --> 00:39:17,279 Speaker 1: is an immediately apparent and overwhelming correlation between apps that 723 00:39:17,360 --> 00:39:21,520 Speaker 1: we spend more time with and a feeling of unhappiness 724 00:39:21,600 --> 00:39:24,319 Speaker 1: with spending our time in those apps. Now, you might 725 00:39:24,400 --> 00:39:26,360 Speaker 1: be able to explain this correlation in a number of 726 00:39:26,360 --> 00:39:29,320 Speaker 1: different ways. We'll get to that after we discuss the data, 727 00:39:29,360 --> 00:39:33,799 Speaker 1: but first let's just look at the results. Apps like Facebook, Snapchat, 728 00:39:34,000 --> 00:39:37,439 Speaker 1: and Tweetbot, people tended to spend roughly an hour 729 00:39:37,760 --> 00:39:41,280 Speaker 1: or more in each day, and the majority of users 730 00:39:41,320 --> 00:39:44,799 Speaker 1: reported that they were unhappy having spent the time in 731 00:39:44,840 --> 00:39:48,480 Speaker 1: the app. For example, with Facebook, the average daily use 732 00:39:48,600 --> 00:39:53,400 Speaker 1: was fifty-nine minutes, and about sixty percent of users, or 733 00:39:53,440 --> 00:39:56,839 Speaker 1: roughly two thirds, reported that they were unhappy that they 734 00:39:56,840 --> 00:39:59,080 Speaker 1: had spent that amount of time in 735 00:39:59,160 --> 00:40:01,920 Speaker 1: the Facebook app. And the apps that made people the 736 00:40:02,000 --> 00:40:06,320 Speaker 1: most unhappy seemed to be things like social media, dating apps, 737 00:40:06,440 --> 00:40:09,719 Speaker 1: and games like Candy Crush. Meanwhile, let's look at the 738 00:40:09,719 --> 00:40:12,040 Speaker 1: opposite end of the spectrum: which apps did people claim 739 00:40:12,080 --> 00:40:14,960 Speaker 1: to be the most happy with the time they spent using? 740 00:40:15,520 --> 00:40:18,920 Speaker 1: The list is more diverse here. It's more populated by 741 00:40:19,040 --> 00:40:22,839 Speaker 1: things like books and podcasts. It makes me feel good 742 00:40:22,840 --> 00:40:28,280 Speaker 1: about the podcast space. But books and podcasts, music, weather apps, 743 00:40:28,640 --> 00:40:35,200 Speaker 1: navigation apps, fitness, meditation, and calm apps. So the general trend 744 00:40:35,239 --> 00:40:37,279 Speaker 1: here is that I think these are things related to 745 00:40:37,400 --> 00:40:42,120 Speaker 1: intellectual stimulation, like books and podcasts and even music, 746 00:40:42,800 --> 00:40:46,480 Speaker 1: personal improvement, like, you know, wellness and all that, and 747 00:40:46,520 --> 00:40:49,480 Speaker 1: functional utility tools that are useful in your life, like 748 00:40:49,560 --> 00:40:52,719 Speaker 1: navigation or weather. Right, or the flashlight app, let's not 749 00:40:52,760 --> 00:40:55,319 Speaker 1: forget that, I use it all the ding dang time. I'm sure 750 00:40:55,360 --> 00:40:57,560 Speaker 1: people are very, very happy with their use of the 751 00:40:57,560 --> 00:41:01,400 Speaker 1: flashlight app. And yet people reported spending much less time 752 00:41:01,480 --> 00:41:05,400 Speaker 1: engaging directly with these apps that made them happy.
For example, 753 00:41:05,480 --> 00:41:09,360 Speaker 1: people were happy with the time they spent with Audible, 754 00:41:09,760 --> 00:41:12,640 Speaker 1: the audiobooks app, but they only spent an average 755 00:41:12,680 --> 00:41:14,920 Speaker 1: of about eight minutes a day with it, which is 756 00:41:14,960 --> 00:41:18,000 Speaker 1: a lot less than people spent in Facebook and Candy Crush, 757 00:41:18,280 --> 00:41:20,279 Speaker 1: especially given how long some of the audiobooks are. Are 758 00:41:20,320 --> 00:41:22,680 Speaker 1: you planning to finish that thing anytime soon? I don't know. 759 00:41:22,760 --> 00:41:24,960 Speaker 1: People spent a lot less time, but they felt good 760 00:41:25,000 --> 00:41:29,360 Speaker 1: about the time they spent. They were happy about the 761 00:41:29,400 --> 00:41:32,040 Speaker 1: time they spent with podcasts, but spent only an average 762 00:41:32,080 --> 00:41:35,000 Speaker 1: of about eight minutes a day there as well. Again, 763 00:41:35,280 --> 00:41:37,680 Speaker 1: so these episodes seem to be about an hour long, 764 00:41:37,960 --> 00:41:40,080 Speaker 1: with two coming out a week. You gotta 765 00:41:40,120 --> 00:41:41,959 Speaker 1: crack the whip a little bit more. Well, you gotta 766 00:41:41,960 --> 00:41:44,319 Speaker 1: think about it, these are averages, of course. So, you know, 767 00:41:44,360 --> 00:41:46,719 Speaker 1: maybe people listen to one podcast a week. I'm just 768 00:41:46,719 --> 00:41:50,319 Speaker 1: thinking of Joe and Jane Average out there. Now, I 769 00:41:50,320 --> 00:41:53,840 Speaker 1: don't know exactly what the methodology here is, 770 00:41:53,880 --> 00:41:55,560 Speaker 1: so I wonder if this could be an artifact of 771 00:41:55,600 --> 00:41:59,080 Speaker 1: how Moment tracks app usage. Like, if an app runs 772 00:41:59,080 --> 00:42:02,000 Speaker 1: in the background without being the first app open on 773 00:42:02,040 --> 00:42:05,359 Speaker 1: the screen, I don't know how that gets tracked. Does 774 00:42:05,400 --> 00:42:08,319 Speaker 1: that get tracked the same way a 775 00:42:08,400 --> 00:42:11,480 Speaker 1: screen-oriented app like a game or Facebook would? I'm 776 00:42:11,520 --> 00:42:13,520 Speaker 1: not quite sure. But I think 777 00:42:13,520 --> 00:42:17,120 Speaker 1: we should think about explanations for this. So why is 778 00:42:17,160 --> 00:42:20,080 Speaker 1: it that the apps we spend the most time with 779 00:42:20,280 --> 00:42:23,400 Speaker 1: make us the most unhappy? It could be that the 780 00:42:23,440 --> 00:42:26,160 Speaker 1: same apps that command the most of our attention just 781 00:42:26,239 --> 00:42:29,279 Speaker 1: coincidentally also happen to be the ones that leave us 782 00:42:29,320 --> 00:42:32,000 Speaker 1: feeling regretful about spending our time with them. Well, you 783 00:42:32,000 --> 00:42:33,879 Speaker 1: know, one thing that instantly pops out at me too 784 00:42:34,000 --> 00:42:38,279 Speaker 1: is that these examples of positive experience apps are not 785 00:42:38,440 --> 00:42:42,200 Speaker 1: as focused on the machine itself. You know, like when 786 00:42:42,200 --> 00:42:44,839 Speaker 1: I'm listening to music, I am potentially doing something else. 787 00:42:44,840 --> 00:42:47,400 Speaker 1: I'm not staring at my phone and looking at Spotify, 788 00:42:47,840 --> 00:42:50,960 Speaker 1: you know, I'm engaging in some other activity.
789 00:42:51,120 --> 00:42:53,400 Speaker 1: But people were very happy with their books apps, and 790 00:42:53,400 --> 00:42:55,720 Speaker 1: so like, if you're reading in your Kindle app or something, 791 00:42:55,719 --> 00:42:57,600 Speaker 1: that made people happy. But where are you when 792 00:42:57,600 --> 00:43:00,000 Speaker 1: you read a book? You're not really in the phone, 793 00:43:00,840 --> 00:43:03,759 Speaker 1: you're in the imagination in your mind. Yeah. And then likewise 794 00:43:03,800 --> 00:43:06,440 Speaker 1: with some of these mindfulness apps. Whereas what are you 795 00:43:06,480 --> 00:43:09,279 Speaker 1: doing when you're looking at Facebook? You are kind of 796 00:43:09,320 --> 00:43:13,359 Speaker 1: in the default mode network of worries about your own 797 00:43:13,400 --> 00:43:16,799 Speaker 1: social situation, past and future, you're in the Facebook space. Yeah. 798 00:43:17,040 --> 00:43:21,040 Speaker 1: So another explanation could be a more direct correlation, right? 799 00:43:21,040 --> 00:43:23,440 Speaker 1: It could be that the fact that we spend so much 800 00:43:23,480 --> 00:43:26,840 Speaker 1: time with an app is what makes us regretful about 801 00:43:26,920 --> 00:43:30,000 Speaker 1: using it. Right? Like, would we be happier with our 802 00:43:30,040 --> 00:43:32,759 Speaker 1: Facebook usage if we only spent ten minutes a day 803 00:43:32,800 --> 00:43:35,920 Speaker 1: on it? Would we be sad about our Audible or 804 00:43:36,000 --> 00:43:38,680 Speaker 1: Kindle usage if we spent an hour or more a 805 00:43:38,760 --> 00:43:41,200 Speaker 1: day on that? I feel like there might be a 806 00:43:41,360 --> 00:43:43,799 Speaker 1: yes to the first question, but a no to the 807 00:43:43,840 --> 00:43:46,719 Speaker 1: second one. Right, like, I can't be sure, but 808 00:43:46,800 --> 00:43:50,640 Speaker 1: somehow I doubt that we would be sad using Audible 809 00:43:50,760 --> 00:43:53,000 Speaker 1: for an hour a day, because, I don't know, when 810 00:43:53,040 --> 00:43:55,359 Speaker 1: I listen to audiobooks for a long stretch of time 811 00:43:55,360 --> 00:43:57,759 Speaker 1: while I'm doing yard work or cooking or something, I feel 812 00:43:57,840 --> 00:44:00,040 Speaker 1: very happy about that. Yeah, it doesn't feel like a 813 00:44:00,040 --> 00:44:02,319 Speaker 1: slur to say, oh man, Joe is an 814 00:44:02,320 --> 00:44:06,520 Speaker 1: Audible fiend, he's an Audible freak, Joe, get off that Audible, 815 00:44:06,760 --> 00:44:08,960 Speaker 1: you know. But if we inserted a Facebook 816 00:44:09,000 --> 00:44:13,520 Speaker 1: or Twitter instead of an audiobook program, 817 00:44:13,880 --> 00:44:16,000 Speaker 1: it would be a different connotation entirely. Now, I should 818 00:44:16,040 --> 00:44:18,200 Speaker 1: say, I just realized we've run Audible ads on the 819 00:44:18,200 --> 00:44:20,359 Speaker 1: show before. They're not paying us to talk about any 820 00:44:20,400 --> 00:44:22,920 Speaker 1: of this. This is just what the results were. But 821 00:44:23,040 --> 00:44:26,480 Speaker 1: I can honestly say I do enjoy books and podcast 822 00:44:26,560 --> 00:44:30,000 Speaker 1: related apps much more than social media apps. But then again, 823 00:44:30,040 --> 00:44:33,200 Speaker 1: I don't know.
Like, so imagine that the average Facebook 824 00:44:33,280 --> 00:44:36,600 Speaker 1: usage was only ten minutes a day, and it's still 825 00:44:36,640 --> 00:44:38,759 Speaker 1: the same kind of usage. It's not like having a 826 00:44:38,800 --> 00:44:41,920 Speaker 1: specific interaction with somebody or looking up an event page 827 00:44:42,000 --> 00:44:44,680 Speaker 1: or something, but just ten minutes a day scrolling through 828 00:44:44,680 --> 00:44:48,480 Speaker 1: the algorithmically generated news feed. Would people be happy about that? 829 00:44:48,840 --> 00:44:50,719 Speaker 1: I don't know. It's hard to say. I don't know. 830 00:44:50,760 --> 00:44:53,440 Speaker 1: It doesn't sound that happy. Yeah, I want to 831 00:44:53,560 --> 00:44:56,080 Speaker 1: look at a second set of results compiled through the 832 00:44:56,120 --> 00:44:58,560 Speaker 1: Moment app with Time Well Spent. This was just a 833 00:44:58,600 --> 00:45:02,640 Speaker 1: separately compiled page I found. In this 834 00:45:02,840 --> 00:45:05,200 Speaker 1: set of results, the top apps that people were happy 835 00:45:05,200 --> 00:45:07,799 Speaker 1: to have spent time with were in similar categories. There 836 00:45:07,800 --> 00:45:12,080 Speaker 1: are things like, you know, books and meditation and 837 00:45:12,120 --> 00:45:15,799 Speaker 1: wellness and learning and music. And the average amount of 838 00:45:15,800 --> 00:45:18,360 Speaker 1: time people spent per day on these top happiness apps 839 00:45:18,400 --> 00:45:21,440 Speaker 1: was around seven minutes. Another way of measuring the usage 840 00:45:21,440 --> 00:45:23,719 Speaker 1: is that they noted how many times a day you 841 00:45:23,719 --> 00:45:26,200 Speaker 1: picked up the app. For these apps, the answer 842 00:45:26,280 --> 00:45:28,640 Speaker 1: was generally about one to three times a day. The 843 00:45:28,640 --> 00:45:31,520 Speaker 1: top apps that people were unhappy to have spent time 844 00:45:31,560 --> 00:45:36,479 Speaker 1: with were generally social media apps and games, and these 845 00:45:36,480 --> 00:45:39,319 Speaker 1: apps people spent closer to around forty-five minutes to 846 00:45:39,360 --> 00:45:42,240 Speaker 1: an hour a day on, with less time on the dating 847 00:45:42,280 --> 00:45:45,759 Speaker 1: apps than on the other social media and games. So for 848 00:45:45,800 --> 00:45:48,080 Speaker 1: the purpose of this conversation, I think we can assume 849 00:45:48,160 --> 00:45:51,000 Speaker 1: that these results are more or less correct. They certainly 850 00:45:51,200 --> 00:45:53,160 Speaker 1: ring true to me. I haven't found an in 851 00:45:53,239 --> 00:45:56,080 Speaker 1: depth description of the methodology by which they were arrived at, 852 00:45:56,560 --> 00:45:59,160 Speaker 1: but these organizations seem trustworthy to me. I think these 853 00:45:59,160 --> 00:46:03,880 Speaker 1: are probably basically correct findings. But assuming this is roughly correct, 854 00:46:04,520 --> 00:46:07,680 Speaker 1: why are we living the technological part of our lives 855 00:46:07,719 --> 00:46:11,359 Speaker 1: this way? Why do we spend the most time using 856 00:46:11,400 --> 00:46:14,359 Speaker 1: our technology in ways that end up making us the 857 00:46:14,480 --> 00:46:17,759 Speaker 1: least happy?
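We don't know the actual methodology Moment and Time Well Spent used, so purely as a minimal sketch of the kind of tallying being described above, assuming each user reports minutes per day per app plus a happy-or-unhappy rating, the aggregation could look something like this in Python. The app names and numbers are invented sample data, not the real dataset.

from collections import defaultdict

# Each record: (app, minutes_per_day, happy_with_time_spent) for one user.
# Invented sample data, not the Moment / Time Well Spent results.
reports = [
    ("social_app", 59, False), ("social_app", 72, False), ("social_app", 40, True),
    ("audiobooks", 8, True), ("audiobooks", 12, True), ("audiobooks", 5, False),
]

# Sum minutes, count unhappy users, and count respondents per app.
totals = defaultdict(lambda: {"minutes": 0.0, "unhappy": 0, "n": 0})
for app, minutes, happy in reports:
    totals[app]["minutes"] += minutes
    totals[app]["unhappy"] += 0 if happy else 1
    totals[app]["n"] += 1

# Report the two summary figures discussed above: average minutes per day
# and the share of users unhappy with the time they spent.
for app, t in totals.items():
    avg_minutes = t["minutes"] / t["n"]
    pct_unhappy = 100.0 * t["unhappy"] / t["n"]
    print(f"{app}: about {avg_minutes:.0f} minutes a day on average, {pct_unhappy:.0f}% of users unhappy")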
That's kind of the ultimate, like, 858 00:46:17,880 --> 00:46:19,839 Speaker 1: kick in the pants at the end of a day, right, 859 00:46:20,040 --> 00:46:22,120 Speaker 1: where if you look back and you think, oh, 860 00:46:21,960 --> 00:46:24,640 Speaker 1: I spent a fair amount of time doing 861 00:46:24,719 --> 00:46:28,239 Speaker 1: things that brought me no joy. Why didn't I use 862 00:46:28,320 --> 00:46:30,480 Speaker 1: that time better? Why did I scroll that time away instead of spending it on 863 00:46:30,600 --> 00:46:33,520 Speaker 1: the things that I need to do and really 864 00:46:33,600 --> 00:46:37,560 Speaker 1: want to do? Exactly. Nobody's forcing you to scroll your 865 00:46:37,600 --> 00:46:41,680 Speaker 1: Facebook timeline over and over, your news feed or 866 00:46:41,920 --> 00:46:45,200 Speaker 1: Twitter or Reddit or any of these other things. Nobody's 867 00:46:45,280 --> 00:46:47,880 Speaker 1: forcing you to just spend your time running through this 868 00:46:47,960 --> 00:46:51,000 Speaker 1: infinite stream of content. And yet you do it. You 869 00:46:51,080 --> 00:46:54,000 Speaker 1: keep doing it, you keep going back to it, sometimes 870 00:46:54,120 --> 00:46:56,360 Speaker 1: during the things that we want to do. Have you 871 00:46:56,360 --> 00:46:59,320 Speaker 1: ever had this experience where you're watching a movie, yeah, 872 00:46:59,440 --> 00:47:01,520 Speaker 1: a movie you're interested in, yeah, and then for 873 00:47:01,560 --> 00:47:03,759 Speaker 1: some reason, like, I mean, on one hand, I kind 874 00:47:03,760 --> 00:47:06,040 Speaker 1: of get into it this way, like I'll convince 875 00:47:06,080 --> 00:47:10,239 Speaker 1: myself it's certainly okay to go to IMDb and look 876 00:47:10,320 --> 00:47:12,520 Speaker 1: up people in the movie. So who's that? Who's 877 00:47:12,560 --> 00:47:14,879 Speaker 1: that actor? All right, I'll look him up, and I mean, 878 00:47:14,920 --> 00:47:17,040 Speaker 1: that's all right. But then it goes beyond that, because 879 00:47:17,040 --> 00:47:20,120 Speaker 1: then I'm just on a reflex. I'm pulling open my 880 00:47:20,239 --> 00:47:23,239 Speaker 1: work email account or my personal email account, or, 881 00:47:23,280 --> 00:47:26,359 Speaker 1: even worse, I'm clicking on social media or 882 00:47:26,719 --> 00:47:30,480 Speaker 1: Reuters or something, instead of just focusing on the 883 00:47:30,520 --> 00:47:34,799 Speaker 1: one distraction that I'm supposed to be into right now. Yeah. 884 00:47:34,840 --> 00:47:37,919 Speaker 1: So we're detecting that there is this inherent tension. It's 885 00:47:37,960 --> 00:47:39,799 Speaker 1: the same thing we talked about in the story we 886 00:47:39,920 --> 00:47:42,640 Speaker 1: used to illustrate the problem at the beginning. There is 887 00:47:42,640 --> 00:47:45,600 Speaker 1: a tension between what we want to do with our lives, 888 00:47:45,640 --> 00:47:48,480 Speaker 1: what we want to want, and the things we do 889 00:47:48,560 --> 00:47:53,280 Speaker 1: with our technology of our own free will,
seemingly, moment 890 00:47:53,360 --> 00:47:57,840 Speaker 1: to moment, when we have these impulsive, tempting little bits 891 00:47:57,880 --> 00:48:01,120 Speaker 1: of digital candy to pull us aside and distract 892 00:48:01,200 --> 00:48:03,600 Speaker 1: us and lure us in and then just keep us 893 00:48:03,640 --> 00:48:07,319 Speaker 1: gobbling for, you know, minutes, hours, for as long as 894 00:48:07,360 --> 00:48:09,319 Speaker 1: it can keep our eyes glued to the screen. And 895 00:48:09,360 --> 00:48:12,400 Speaker 1: I think in the next episode we should focus a 896 00:48:12,400 --> 00:48:15,200 Speaker 1: little bit more on how technology is doing this to us, 897 00:48:15,239 --> 00:48:18,759 Speaker 1: how it's changing us, and how it works, what tools 898 00:48:18,840 --> 00:48:22,879 Speaker 1: it uses in order to exploit our psychological vulnerabilities. Yeah, 899 00:48:23,000 --> 00:48:25,960 Speaker 1: the next episode is going to be more of a 900 00:48:26,239 --> 00:48:29,520 Speaker 1: fight-the-power episode, I'm hoping it's gonna be. We wanted 901 00:48:29,560 --> 00:48:32,200 Speaker 1: to explain the problem in this one. Yeah. 902 00:48:32,239 --> 00:48:34,640 Speaker 1: This episode was about finding out what the 903 00:48:34,719 --> 00:48:37,520 Speaker 1: Terminators are. In the next episode you'll find out 904 00:48:37,560 --> 00:48:39,239 Speaker 1: how to blow them up and, you know, make your 905 00:48:39,239 --> 00:48:41,680 Speaker 1: own kitchen pipe bombs to do it. Right. So, when 906 00:48:41,719 --> 00:48:44,600 Speaker 1: that social media app says I need your clothes, your boots, 907 00:48:44,600 --> 00:48:48,880 Speaker 1: and your motorcycle, you say no. You say, no sir, no 908 00:48:49,080 --> 00:48:52,000 Speaker 1: thank you. All right. Well, hey, in the meantime, head 909 00:48:52,040 --> 00:48:53,879 Speaker 1: on over to Stuff to Blow Your Mind dot com. 910 00:48:53,920 --> 00:48:56,880 Speaker 1: That's where you'll find all the podcast episodes. You'll find 911 00:48:57,160 --> 00:49:00,120 Speaker 1: various blog posts as well as links out to our 912 00:49:00,239 --> 00:49:05,400 Speaker 1: social media accounts such as Facebook, Twitter, Tumblr, Instagram. We 913 00:49:05,480 --> 00:49:08,400 Speaker 1: have so many of them. They drain us. Allow 914 00:49:08,440 --> 00:49:10,239 Speaker 1: them to drain you as well. Well, I'd say if 915 00:49:10,280 --> 00:49:12,759 Speaker 1: you go use them, try to be deliberate about that time. 916 00:49:12,840 --> 00:49:16,240 Speaker 1: Don't get sucked into whatever. Look for what you're looking for. 917 00:49:16,600 --> 00:49:18,879 Speaker 1: Get that thing you're looking for, don't let it pull 918 00:49:18,960 --> 00:49:21,920 Speaker 1: you into passive consumption. But if you want to be 919 00:49:21,960 --> 00:49:24,880 Speaker 1: really deliberate about getting in contact with us, of course 920 00:49:24,920 --> 00:49:27,160 Speaker 1: you can do that through email the old fashioned way. 921 00:49:27,200 --> 00:49:28,520 Speaker 1: Oh, and of course I want to give a big 922 00:49:28,520 --> 00:49:31,920 Speaker 1: shout out, as always, to our excellent audio producers Alex 923 00:49:31,960 --> 00:49:34,680 Speaker 1: Williams and Tari Harrison.
But that email address, if you 924 00:49:34,719 --> 00:49:36,319 Speaker 1: want to write us a note, let us know what 925 00:49:36,360 --> 00:49:38,880 Speaker 1: you think about this episode or any other, or to request 926 00:49:38,880 --> 00:49:41,600 Speaker 1: a topic for the future, that address is blow the 927 00:49:41,680 --> 00:49:54,040 Speaker 1: mind at how stuff works dot com. For more on 928 00:49:54,120 --> 00:49:56,839 Speaker 1: this and thousands of other topics, visit how stuff works 929 00:49:56,840 --> 00:50:07,960 Speaker 1: dot com.