Welcome to TechStuff, a production of iHeartRadio's How Stuff Works.

Hey there, and welcome to TechStuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio and I love all things tech, and today is Friday. That means it's time for a classic episode of TechStuff. This episode originally published on January 29, 2013. It is titled "Social Media and You," and it features Lauren Vogelbaum as my co-host. Lauren had done a couple of episodes before this one that you're about to hear. I did not include those because they were particularly early ones. You can go back into the archive at techstuffpodcast dot com and find them. But I found that this one, I think, was a good starting point for the Lauren era of TechStuff, and I hope you guys enjoy.

And really, there's a lot of fear out there, I think, and a lot of news stories that are kind of amping up that fear.
And there have been several studies that talk with this doom and gloom, about like, oh no, social media is so big that we're not going to talk to each other in person anymore. Total communication breakdown, dogs and cats living together, that kind of thing. And this is not Pop Stuff, but I like to quote it anyway. Yeah, no, we quote documentaries all the time here on TechStuff. It's perfectly fine. And I agree entirely, Lauren. I mean, from an armchair psychology perspective, if I were to just look at the whole idea of social media and human interaction in general, part of me would think, hey, social media is replacing that face-to-face human interaction that we tend to think of as being really important as part of our development as a person. Right, or that seems to have been really important for a very long time. Yeah, I mean, it's part of socialization. And the worry is that without that face-to-face interaction, with something else replacing it, we would be less capable of dealing with those interactions when they come up.
And a lot of the information about this tends to be anecdotal, which, as anyone who has done any science knows, is not reliable when it comes to actually measuring; it's not actually science. There have been a few studies. There's one that you'll hear quoted all the time from the Stanford Institute for the Quantitative Study of Society. This one was from 2005, to be fair, so it was a few years ago, before Twitter and Facebook were really huge. But it found that, compared to people who do not use the internet frequently, those who do spend seventy minutes less per day interacting with their family, twenty-five minutes less per day sleeping, and thirty minutes less watching television. Although I'm not sure why that last one is necessarily a bad thing.
Yeah, it could go the other way. But that family thing, that's seventy minutes less a day talking to your family; that sounds awful, right? Until, of course, you're interacting with your family online. Exactly, and I think it turns out that that is what we are doing. Right, right. Yeah. Again, it's one of those things where I think the stereotype, the thing that we all imagine, is the person sequestered in his or her room. You know, it's dark. Yeah, it's depressing. And probably the only interaction they get, with whomever is online, is through a keyboard and never a microphone. Right, and that's it. And it can be asynchronous communication, meaning that, you know, you leave a message, then later on the person reads it, and then they leave a message. This is essentially the way email works. It's asynchronous, as opposed to a face-to-face conversation, which is generally synchronous, unless you're talking to me, in which case it is just a monologue. But that's the way I work.
But anyway, generally speaking, that tends to be the view: that it's a person who is withdrawing more and more. And it's this idea that social media could ultimately be dehumanizing us, at least in the sense of how we define what a human is right now. Right. And to be fair, isolation is a very scary thing. There are all kinds of studies about how it can be as bad for you as smoking and obesity, how it increases cancerous tumors in mice. I mean, developmentally, it's a very big problem. There are actual cases, tragic cases, of children who were deprived of the ability to interact with other people, and of how that has impacted their ability to develop as a human. Like, you know, they never really develop beyond what would be equivalent to a certain age, a younger age. There are stories about kids who were in terrible conditions and, you know, grow up and never really develop beyond, say, a seven- or eight-year-old mental level.
Lonely subjects have been found to have less brain activity than people with a healthy and diverse social network. Yes, whenever I get lonely, I'm not thinking of anything. It's just me and a Supernatural marathon. You had it, man, before I could even say it. Yes, it's just me and the next episode of Supernatural. Not that there's anything wrong with that. Those brothers are dreamy. I wouldn't know. I'm told they are dreamy, and I'm like, I believe you. Okay, cool. So anyway, yes, these are all perceptions. Now, when it comes down to science, there are a lot of different studies that look at this, and there are some conflicting results. And part of that is because of the nature of the studies. Part of it is that, you know, the studies are not necessarily looking at the exact same criteria. Right, so it could be that one seems to contradict another, but it may be that in a broader perspective they're not really contradictory; it's just that they're looking at different aspects.
Because when people are talking about social isolation, they're talking about the size, the intimacy, the diversity, and the location of your social network. Right. Yes, there are lots of different terms for this as well, about whether or not you have a certain number of confidants, like people that you really connect with. These are the people with whom you share those deep personal things that are not something you would talk about with just your neighbor, unless they're your confidant, or someone on the street, or your coworker, necessarily. It's who you go to when you're upset or when you're happy. It's the first person that, if you're me, you text message when something terrific happens to you. Any text messages? Okay, so awkward. Uh, moving on. I'm dealing with a little heartbreak here, but it's okay. I don't have to be everyone's best friend.
Well, the thing is that the average person has one point eight-ish, and that's, you know, one point eight close confidants, depending on which survey you're looking at. Because one of the surveys I really looked at, which is one that you sent me the information for, so I thank you, Lauren, because without it I would have nothing to talk about, was a Pew Internet Personal Networks and Community survey. Now, before I even get into the data here, I should stress they surveyed 2,512 adults. Now, that's a pretty small sample, so any results we get, you have to keep in mind this is a very small sample. But within that sample they drew some pretty big conclusions. One of those was that [percentage unclear] of American adults said they used the Internet. And nearly half of adults, or [percentage unclear] of those who said they use the Internet, say they use at least one social networking service.
So this is stuff like Facebook, Twitter, LinkedIn, that kind of thing. And this is sort of tangential to our conversation here, but I found it really interesting: of the ones they surveyed who used social networking services, [percentage unclear] said they used Facebook. Facebook is one of the ones I use. [Percentage unclear] used MySpace, [percentage unclear] LinkedIn, and [percentage unclear] used Twitter. So that means more people are using MySpace than Twitter, which to me says the twenty percent of people who said that are all in a band. I mean, really, when was the last time you used MySpace? I deleted my MySpace account about a year ago. It's coming back, because Timberlake brought sexy and MySpace back. Those are the two things he brought back. I don't know where he went to find them, but he brought them back. But yeah, I don't use MySpace either. It just blows my mind that more people are using MySpace, in the survey anyway, than LinkedIn and Twitter. But that is indicative of the overall usage. I mean, Facebook as of October had one billion monthly active users.
Yeah, that's a huge, huge number. And five hundred eighty-four million daily active users. That's a whole bunch of people. That's more people than there are even user accounts on Twitter. As of December, Twitter had more than two hundred million. That's December 2012, for people of the far future who are listening back. Where's my jet pack? Is it on the way? The time machine, just bring it back to me, right? There's no way they can tell me. This is the sad thing, like, you've been using it for three years. Time travel stinks, because I only get to go one direction, and it's really slow, second by second, as it turns out. Hey there, it's Jonathan from 2019, here to say we're going to take a quick break to thank our sponsor.

The interesting thing about this survey, I mean, we've been kind of talking about the things they found, but one of the things they were specifically looking for was this idea of social isolation, and does the use of social media contribute to social isolation?
Is it true that we are actually withdrawing from society in favor of the interactions that we have on social networking services? And according to the survey, spoiler alert: no. No, it's not happening. Yeah, they said that, according to the surveying, social isolation has not really changed since 1985. Which, and this is just for people who aren't paying really close attention, was not really a big year for the internet. If you were using the Internet then, you were in a research facility or a university; you were not the average person. Because, of course, the World Wide Web, the main way we tend to think about interacting with the Internet, apart from apps and stuff that are starting to really take control, wasn't a thing until '92. So obviously '85 is before social networking services were a thing. Like, three people were on Prodigy as of '88, and that's not it. So if you were on a bulletin board system, maybe, but it's before social networking services, obviously. But the extent of social isolation hasn't really changed since then.
So that tells us that it's very possible social networking services don't have a big impact. Now, we can't say that for sure, because there are a lot of different factors. Yeah, it could be that we are reducing social isolation at an exponential rate, but the social networking services are pulling that back. So it could be that there is an impact on social isolation; it's just that other factors are pushing it forward, so it ends up balanced. Yeah. This is the complex thing about science. This is why drawing conclusions is difficult. You have to do a lot of studies and really look at all the different factors and try to control for as many variables as possible, because otherwise whatever you say could turn out to be not so true in the grand scheme of things. But based upon what this survey found, it looks like social networking services are not turning us all into hermits. This isn't from the same Pew Internet study; I think it's from a different one.
I didn't write down which study it's from in my notes, but it's been found that mobile phone use has actually made our contact lists smaller but more intense. That makes sense. So, for instance, it's not that the relationships are less meaningful. It just means that the ones we contact, we're really depending upon them, and that we might be using a phone in order to make that connection as opposed to, necessarily, you know, walking across the street, or driving to a friend's house, or meeting up at a coffee shop or whatever. To me, you know, that just means that we're transferring that same need for interaction to a different medium. It doesn't mean that we're losing that interaction. It just means that, you know, we're taking advantage of technology in ways that we couldn't before, which is awesome. Because, I mean, I'm sure you've known people who have moved away. You yourself have moved a few times. Absolutely. I play Halo every Wednesday night so that I get to hang out with my friends who don't live here anymore.
Right. So this is technology giving us those social interactions that otherwise we might lose if we were to relocate, or our friends relocate. And in my view, that's a huge positive. It means that those relationships that have been formed over years, sometimes decades, of knowing one another don't fade away. They still remain relevant, because technology allows us to continue to build those relationships. Now, those interactions, the nature of them might change a bit, but it's still a very important part of who we are and how we interact. And for some people, you know, there are mentally and physically disabled people, the elderly, new mothers who can't get out of the house, all kinds of people who they've been doing medical studies with, to see if use of the internet can actually give them better social interaction. Sure. It's been found to reduce depression in lots of those groups, and it has really helped people out.
So, yeah, I've even seen that there have been studies done with people who have various mental health conditions who otherwise would find it very difficult to socialize. Not just awkward like me and Jonathan? Right. No, no, the people who really find it difficult to form any kind of social contact. It's just one of those things that is a block for them, and it can be very frustrating, especially if they observe that other people are capable of doing this. And there are so many factors involved here. There's an entire taboo about mental health in our culture that is problematic. And so awful. Yeah, and it feeds on itself, right? It feeds this thing where it just mounts on the person who's experiencing it. And some of the studies had people using social networks, and they were finding it much easier to interact with people, and making suggestions on how to create a social network with tools that would allow them to have greater interaction. Which, yeah, you think about that, like, this is what technology should do.
Technology should help people so that they can interact any way they want, in a way that brings them the satisfaction that people who don't have these conditions tend to find naturally. I like this; to me, it's a great thing. And anything that decreases that sense of isolation and that sense of taboo is a good thing too. And it's an added dimension. It doesn't have to be a replacement. It can be, and things change. I mean, Socrates was terrified that writing was going to ruin people's brains. Oh, I want to talk about this, but we're gonna get into that. All right, now, hang on one second. Let's take a quick break to thank our sponsor, and now back to the show.

So you were talking about Socrates and writing. Right, the idea that writing things down means that you're taking stuff out of your brain and putting it on paper, or stone, or clay, or whatever.
Right, a tree, your buddy Bill who stands still for really long periods of time, whatever. You're taking it out of your brain, you're putting it on something else, and therefore you are no longer relying on your brain to process that information, because you've offloaded it. Socrates thought this was a terrible idea. The interesting thing is that that attitude has continued up to the present day, which is, you know, I assume why you brought it up, because today we have computers and smartphones and calculators. Calculators, yeah. I could not do complex equations anymore without the use of a calculator, because frankly, I just don't use those skills as frequently as I used to. Well, you're a writer, you're not a mathematician. I'm not a good one either; mathematician, that is. I'm an excellent writer. You should read some of my articles at howstuffworks dot com. You know what I'm talking about. So anyway, yeah, yeah, I know. That's it exactly. I mean, Nicholas Carr wrote the famous article for The Atlantic, "Is Google Making Us Stupid?"
This whole idea that because we've got so much information on the internet, and we have access to information, therefore we're less intelligent. Right, right. We don't have to remember stuff, we don't have to know stuff, because IMDb knows it. Right? Yeah. Why do I need to even process this information? All I have to do is type in a search query, have Google pull up the first answer, and then repeat it back. It doesn't even mean that I process it on a level where I understand it. With social media, or social networking services, some people were worried a similar thing is going on with the social aspect, as we were just talking about with the memory and processing of information on the Internet. This idea that social networking services are creating a less meaningful way of connecting with people. But things like this Pew research survey suggest otherwise. It suggests that we're getting just as much meaningful interaction online and through technology as we would face to face.
Now, the actual nature 325 00:18:43,600 --> 00:18:46,560 Speaker 1: of that interaction may change somewhat, but it's still 326 00:18:46,600 --> 00:18:49,600 Speaker 1: important and it's still helpful. Yeah, and that's, I mean, 327 00:18:49,600 --> 00:18:51,639 Speaker 1: you know, that kind of fear of technology. To be fair, 328 00:18:51,880 --> 00:18:55,280 Speaker 1: we here at tech stuff are probably biased towards technology. 329 00:18:55,320 --> 00:18:57,639 Speaker 1: Just maybe, maybe a little bit. Please listen to all 330 00:18:57,680 --> 00:19:04,960 Speaker 1: our episodes on your computers. Contact us at Discovery dot com. 331 00:19:05,000 --> 00:19:07,520 Speaker 1: But, um, but no, I mean, fear 332 00:19:07,520 --> 00:19:09,800 Speaker 1: of technology isn't going to change the fact 333 00:19:09,840 --> 00:19:11,800 Speaker 1: that technology is out there, and it's not going to 334 00:19:12,440 --> 00:19:16,119 Speaker 1: change the fact that things are changing. Change happens, society changes, 335 00:19:16,160 --> 00:19:20,520 Speaker 1: everything changes, you know. And people were also worried, um, 336 00:19:20,560 --> 00:19:26,240 Speaker 1: in the Industrial Revolution that, because these 337 00:19:26,280 --> 00:19:30,919 Speaker 1: machines were starting to automate human processes, the 338 00:19:30,920 --> 00:19:34,960 Speaker 1: quality of workplace interaction was going to go down, 339 00:19:35,560 --> 00:19:38,320 Speaker 1: and all kinds of other ripple effects out from that. 340 00:19:38,359 --> 00:19:40,719 Speaker 1: And sure, that's where we get the whole sabotage thing, 341 00:19:41,119 --> 00:19:44,439 Speaker 1: because, you know, the whole idea of throwing the wooden 342 00:19:44,440 --> 00:19:48,800 Speaker 1: shoe into the automated loom to destroy it.
So our 343 00:19:48,920 --> 00:19:52,000 Speaker 1: old listeners, well, not old listeners, but people 344 00:19:52,119 --> 00:19:53,800 Speaker 1: who listen to old episodes of tech stuff, know that 345 00:19:53,840 --> 00:19:57,000 Speaker 1: we've talked about that many, many times. And 346 00:19:57,040 --> 00:20:00,479 Speaker 1: you know, the answer is, yes, things are changing, 347 00:20:00,520 --> 00:20:02,560 Speaker 1: but no, that's not necessarily a bad thing. Yeah, in 348 00:20:02,600 --> 00:20:05,399 Speaker 1: fact, again, going back to that Pew survey, 349 00:20:06,600 --> 00:20:08,280 Speaker 1: one of the other things I thought was really interesting 350 00:20:08,359 --> 00:20:10,480 Speaker 1: was what they found about people who use the web 351 00:20:10,520 --> 00:20:14,160 Speaker 1: a lot. Again, another one of those, uh, perceptions 352 00:20:14,200 --> 00:20:17,400 Speaker 1: is that these are people who don't get out as much. 353 00:20:17,680 --> 00:20:20,600 Speaker 1: They don't, you know, they don't interact in other ways 354 00:20:20,640 --> 00:20:22,639 Speaker 1: other than online. But that doesn't seem to be the 355 00:20:22,680 --> 00:20:25,760 Speaker 1: case according to the people that they surveyed. According to 356 00:20:25,760 --> 00:20:29,400 Speaker 1: the survey, the folks who used the web were actually 357 00:20:29,440 --> 00:20:35,120 Speaker 1: more likely to interact with people in their immediate physical environment. 358 00:20:35,160 --> 00:20:37,639 Speaker 1: They were more likely to do things like speak to 359 00:20:37,680 --> 00:20:41,200 Speaker 1: a neighbor on a regular basis. Or, uh, yes, 360 00:20:41,280 --> 00:20:43,840 Speaker 1: respondents said that they talked to a 361 00:20:43,840 --> 00:20:46,000 Speaker 1: neighbor at least once per month.
And they found that 362 00:20:46,080 --> 00:20:49,800 Speaker 1: bloggers are seventy two percent more likely to belong to 363 00:20:49,840 --> 00:20:53,320 Speaker 1: a local voluntary association than those who do not blog. 364 00:20:53,920 --> 00:20:56,640 Speaker 1: So there you've got people who are very much invested 365 00:20:56,640 --> 00:20:59,879 Speaker 1: in the online world, but not at the expense of 366 00:21:00,000 --> 00:21:05,080 Speaker 1: the physical one. So directly contradicting that armchair psychology approach, 367 00:21:05,119 --> 00:21:07,760 Speaker 1: which is why we always say, like, yeah, there's that 368 00:21:07,800 --> 00:21:12,359 Speaker 1: whole idea of "common sense dictates," which often means I 369 00:21:12,440 --> 00:21:15,840 Speaker 1: am wrong, but it seems like I'm right. I do 370 00:21:15,920 --> 00:21:20,000 Speaker 1: this all the time. I'm like, well, common sense... I'm talking 371 00:21:20,040 --> 00:21:22,680 Speaker 1: out of my butt because I don't have any data 372 00:21:22,720 --> 00:21:25,360 Speaker 1: in front of me. But yeah, they also found, 373 00:21:25,359 --> 00:21:28,880 Speaker 1: they said, web users are more likely to visit a cafe. 374 00:21:29,600 --> 00:21:31,760 Speaker 1: It did not say if it was an internet cafe, 375 00:21:32,240 --> 00:21:36,040 Speaker 1: which obviously would increase the odds. But fifty two percent 376 00:21:36,080 --> 00:21:38,600 Speaker 1: more likely to visit a library. It didn't say if that 377 00:21:38,680 --> 00:21:41,400 Speaker 1: was because that's how they access the internet. But hey, 378 00:21:41,440 --> 00:21:43,800 Speaker 1: thirty four percent more likely to visit a fast food 379 00:21:43,840 --> 00:21:46,760 Speaker 1: restaurant, which I don't necessarily think is a good thing. 380 00:21:47,600 --> 00:21:49,520 Speaker 1: You know what, I haven't visited a fast food restaurant 381 00:21:49,520 --> 00:21:52,880 Speaker 1: in a very long time. I'm not a fast food fan.
382 00:21:54,200 --> 00:21:57,600 Speaker 1: More likely to visit other restaurants. Now I'm in that category. 383 00:21:57,960 --> 00:22:00,520 Speaker 1: We might like food. Oh gosh, I love food. 384 00:22:00,840 --> 00:22:04,000 Speaker 1: And then more likely to visit a public park, which 385 00:22:04,080 --> 00:22:07,320 Speaker 1: is awesome. That is awesome. So we're talking about people 386 00:22:07,400 --> 00:22:12,000 Speaker 1: who have a real investment in their community and an 387 00:22:12,000 --> 00:22:14,640 Speaker 1: interest in the world outside the realm of the computer. 388 00:22:14,800 --> 00:22:17,600 Speaker 1: So that does contradict... unless they aren't going outside, they're 389 00:22:17,640 --> 00:22:21,960 Speaker 1: just taking pictures to upload when they get home. Right. Oh, 390 00:22:22,160 --> 00:22:23,960 Speaker 1: that reminds me of something I wanted to talk about 391 00:22:24,000 --> 00:22:26,000 Speaker 1: that wasn't in our notes, but it was, 392 00:22:26,720 --> 00:22:30,199 Speaker 1: it's this idea, it's something that's interesting. So one of 393 00:22:30,240 --> 00:22:33,200 Speaker 1: the conflicts that people have with the whole 394 00:22:33,240 --> 00:22:36,840 Speaker 1: social networking services thing and everything is that it is 395 00:22:37,600 --> 00:22:41,280 Speaker 1: constantly interrupting our daily lives if we are connected in 396 00:22:41,560 --> 00:22:46,000 Speaker 1: various ways, so that there are, uh, various times where it's inappropriate 397 00:22:46,040 --> 00:22:48,480 Speaker 1: for you to stop what you're doing and respond to 398 00:22:48,520 --> 00:22:51,359 Speaker 1: someone on Facebook or Twitter, like you're at work, 399 00:22:51,600 --> 00:22:54,800 Speaker 1: or you are in a conversation with someone, or you're 400 00:22:54,960 --> 00:22:57,400 Speaker 1: up on stage delivering a keynote address to a bunch 401 00:22:57,400 --> 00:23:02,080 Speaker 1: of people.
I'm sorry, but I was gonna say 402 00:23:02,080 --> 00:23:08,119 Speaker 1: like driving, but those are... so, uh, anyway. Have you 403 00:23:08,160 --> 00:23:11,440 Speaker 1: ever been to, uh, dinner with someone, or a meal 404 00:23:11,480 --> 00:23:14,399 Speaker 1: with someone, when the very first thing that happens is 405 00:23:14,480 --> 00:23:18,600 Speaker 1: a smartphone hits the table? Yes, that drives me so crazy. 406 00:23:18,600 --> 00:23:20,520 Speaker 1: Are you ever that person? Or do you leave your 407 00:23:20,560 --> 00:23:23,560 Speaker 1: smartphone away? Okay, to be fair, I'm very occasionally that person, 408 00:23:23,600 --> 00:23:27,000 Speaker 1: but usually it's revenge smartphone use. Usually it's after 409 00:23:27,119 --> 00:23:30,119 Speaker 1: someone else has brought out their smartphone. I 410 00:23:30,200 --> 00:23:32,640 Speaker 1: have on occasion gotten a little bit huffy, and I'm like, fine, 411 00:23:32,680 --> 00:23:35,159 Speaker 1: I'm gonna check Facebook too. Well, Lauren, I 412 00:23:35,200 --> 00:23:37,119 Speaker 1: hate to break this to you, but you're gonna get 413 00:23:37,160 --> 00:23:39,399 Speaker 1: way worse, because now you're a tech stuff co host. 414 00:23:40,040 --> 00:23:42,760 Speaker 1: And, uh, let me put it this way: when I 415 00:23:42,840 --> 00:23:46,480 Speaker 1: hang out with other technology podcast hosts and we all 416 00:23:46,560 --> 00:23:50,480 Speaker 1: go out to eat, the table creaks under the weight 417 00:23:50,560 --> 00:23:53,679 Speaker 1: of the electronics that hit it first thing. I am 418 00:23:53,680 --> 00:23:56,920 Speaker 1: not kidding. I remember meals with people like... and I'm 419 00:23:56,920 --> 00:23:59,960 Speaker 1: gonna be dropping some names here, folks, because my back hurts, 420 00:24:00,119 --> 00:24:02,680 Speaker 1: get ready, so I gotta drop some weight. Get ready.
421 00:24:02,720 --> 00:24:06,439 Speaker 1: But people like Sarah Lane, Iyaz Akhtar, and Tom Merritt 422 00:24:06,760 --> 00:24:09,680 Speaker 1: of This Week in Tech, or Molly Wood of 423 00:24:09,800 --> 00:24:14,439 Speaker 1: CNET, or, um, oh, Justin Robert Young and Brian 424 00:24:14,440 --> 00:24:18,440 Speaker 1: Brushwood and Veronica Belmont. I've been 425 00:24:18,480 --> 00:24:22,760 Speaker 1: to dinner with these folks in various settings, and invariably 426 00:24:22,800 --> 00:24:25,560 Speaker 1: the first thing that happens is everyone's smartphone hits the table, 427 00:24:25,880 --> 00:24:28,560 Speaker 1: and anytime you're not looking at a menu or talking, 428 00:24:28,680 --> 00:24:32,600 Speaker 1: you're on your phone. And I include myself, I am not immune 429 00:24:32,640 --> 00:24:35,399 Speaker 1: to this. This is something I've done too, and some 430 00:24:35,440 --> 00:24:37,679 Speaker 1: people are better about it than others. But it is 431 00:24:37,680 --> 00:24:39,760 Speaker 1: one of those things that can be kind of distracting. 432 00:24:39,840 --> 00:24:42,880 Speaker 1: And I will admit this is one of those behaviors 433 00:24:42,880 --> 00:24:47,240 Speaker 1: that is socially... it's becoming more and more socially acceptable 434 00:24:47,240 --> 00:24:49,200 Speaker 1: in the sense that everyone's doing it, but I 435 00:24:49,520 --> 00:24:51,200 Speaker 1: still am a little bit icked out by it. Yeah, 436 00:24:51,440 --> 00:24:53,480 Speaker 1: it can be. It can be insulting if you're talking 437 00:24:53,520 --> 00:24:57,560 Speaker 1: to someone and you just see them looking down, because, you know. 438 00:24:57,760 --> 00:25:01,600 Speaker 1: But the reason why I brought it up: oh, there's 439 00:25:01,640 --> 00:25:06,760 Speaker 1: a restaurant in Los Angeles called EVA Restaurant, E V A.
440 00:25:07,640 --> 00:25:10,000 Speaker 1: And I'm sure it's not the only restaurant that does this, 441 00:25:10,119 --> 00:25:14,040 Speaker 1: but EVA Restaurant has a policy, which is if you 442 00:25:14,160 --> 00:25:18,320 Speaker 1: come inside the restaurant and you surrender your cell phone 443 00:25:18,400 --> 00:25:20,720 Speaker 1: to the waiter when you come in, you get a 444 00:25:20,720 --> 00:25:24,800 Speaker 1: five percent discount on your bill. That's delightful. So 445 00:25:24,920 --> 00:25:27,320 Speaker 1: you come in, the waiter explains the policy... especially in 446 00:25:27,480 --> 00:25:30,160 Speaker 1: L.A. Yeah, you have the choice of... yeah, especially, yeah, 447 00:25:30,320 --> 00:25:32,680 Speaker 1: no kidding, all those actors waiting for the breakout call, 448 00:25:34,359 --> 00:25:36,520 Speaker 1: or those agents waiting for their actors to call them. 449 00:25:36,720 --> 00:25:38,960 Speaker 1: But yeah, it means that you just hand 450 00:25:39,040 --> 00:25:42,320 Speaker 1: it over and then you concentrate on the food and 451 00:25:42,359 --> 00:25:45,400 Speaker 1: the experience. And you know, the restaurant's policy is that 452 00:25:45,880 --> 00:25:48,640 Speaker 1: this way you're really focusing on the meal and you're 453 00:25:48,720 --> 00:25:51,040 Speaker 1: enjoying it for what it is, as opposed to distracting 454 00:25:51,040 --> 00:25:52,919 Speaker 1: yourself so the meal is just something you're doing in 455 00:25:52,960 --> 00:25:57,560 Speaker 1: between tweets. It also means Instagram hates it, because there are 456 00:25:57,640 --> 00:26:01,800 Speaker 1: so many fewer pictures of food flooding 457 00:26:01,800 --> 00:26:05,239 Speaker 1: the Internet, which we are in dire lack of. 458 00:26:05,480 --> 00:26:08,320 Speaker 1: So, food and cats. People need more food and cats 459 00:26:08,359 --> 00:26:10,960 Speaker 1: on the internet, stat.
I'm starting, my 460 00:26:11,080 --> 00:26:13,560 Speaker 1: tumblr followers and I are starting a drive for 461 00:26:13,600 --> 00:26:15,879 Speaker 1: more cute hedgehogs on the Internet. I think that this 462 00:26:15,920 --> 00:26:18,680 Speaker 1: is a thing that needs to happen. Had a good run, 463 00:26:18,920 --> 00:26:21,240 Speaker 1: it did. Well, that bucket, that bucket, it was a 464 00:26:21,240 --> 00:26:25,400 Speaker 1: bucket list. Oh no, oh dear. That one, that one 465 00:26:25,440 --> 00:26:27,480 Speaker 1: was such a stretch. That wasn't even a joke, 466 00:26:27,640 --> 00:26:29,959 Speaker 1: was it? That was just words, 467 00:26:30,840 --> 00:26:33,399 Speaker 1: that's what that was. But no, 468 00:26:33,440 --> 00:26:35,800 Speaker 1: I mean, it is, it can divide our attention. 469 00:26:35,840 --> 00:26:38,399 Speaker 1: And I've seen a couple of news reports 470 00:26:38,520 --> 00:26:42,680 Speaker 1: lately that had this announcer being shocked, shocked: did 471 00:26:42,720 --> 00:26:45,840 Speaker 1: you know that when young people wake up, they check 472 00:26:45,920 --> 00:26:48,679 Speaker 1: their cell phone before getting out of bed? And I'm like, 473 00:26:48,760 --> 00:26:51,560 Speaker 1: I'm like, people don't do that? First of all, my 474 00:26:51,600 --> 00:26:54,200 Speaker 1: cell phone is my alarm. Of course, mine too. Yeah, 475 00:26:55,080 --> 00:26:57,280 Speaker 1: so just by turning off my alarm, I am 476 00:26:57,359 --> 00:26:59,520 Speaker 1: checking my cell phone, right. And I don't get out 477 00:26:59,520 --> 00:27:02,719 Speaker 1: of bed to turn off my alarm, because that's just ridiculous. 478 00:27:02,960 --> 00:27:05,040 Speaker 1: That would, that would help me wake up. Probably 479 00:27:05,200 --> 00:27:07,000 Speaker 1: I should do that, but no, I do not.
I'm 480 00:27:07,000 --> 00:27:08,840 Speaker 1: a morning person anyway. So as soon 481 00:27:08,840 --> 00:27:10,359 Speaker 1: as those eyes pop open, I'm ready to go. 482 00:27:10,720 --> 00:27:13,359 Speaker 1: I'm not happy about it. I'm not a full morning person. 483 00:27:13,480 --> 00:27:17,320 Speaker 1: I'll still be grouchy at you. I'm just alert, 484 00:27:17,840 --> 00:27:19,960 Speaker 1: that's all. But yeah, that's the 485 00:27:20,040 --> 00:27:21,760 Speaker 1: kind of behavior where you see more people 486 00:27:21,800 --> 00:27:24,200 Speaker 1: at a concert, for example, um, taking photos of the 487 00:27:24,240 --> 00:27:28,760 Speaker 1: concert than you do watching the concert. Yeah. Yeah. And 488 00:27:28,880 --> 00:27:33,040 Speaker 1: that's the thing, and that's, you know... Yeah, I get 489 00:27:33,040 --> 00:27:36,680 Speaker 1: grouchy at concerts too. That's why every concert needs to 490 00:27:36,680 --> 00:27:38,560 Speaker 1: be a private show of just me and 491 00:27:38,600 --> 00:27:40,720 Speaker 1: the musician. One of those get off my lawn 492 00:27:40,760 --> 00:27:43,360 Speaker 1: kind of moments. Yeah, there's only a couple of musicians 493 00:27:43,400 --> 00:27:46,720 Speaker 1: I know who would be willing to do that, oh, 494 00:27:46,800 --> 00:27:49,879 Speaker 1: to actually do, like, okay, it's just me and you, 495 00:27:49,920 --> 00:27:53,280 Speaker 1: I'll play my songs for you. Someone that we know together. 496 00:27:55,080 --> 00:27:58,119 Speaker 1: Because she's nice. Yes, and it would be, you know, 497 00:27:58,560 --> 00:28:01,200 Speaker 1: she'd essentially be doing a favor for me. Yes. 498 00:28:01,320 --> 00:28:04,199 Speaker 1: So I'd like to hear anything. But yeah, no, I think 499 00:28:04,240 --> 00:28:07,720 Speaker 1: it's a good discussion.
I mean, I think ultimately 500 00:28:07,760 --> 00:28:09,760 Speaker 1: the takeaway we have to have is that we do 501 00:28:09,840 --> 00:28:16,240 Speaker 1: not have a full spectrum of data to really support 502 00:28:16,280 --> 00:28:18,680 Speaker 1: this one way or the other. But it looks like 503 00:28:19,080 --> 00:28:23,000 Speaker 1: it's not as damaging as we would first think, perhaps, 504 00:28:23,800 --> 00:28:27,200 Speaker 1: and it may in fact be helpful. Uh, since it's 505 00:28:27,280 --> 00:28:29,320 Speaker 1: kind of a social science thing, it's what we call 506 00:28:29,359 --> 00:28:32,840 Speaker 1: one of the soft sciences, and whenever you get people involved, 507 00:28:32,920 --> 00:28:36,600 Speaker 1: it really messes with the variables. Because we're not 508 00:28:36,640 --> 00:28:39,520 Speaker 1: all the same, as it turns out. Crazy. Yeah, funny 509 00:28:39,600 --> 00:28:43,840 Speaker 1: how that works, right. So ultimately we may not 510 00:28:43,880 --> 00:28:47,480 Speaker 1: be able to come down and say definitively whether it's 511 00:28:47,520 --> 00:28:51,840 Speaker 1: good or bad. It just looks like it's not. Yeah. 512 00:28:52,680 --> 00:28:55,200 Speaker 1: And that wraps up another classic episode of tech Stuff. 513 00:28:55,480 --> 00:28:58,000 Speaker 1: Thank you so much for listening. Hope you enjoyed this 514 00:28:58,120 --> 00:29:02,600 Speaker 1: and the debut in the classics of Lauren Vogelbaum, who 515 00:29:02,840 --> 00:29:05,120 Speaker 1: was my co host for a little while, quite 516 00:29:05,120 --> 00:29:07,280 Speaker 1: some time actually. And so we're gonna be hearing a 517 00:29:07,320 --> 00:29:10,200 Speaker 1: lot more from her over the next few classic episodes. 518 00:29:10,600 --> 00:29:13,800 Speaker 1: If you guys have suggestions for future episodes of tech Stuff, 519 00:29:14,000 --> 00:29:15,960 Speaker 1: feel free to get in touch with me.
You can 520 00:29:16,040 --> 00:29:18,920 Speaker 1: email tech Stuff at how stuff works dot com or 521 00:29:19,000 --> 00:29:21,800 Speaker 1: drop a line on Facebook or Twitter. The handle on 522 00:29:21,920 --> 00:29:25,440 Speaker 1: both of those is tech Stuff H S W. You can 523 00:29:25,480 --> 00:29:28,680 Speaker 1: also pop on over to our website, tech stuff podcast 524 00:29:28,760 --> 00:29:31,520 Speaker 1: dot com. As I said earlier, you can go to 525 00:29:31,760 --> 00:29:35,280 Speaker 1: the archive of every episode we've ever published. It's searchable, 526 00:29:35,320 --> 00:29:37,120 Speaker 1: so if you want to look for a specific topic 527 00:29:37,160 --> 00:29:39,640 Speaker 1: you can. You can also find a link to our 528 00:29:39,680 --> 00:29:42,320 Speaker 1: online store, where every purchase you make goes to support 529 00:29:42,360 --> 00:29:44,840 Speaker 1: the show, and we greatly appreciate it. And I will 530 00:29:44,840 --> 00:29:52,120 Speaker 1: talk to you again really soon. Tech Stuff is a 531 00:29:52,120 --> 00:29:54,840 Speaker 1: production of I Heart Radio's How Stuff Works. For more 532 00:29:54,920 --> 00:29:58,320 Speaker 1: podcasts from I Heart Radio, visit the I Heart Radio app, 533 00:29:58,440 --> 00:30:01,280 Speaker 1: Apple Podcasts, or wherever you listen to your favorite 534 00:30:01,280 --> 00:30:01,600 Speaker 1: shows.