Speaker 1: Hey, San Francisco, we're coming back to see you. Yes, our second year in a row. We're gonna be going to SF Sketchfest. I'd like to think it's the premier comedy festival in the United States. Probably. Well, in the world, you think? So, yeah, what about Beijing? Nope, it's a, it's a close second. For a second. Well, we love San Francisco, we love performing there. Everyone is always so kind to us. And by San Francisco we mean the entire Bay Area, of course. Yeah, so we will be there doing our thing for a one-time-only show on Sunday, January, one p.m. Yeah, it's the rare Sunday afternoon. We're like the NFL of podcasters. Yeah, right, that's, that's what I've always thought. So all you have to do is go to the SF Sketchfest site, look at the old calendar, and there are tons of great people performing. Oh yeah. So I suggest, like, just doubling down and getting tickets to all kinds of good shows, for sure. And hurry up and get tickets to ours, because they've only been on sale for a week or so and they're already half sold out. That's right. So please hurry, San Francisco.
Speaker 1: Please hurry. Welcome to Stuff You Should Know from HowStuffWorks.com.

Speaker 1: Hey, and welcome to the podcast. I'm Josh Clark, and there's Charles W. "Chuck" Bryant, and there's Jerry over there. So this is Stuff You Should Know, the computer addiction episode. Yeah. So computer addiction, as it turns out, is many things. Mhm. Uh, and I'm gonna include smartphone addiction in this as computer addiction. Yeah, for sure, because this was written before there were computers in people's pockets. But it is porn addiction. It is online porn. Yeah, um, yeah, exactly. Um, is there any other way? There's gent... Well, that's like when, when poor Fred Willard went to, uh, like a, a porno theater and was caught or something, and like everybody felt so bad, because everyone loves Fred Willard and it wasn't like some big scandal. Everyone's just like, oh, poor Fred. Somebody needs to, like, explain to him how this is done these days. Like, you don't need to leave your house. How are there even porno theaters left? I don't know. Is it like a kitschy, retro, vintage kind of thing?
Speaker 1: Maybe I would go to one of those, okay? Just as, you know, just to go look around and be like, hey, nice to meet you, I was just wanting to shake your hand. Won't be shaking hands. But I think it would be interesting, like, you know, this is what it was like in the nineteen seventies, you know, on Times Square. All right, I'm gonna find me a porno theater. I'm totally going. You know, the Georgia Theatre used to be one. That's how it started out in Athens. I don't think I knew that. And then God burned it down a few years back because of that, and Dave Matthews built it back up. I remember when he used to play there, like, it seems like every three months, when, like, I was in school there, before he was a big deal. I was like, who are these guys? Why is their name always up on that marquee? They said, you'll understand someday. I still don't. When is that day coming? Uh, well, now it's not going to, if it hasn't yet. So it is porn addiction. It is, uh, maybe gambling addiction, video game addiction, gaming addiction for sure, social media addiction, YouTube addiction.
Speaker 1: Yeah, and all, a lot of things wrapped up under this big banner of computer addiction, right. And everything you just, um, mentioned is what's called, collectively, computer-mediated communication, right. Or there's another subgroup called Internet addiction disorder. And we should point out right here that none of this... there's some controversy with some of this, and none of it is officially listed yet, still, as far as I can tell. No. There was a push to get, um, computer addiction of some form or another, at least Internet addiction disorder, um, put into the DSM-5, which they put together a couple of years back, the American Psychological Association. Um. But they said no. We should do a show just on that, because, yeah... well, no, the DSM, period, because I was looking at the year, because I was curious when the next one was due, and then I looked at them throughout time, and I don't think there is, unless I'm wrong. It didn't look like there was any, like, set, like, every ten years we're gonna put one out. No, I don't know what schedule it's, it's on. Maybe when enough of the stuff turns out to be bunk.
Speaker 1: Yeah. They're like, we should probably rewrite this thing. Yeah, maybe so. Like, yeah, homosexuality isn't comparable to insanity any longer; we should probably rewrite the handbook, you know. And I wonder if they make addendums, or if it's just like, nope, it's locked until the next one comes out. You're insane because you're gay, but then the next one comes out and it's like, oh, you're cured. Anyway. So they tried to get something like that in the DSM-5, and, and it did not, correct. Apparently what I saw is that there's basically a note in there that says, like, we understand that this is a thing that people are researching, so we're going to keep an eye on it, and it just needs more... what we need to do is more research on it. It's just too misunderstood, or not well understood enough, to warrant being included. Which is how it should work, like, because all the different studies I looked at, they're all pretty lame, to be honest. There's a ton of them, though, right? Yeah, but every one I saw, and we'll go over them, was like, we asked twenty people questions.
Speaker 1: I saw others that were legitimate. Um, and there are also apparently a lot of neurological studies as well, because there's a big controversy, not just in whether, um, you could actually be addicted to computers, or if it's just an impulsive... a failure to control your impulses. Um, so people say, well, they're just lazy, they just want to sit around in front of a computer all day; that doesn't make them addicted. Is that what all addiction is, though? Is it failure to control your impulse? No, I should go back and re-listen. You could. And I think some people make the case that behavioral addictions, which are non-substance addictions, are failures to control your impulse. Right. Other groups say no, that's way off: these behaviors are still releasing dopamine in your brain, um, in which case it's following the same mechanism of addiction that heroin or cocaine creates. It's just a behavior.
Speaker 1: So there's still, kind of, like, a little bit of butting heads over that, even. Which, of course, then means that something as amorphous as Internet addiction disorder couldn't possibly be agreed upon at this point. Uh, and you can tell this article is dated, too, because it talks about instant messaging. So every time it said that, I just crossed it out and put "texting." Nice. It's kind of the new IM. So one other thing that got me, too, is I went and looked at the citation: it was written in two thousand seven. And what's scary, though, is a lot of the stuff he's describing is taken as, like, totally normal among the general population in the West now. Yeah, you know, it is. It is pretty interesting. Um, all right. Well, obviously, if we're talking about the smartphones and computers, laptops, desktops, whatever your device, um, we are talking about, basically, if you look down the list of, of what happens if you are, quote, computer addicted, unquote, it's basically the same as any alcohol or drug addiction. Um. Do you stay on the computer
Speaker 1: for longer than intended, or not notice, notice the passage of time? You could say that with drugs. Uh, do you make conscious efforts to cut back on computer time and repeatedly fail? That's a big one. Big one. Uh, do you think about your computer a lot when you're not using it, or constantly look forward to the next opportunity to use it? Drugs and alcohol. That is why I don't play video games. In, yeah, in, like, seven, whenever the Clone Wars, with that... Episode One? Who knows. Okay, well, whatever episode, the one with Jar Jar Binks, probably the greatest Star Wars character ever created. Um, the, that video game that came out in association with it: I found myself, like, thinking about how to play it better when I wasn't playing it. And I was like, was this on the PlayStation? And, uh, I was like, this is, this is no way to live. I'm not doing this anymore. So I stopped playing games altogether. Well, they definitely, like... I think anyone who ever played a lot of Tetris had Tetris dreams, or would look at the world in some ways as a Tetris grid.
Speaker 1: Um, some games really have a knack for getting into your craw that way. Like, you know, I've been playing a lot of... I can't even think of the name of it now. Um, they're very immersive, you know. Dude, especially now. I mean, like, that, that Star Wars game was, was cruddy. Yeah. Like, I... that wasn't the first and only game I'd ever played. I'd played plenty of other games, and I could tell you that was not a good game. But I still... that, I think, made it even more... it strengthened my resolve even more that, like, if I was sitting there thinking about how to play this cruddy game better, um, I should probably just stop playing games altogether. Yeah. So you haven't played any games since? Yeah, I don't play a lot. I think I've mentioned this before: I'll usually get, every couple of years, I'll get one game, that's the best-reviewed game, and I will play that obsessively for a little while, then quit. So I'm sure you have, like, tracked the progress of games these days. And now that we're starting to move into virtual reality, it's like, we're gonna be totally lost as a species. All right.
Speaker 1: So, continuing on how it mirrors drug addiction: um, hiding the extent of your computer use from family and friends. Yeah, that's totally... Using the computer as an escape when feeling depressed or stressed. That one, to me, is kind of like, okay, I don't, I don't see that as a sign of addiction. But this is a Grabster article, so I'll take it as gospel fact. It was a Grabster, too, all right. Um, missing events, or, um, failing at non-computer tasks because you're on the computer; poor job performance, family activities. You miss that family reunion because you're playing Jar Jar Binks'... Jar Jar Binks' Candy Crush Bonanza Rodeo. Uh, and then it could lead to things like marital problems, um, negative consequences, getting in trouble at work. The same can be said of alcohol and drugs for almost all of these, Bryant. Uh, and then, sadly, there have been cases where computer-addicted or gaming-addicted people have either died or had their children neglected, such that they either had health problems or died.
Speaker 1: And if you look it up, just look up "gaming death" or "game binge death," and there are all kinds of stories. It seems like, it seems like Taiwan is one of the worst. Taiwan had two, two people die within two weeks of each other from gaming binges in two thousand. Yeah. Well, they had three total, and I think all three were in internet cafes, even. So not even at home, where they can't be, you know, disturbed; this is out in public. Yeah. And apparently one guy was lying there for ten hours before they realized. And another... I couldn't believe this. One other, um, note that I saw in this article was that when the police and the paramedics came in to retrieve this dead body from the internet cafe, the other people playing didn't even stop. I think they just were, basically, didn't even pay attention to the fact that a dead body was being removed from the internet cafe. So there's a book called Death by Video Game, and, um, it's actually not new. This, this happened in the nineteen eighties, even. Ever since there have been games, people have played them until they died.
Speaker 1: And, uh, I was just curious about Taiwan. And the author basically says Taiwan in particular, um, at these internet cafes, which is a cheap way to get online and stay online. Yeah. So they've got these cafes, there's a lot of smoking going on in there, a lot of caffeine drinking, um, the humidity in the country is really tough. And he basically says all of this adds up to, um... and, you know, of course you're not eating well, you're not exercising at all, uh, and all of this adds up to really increased likelihoods of things like blood clots. And, um, because you think, like, how do you literally die from, like, a nineteen-hour binge of a video game? It's, it's all the other things that go into how you've treated your body. Really, neglect, essentially. Yeah. I mean, a blood clot makes sense to me. I saw "exhaustion." I'm like, is this the eighteen nineties? You don't die of exhaustion from sitting around. Maybe you die of the vapors, you know. Blood clot? I get it, that makes sense. And it does, too, because your legs are sitting there immobilized, um, so of course you could get a blood clot.
Speaker 1: Pulmonary thrombosis, right? Yeah, or pulmonary... yeah, pulmonary thrombosis. Not good, not good at all, because it travels to your heart or your brain, and all of a sudden your World of Warcraft character is just standing there not doing anything, because you're dead. Uh, I see what you're saying, though, about this article being written a while ago and then hearing it today, because it says even when people do interact with friends, they may become irritable because they're away from their computer. And now people aren't away from their computer ever, because of the smartphone. And it's just morphed into this thing where it's just accepted that it's okay to be having a conversation with someone while they're looking at something else. Right, right. And I mean, the idea that you're sitting there physically with somebody and they're hanging out online with other people, um, and that's who they're actually hanging out with, even though they're physically with you in the room. Yeah, it's weird. But that's become, basically, I mean, that's accepted behavior now. Yeah, I know. It's not hard to step back one degree and say, this is odd.
Speaker 1: Yeah. And I wonder what the long term... like, we're right in the infancy of this thing. Like, what are things going to be like in fifty years? I was in a bar the other night getting some takeout dinner for the family. Takeout beer? I don't know. Takeout food. Bar, restaurant. And, um, and I go to this place all the time and go sit at the bar, order a drink, order the food, have my one drink, and get the food to go. So I'm there for, like, twenty-five minutes, thirty minutes, maybe. And I, I used to love going to, sitting down in a bar and talking to strangers next to me, striking up... like, a good bar conversation is, like, it's the best. And I sat down, I was between these two dudes. I looked to my left: he was staring at his phone. Looked to my right: he was staring at his phone. People beside them were staring at their phones. Nobody was talking to each other. So I ended up having a good conversation with the bartender, which was fine. How Bukowski-esque of you. Yeah, that's true. But, um, I don't know, man, it just depressed me. Yeah, no, I know what you mean.
Speaker 1: When you step back and look around and stuff like that. You can make the case that they're still connecting with whoever they're talking to, who they'd rather be talking to, I guess, right? And that's actually, I mean, that was one of the things that, that Ed touches on in the negative effects of this, is that you, you start to prefer your online friends. Well, I mean, there are... it's entirely possible that your online friends are better friends than the people you're surrounded with in real life, you know. So I don't know that that's necessarily a drawback. But there is definitely a case to be made, and plenty of studies that suggest, that we are, ironically, growing more socially isolated the more connected we become. Yeah, but I also get the feeling that in that bar, if I would have said, hey, man, like, let's get a conversation going... of course I wouldn't do it that way, because it would have been: "That's a great conversation, sir. Would you like to speak with me?" Hey, man, let's get a conversation going.
Speaker 1: But if I got a conversation going and these people put their phones away for a minute, they might be like, oh, well, this, you know, I'd rather be talking to this guy. Because a lot of times, I mean, we're assuming people are interacting with friends on social media. It might be watching cat videos, or reading, uh, news sites, angry about the election, just feeding into their anger over and over and over. Um, they might be like, dude, thank you for talking to me, because that was so much more pleasurable than, like... it's sort of the de facto go-to of, well, I have thirty seconds to stand here and wait for the elevator, right, so let me check my social meds. The thing is, I don't necessarily agree with you. I think that the more we are drawn into our devices to communicate with others, the harder it becomes to talk to somebody who says, hey, let's get a conversation going. No, no, no, I agree. In real life, I agree with that, you know. I think they would... if people did it, though, they might be surprised and delighted, like, wait, I can still do this, you know. Not me.
Speaker 1: I find I'm failing at it more and more these days, so it just makes me feel... So, we're getting way into the opinion category, and we're irking a few people. Let's take a break.

Speaker 1: All right, we are back with facts and figures. Okay. So, um, one of the other things that struck me, Chuck, was, was that losing, losing touch with the people in your physical life in favor of people online that you're friends with... you're also, in a lot of cases, doing way more spectacular things with the people online than in your real life, like going into simulated combat, you know. That's... that you do interesting stuff with the people you're online with, rather than, you know, um... well, especially if you lead kind of a boring life, you know. And that's all subjective, of course, but, um, I don't know if your life is really boring. Everybody can, um... I saw this ESPN Outside the Lines episode on this wrestler at the University of Michigan. Go, Wolverines. Name is Marshall Carpenter, and I think he was a twin, if I'm not mistaken.
But he would 329 00:19:21,640 --> 00:19:24,320 Speaker 1: spend eight to fourteen hours a day gaming on his 330 00:19:24,680 --> 00:19:29,440 Speaker 1: computer and was done. Like, he washed out of wrestling, 331 00:19:30,280 --> 00:19:33,560 Speaker 1: quit Michigan, and went to rehab, and, like, had a 332 00:19:33,600 --> 00:19:36,080 Speaker 1: guy come into his house every day to rehab him 333 00:19:36,080 --> 00:19:38,360 Speaker 1: out of it. Uh, and there was a football player too, 334 00:19:38,440 --> 00:19:42,119 Speaker 1: named Quinn Pitcock, um, that quit the NFL. He played 335 00:19:42,119 --> 00:19:45,000 Speaker 1: one season for the Colts, because he was playing Call 336 00:19:45,040 --> 00:19:47,720 Speaker 1: of Duty and only wanted to play Call of Duty. 337 00:19:49,160 --> 00:19:52,639 Speaker 1: It's sad, but it is like, how can that... 338 00:19:52,680 --> 00:19:56,040 Speaker 1: how can you not call that an addiction? You know? Right? No, 339 00:19:56,240 --> 00:19:58,840 Speaker 1: it's true.
And examples like that are the ones that 340 00:19:58,920 --> 00:20:01,439 Speaker 1: people point to to say, yes, there is such a 341 00:20:01,520 --> 00:20:05,400 Speaker 1: thing as computer addiction, and it does have pronounced effects, 342 00:20:05,400 --> 00:20:09,120 Speaker 1: not just on, you know, your NFL career, but if 343 00:20:09,119 --> 00:20:12,800 Speaker 1: you're just an everyday schmo, it can have pronounced effects 344 00:20:12,800 --> 00:20:17,040 Speaker 1: on your relationships too. Like, for example, um, yes, you 345 00:20:17,119 --> 00:20:20,119 Speaker 1: might prefer your online friends who you're playing Call of 346 00:20:20,200 --> 00:20:24,239 Speaker 1: Duty with to your wife, but you are married to 347 00:20:24,280 --> 00:20:27,280 Speaker 1: your wife, and if you neglect or ignore her long 348 00:20:27,400 --> 00:20:31,679 Speaker 1: enough in favor of your Call of Duty friends, um, 349 00:20:31,760 --> 00:20:35,560 Speaker 1: she may divorce you. And there's actually never been, 350 00:20:35,840 --> 00:20:38,919 Speaker 1: from what I've seen, a study that definitively showed it, 351 00:20:39,520 --> 00:20:47,000 Speaker 1: um, that, yes, uh, online time equals higher divorce rates. 352 00:20:47,400 --> 00:20:49,320 Speaker 1: But I found one that came pretty close. It was a 353 00:20:49,359 --> 00:20:53,439 Speaker 1: two thousand fourteen study that appeared in Computers in Human 354 00:20:53,480 --> 00:20:57,120 Speaker 1: Behavior, a journal, and it found a two point one 355 00:20:57,160 --> 00:21:00,560 Speaker 1: eight percent to a four point three two percent... those, 356 00:21:00,640 --> 00:21:04,080 Speaker 1: those, like, decimal points are how you know it's legitimate, um... 357 00:21:04,560 --> 00:21:09,720 Speaker 1: that that level of rise in divorce rates correlated with 358 00:21:09,920 --> 00:21:14,800 Speaker 1: an annual increase in Facebook use in a given area 359 00:21:14,920 --> 00:21:18,919 Speaker 1: in the US.
Right, so everywhere that Facebook use increased, up 360 00:21:18,960 --> 00:21:23,360 Speaker 1: to four percent and change, um, there was a four 361 00:21:23,400 --> 00:21:25,679 Speaker 1: percent and change increase in the divorce rate for that 362 00:21:25,760 --> 00:21:28,919 Speaker 1: area too. It's entirely possible the two had nothing to 363 00:21:28,960 --> 00:21:32,320 Speaker 1: do with one another. It's also entirely possible that, yeah, 364 00:21:32,520 --> 00:21:35,439 Speaker 1: it totally did. Yeah. But then there are people that 365 00:21:35,560 --> 00:21:39,720 Speaker 1: say that... um, the people that say that, well, it's 366 00:21:39,760 --> 00:21:41,760 Speaker 1: no different than sitting down and watching TV every night 367 00:21:41,760 --> 00:21:43,920 Speaker 1: for four or five hours. Well, you can be addicted 368 00:21:43,920 --> 00:21:48,639 Speaker 1: to television too, I think, um, but I think the 369 00:21:48,680 --> 00:21:55,360 Speaker 1: internet is a bit more immersive than TV, especially 370 00:21:55,400 --> 00:21:57,520 Speaker 1: because you don't interact with the TV the way you 371 00:21:57,560 --> 00:22:01,440 Speaker 1: do online, usually, right, unless you're Elvis and Bob Goulet's 372 00:22:01,480 --> 00:22:04,879 Speaker 1: on TV and you're there with your handgun going... Was it 373 00:22:04,960 --> 00:22:10,400 Speaker 1: Bob Goulet? It used to drive him... Oh, man, 374 00:22:10,480 --> 00:22:15,200 Speaker 1: Will Ferrell, the Bob Goulet thing was so great, remember that? Yeah, 375 00:22:15,240 --> 00:22:17,080 Speaker 1: I didn't know who was on TV when Elvis shot it. 376 00:22:17,920 --> 00:22:20,320 Speaker 1: And sometimes he would just be like... the TV 377 00:22:20,440 --> 00:22:23,200 Speaker 1: wasn't enough. He'd shoot his toaster or the dishwasher. Yeah, 378 00:22:23,320 --> 00:22:27,240 Speaker 1: "I'm Robert Goulet." Because he would see Goulet everywhere. Wow.
Yeah, 379 00:22:27,520 --> 00:22:30,280 Speaker 1: Elvis was on a lot of drugs. Yeah, but they 380 00:22:30,280 --> 00:22:34,000 Speaker 1: were legal. God bless my grandmother, God rest her soul. 381 00:22:34,080 --> 00:22:36,119 Speaker 1: She, you know... they were from Memphis, and it was 382 00:22:36,119 --> 00:22:39,840 Speaker 1: always like, oh, Elvis, you know, he was still Memphis's son. 383 00:22:40,000 --> 00:22:43,159 Speaker 1: Like, all his doctors, they had him going every which way. 384 00:22:43,680 --> 00:22:47,199 Speaker 1: His doctors killed him. He was a big, big fat junkie. 385 00:22:47,280 --> 00:22:50,280 Speaker 1: He loved his drugs. I didn't want to, like, be 386 00:22:50,320 --> 00:22:51,760 Speaker 1: the one to break it to her, so I never did. 387 00:22:51,800 --> 00:22:53,919 Speaker 1: I just let her think that. What's funny is the 388 00:22:54,160 --> 00:22:58,479 Speaker 1: horribly ironic, um, but also hysterically ironic thing that he 389 00:22:58,800 --> 00:23:02,280 Speaker 1: hated drug dealers. Like, he would get wasted on prescription 390 00:23:02,320 --> 00:23:05,240 Speaker 1: drugs and get so worked up thinking about drug dealers 391 00:23:05,240 --> 00:23:07,760 Speaker 1: living in his town that he'd want to go shoot them, 392 00:23:07,760 --> 00:23:09,760 Speaker 1: and his boys would have to, like, restrain him and 393 00:23:10,160 --> 00:23:14,560 Speaker 1: calm him down. So he didn't like drug dealers. Sit down, 394 00:23:14,560 --> 00:23:17,640 Speaker 1: put this under your tongue, Big E. Sit back, then. 395 00:23:17,920 --> 00:23:20,359 Speaker 1: Well, didn't he... wasn't he like an honorary DEA 396 00:23:20,440 --> 00:23:23,320 Speaker 1: guy from Nixon? He tried to make it happen, I think. 397 00:23:23,359 --> 00:23:25,920 Speaker 1: You know, there's that famous picture of him shaking hands 398 00:23:25,920 --> 00:23:30,560 Speaker 1: with Nixon. He got that meeting arranged.
Nixon didn't 399 00:23:30,560 --> 00:23:32,679 Speaker 1: want it to happen, because he's like, this is preposterous, 400 00:23:32,680 --> 00:23:34,879 Speaker 1: I'm not gonna make this guy a DEA agent. But 401 00:23:35,440 --> 00:23:40,119 Speaker 1: Elvis was offering himself as a DEA undercover agent, 402 00:23:40,160 --> 00:23:43,640 Speaker 1: because he could get closer to the hippies and all that. Um, 403 00:23:43,680 --> 00:23:47,320 Speaker 1: and he was like, here's your junior badge. Anything I 404 00:23:47,320 --> 00:23:50,440 Speaker 1: can do, just let me know, man. Uh, there's a 405 00:23:50,520 --> 00:23:52,080 Speaker 1: movie out about that now that I want to see, 406 00:23:52,080 --> 00:23:56,679 Speaker 1: that Elvis and Nixon meeting. Yeah, yeah, I haven't seen it yet. Um, alright, 407 00:23:56,680 --> 00:24:00,639 Speaker 1: so back to... I was not expecting Nixon to 408 00:24:00,680 --> 00:24:03,359 Speaker 1: make an appearance in this. I wasn't either. I did 409 00:24:03,440 --> 00:24:06,040 Speaker 1: find one study. This is Dr. Susan Moeller at the 410 00:24:06,119 --> 00:24:10,080 Speaker 1: University of Maryland, Go Terps. She asked two hundred students 411 00:24:10,119 --> 00:24:13,160 Speaker 1: to abstain from all media for twenty four hours, called 412 00:24:13,160 --> 00:24:16,200 Speaker 1: it Twenty Four Hours: Unplugged. Had a colon in there, 413 00:24:16,240 --> 00:24:18,520 Speaker 1: even. There's, like, a lot of that too. There's, like, 414 00:24:18,640 --> 00:24:21,320 Speaker 1: camps in Japan... they have fasting camps, they call them, 415 00:24:21,320 --> 00:24:25,680 Speaker 1: where, like, you're just away from anything technological or connected. 416 00:24:26,040 --> 00:24:29,239 Speaker 1: See, that's great. Yeah, I don't think it's great if 417 00:24:29,240 --> 00:24:31,920 Speaker 1: you're one of the poor teenagers whose parents put you there.
418 00:24:33,119 --> 00:24:34,600 Speaker 1: I bet at the end of the week, though, you 419 00:24:34,600 --> 00:24:36,840 Speaker 1: get these great stories about how, like, I hated it 420 00:24:36,880 --> 00:24:39,120 Speaker 1: going in, but, you know, I made these new friends, 421 00:24:39,200 --> 00:24:42,000 Speaker 1: and... and then there are probably some that are like, this was 422 00:24:42,040 --> 00:24:46,800 Speaker 1: the worst experience. Right. Uh, but she basically just had 423 00:24:46,840 --> 00:24:50,240 Speaker 1: these kids, these students, describe things and what it was 424 00:24:50,320 --> 00:24:54,359 Speaker 1: like. It wasn't super scientific, but, um, all of them said 425 00:24:54,359 --> 00:24:57,399 Speaker 1: that they could not function without it. And 426 00:24:57,440 --> 00:25:00,880 Speaker 1: across the board, all two hundred... like, there wasn't 427 00:25:00,880 --> 00:25:03,280 Speaker 1: a single person that said, like, oh, that was nice. 428 00:25:03,560 --> 00:25:05,000 Speaker 1: All of them were like, this was one of the 429 00:25:05,040 --> 00:25:08,920 Speaker 1: worst twenty four hour periods of my life lately. Uh, 430 00:25:08,960 --> 00:25:12,199 Speaker 1: but then you can also make the argument, like, this is 431 00:25:12,200 --> 00:25:16,000 Speaker 1: how we communicate these days, it's how we get our news, 432 00:25:16,520 --> 00:25:19,239 Speaker 1: it's how we communicate for work. Uh, like, you 433 00:25:19,280 --> 00:25:22,959 Speaker 1: can't just take away everything like that and say, all 434 00:25:23,040 --> 00:25:26,720 Speaker 1: right, now relearn everything you know over this twenty 435 00:25:26,720 --> 00:25:30,120 Speaker 1: four hour period, or keep up in modern society. Yeah, 436 00:25:30,160 --> 00:25:33,080 Speaker 1: so it wasn't the most fair thing to do.
It 437 00:25:33,520 --> 00:25:36,920 Speaker 1: seems like drawing it out over time might have been 438 00:25:36,960 --> 00:25:39,240 Speaker 1: more useful than twenty four hours or something. But 439 00:25:39,320 --> 00:25:42,080 Speaker 1: it also raises another point of contention as far as 440 00:25:42,200 --> 00:25:47,560 Speaker 1: determining what constitutes computer addiction. Um, the computer is not 441 00:25:48,400 --> 00:25:53,240 Speaker 1: an inherently, um, useless or evil thing, right? Like, Ed 442 00:25:53,240 --> 00:25:58,080 Speaker 1: compares it to heroin. Like, you could legitimately sit 443 00:25:58,119 --> 00:26:00,480 Speaker 1: around and use a computer for ten hours in a 444 00:26:00,600 --> 00:26:03,879 Speaker 1: day in a very useful manner. Yeah, you've got a 445 00:26:03,880 --> 00:26:07,159 Speaker 1: deadline for work or something. Um, you sit around and 446 00:26:07,200 --> 00:26:10,040 Speaker 1: shoot heroin for ten hours, you're not accomplishing anything. You 447 00:26:10,080 --> 00:26:12,240 Speaker 1: can point to that and be like, no, that's not... 448 00:26:12,440 --> 00:26:16,240 Speaker 1: that's not objectively good in any way, shape, or form. Right. Um, 449 00:26:16,480 --> 00:26:19,120 Speaker 1: with a computer, you can be like, yeah, you could 450 00:26:19,160 --> 00:26:21,760 Speaker 1: be sitting there playing Candy Crush for ten hours straight, 451 00:26:22,320 --> 00:26:26,879 Speaker 1: or you could be, um, researching new things, or learning 452 00:26:26,880 --> 00:26:29,800 Speaker 1: a new language, or getting work done or whatever. So 453 00:26:29,960 --> 00:26:32,399 Speaker 1: it's not like you can point and say, yes, if someone 454 00:26:32,400 --> 00:26:35,160 Speaker 1: sits down at a computer for ten straight hours, they're a 455 00:26:35,280 --> 00:26:38,320 Speaker 1: computer addict.
But it muddies the whole thing: 456 00:26:38,359 --> 00:26:42,240 Speaker 1: the usefulness of the computer and the ubiquity and necessity 457 00:26:42,280 --> 00:26:45,440 Speaker 1: of using a computer for long stretches muddies the whole 458 00:26:45,480 --> 00:26:49,320 Speaker 1: definition of what constitutes computer addiction. Well, yeah, and while 459 00:26:49,320 --> 00:26:54,480 Speaker 1: they have determined, and actually shown in brain scans, 460 00:26:55,040 --> 00:26:57,920 Speaker 1: that it actually lights up areas similar to drug addiction, 461 00:26:58,560 --> 00:27:03,400 Speaker 1: the reward centers, it doesn't render those, uh, brain centers useless 462 00:27:03,520 --> 00:27:06,480 Speaker 1: like when you do the heroin... the heroin, which is 463 00:27:06,520 --> 00:27:08,040 Speaker 1: kind of, you know, another way of saying what you 464 00:27:08,080 --> 00:27:10,719 Speaker 1: were saying. When you ride the horse. Is that what 465 00:27:10,760 --> 00:27:12,320 Speaker 1: they call it? I think that's what they called it 466 00:27:12,359 --> 00:27:14,720 Speaker 1: in the seventies. Or was it chasing the dragon? Or is 467 00:27:14,720 --> 00:27:19,600 Speaker 1: that something else? Okay, there's this great MST3K episode, 468 00:27:19,600 --> 00:27:22,399 Speaker 1: an early one with Joel, and they were, like, injecting 469 00:27:22,400 --> 00:27:25,080 Speaker 1: a monkey with something, and one of them, I don't 470 00:27:25,119 --> 00:27:27,600 Speaker 1: remember who said it, they go, yes, a little 471 00:27:27,640 --> 00:27:34,119 Speaker 1: horse for a little monkey. Alright, uh, should we take 472 00:27:34,160 --> 00:27:36,240 Speaker 1: another break? All right, we'll come back and we'll talk 473 00:27:36,280 --> 00:28:04,560 Speaker 1: specifically about, um, social media addiction right after this.
Alright, 474 00:28:04,680 --> 00:28:09,040 Speaker 1: social meds, as Hodgman calls it. Well, he ended up 475 00:28:09,040 --> 00:28:14,760 Speaker 1: calling it so-meds by the end. It is a 476 00:28:15,040 --> 00:28:19,080 Speaker 1: brand new world with social media, and I think a 477 00:28:19,200 --> 00:28:23,160 Speaker 1: lot of the online addiction now is centered around studying 478 00:28:23,200 --> 00:28:29,080 Speaker 1: things like Facebook, Twitter, Instagram, Snapchat. Or a lot of the study of the addiction, I guess, 479 00:28:29,080 --> 00:28:31,280 Speaker 1: is probably a better way to put it. Yeah, it's 480 00:28:31,280 --> 00:28:34,200 Speaker 1: a good point. So they've been studying it for a little while. 481 00:28:34,280 --> 00:28:38,280 Speaker 1: In two thousand twelve, uh, some researchers from the University of 482 00:28:38,320 --> 00:28:42,520 Speaker 1: Bergen did a study where they looked at Facebook dependency, 483 00:28:43,280 --> 00:28:45,760 Speaker 1: and they said that the very nature of the site 484 00:28:45,880 --> 00:28:50,120 Speaker 1: is problematic, and that, uh, they found that the 485 00:28:50,480 --> 00:28:53,280 Speaker 1: parts of the brain associated with preservation of social 486 00:28:53,320 --> 00:29:00,680 Speaker 1: reputation are what's at play there. And, um, basically, 487 00:29:00,680 --> 00:29:02,160 Speaker 1: in this stuff that you sent me, and I found 488 00:29:02,200 --> 00:29:05,200 Speaker 1: other stuff too, the very way that those sites are 489 00:29:05,240 --> 00:29:09,120 Speaker 1: structured is to get you addicted to them. Yeah, yeah, 490 00:29:09,120 --> 00:29:12,960 Speaker 1: so there's this whole thing, um... basically it's called 491 00:29:13,040 --> 00:29:15,840 Speaker 1: behavioral design. There's a guy named B.J.
Fogg, who's 492 00:29:15,880 --> 00:29:21,680 Speaker 1: an experimental psychologist slash computer scientist, and he runs, 493 00:29:22,000 --> 00:29:26,280 Speaker 1: uh, what's called the Persuasive Technology Lab at Stanford. And 494 00:29:26,320 --> 00:29:29,560 Speaker 1: this guy is, like, a guru god out there, um, 495 00:29:29,640 --> 00:29:33,680 Speaker 1: who basically has taken this concept that, yes, you can 496 00:29:33,720 --> 00:29:35,960 Speaker 1: have a great idea, yes, you can have a killer app, 497 00:29:36,080 --> 00:29:39,720 Speaker 1: yes, you can, um, have wonderful technology, but it doesn't 498 00:29:39,720 --> 00:29:42,440 Speaker 1: amount to anything unless you get somebody to use it, 499 00:29:42,480 --> 00:29:45,000 Speaker 1: and to use it a lot, to make a habit 500 00:29:45,040 --> 00:29:47,960 Speaker 1: out of using it. And basically there's a 501 00:29:48,000 --> 00:29:54,880 Speaker 1: push right now to make technology purposefully as addictive as possible, 502 00:29:54,960 --> 00:29:58,240 Speaker 1: literally addictive. And we're at a point right now with 503 00:29:58,480 --> 00:30:01,880 Speaker 1: the way the apps are designed that's like where cigarettes were back 504 00:30:01,920 --> 00:30:04,280 Speaker 1: in, like, the seventies, when they started adding things like 505 00:30:04,320 --> 00:30:09,120 Speaker 1: ammonia and sugar to increase the amount of effect that 506 00:30:09,200 --> 00:30:12,160 Speaker 1: nicotine had on the brain, to make them more addictive. 507 00:30:12,480 --> 00:30:14,280 Speaker 1: That's the point we're at with the apps that are 508 00:30:14,320 --> 00:30:18,120 Speaker 1: being created, and it's all based on how they're designed. Yeah, 509 00:30:18,160 --> 00:30:21,960 Speaker 1: this, um... this one researcher called Nir... N-I-R, 510 00:30:22,200 --> 00:30:26,400 Speaker 1: E-Y-A-L, Nir Eyal. Great name.
Um, wrote a 511 00:30:26,400 --> 00:30:29,800 Speaker 1: book called Hooked: How to Build Habit-Forming Products. 512 00:30:30,520 --> 00:30:35,720 Speaker 1: Um, basically said it starts with this trigger, and the hook, 513 00:30:35,880 --> 00:30:38,520 Speaker 1: which, in the case of social media and Facebook, 514 00:30:39,040 --> 00:30:43,320 Speaker 1: is loneliness, boredom, or stress. Okay, so that's the 515 00:30:43,360 --> 00:30:45,480 Speaker 1: hook that they get you with. Are you bored, standing 516 00:30:45,520 --> 00:30:49,480 Speaker 1: there at the elevator for a minute? Hey, that's so sad, 517 00:30:49,520 --> 00:30:51,640 Speaker 1: but here's your Facebook feed. Are you standing in 518 00:30:51,680 --> 00:30:53,959 Speaker 1: line in the grocery store? Don't talk to the nice 519 00:30:54,040 --> 00:30:57,160 Speaker 1: lady next to you. Ignore that little kid making cute faces. 520 00:30:57,280 --> 00:30:59,600 Speaker 1: She didn't want to talk to you anyway. So you're 521 00:30:59,640 --> 00:31:01,920 Speaker 1: bored, and that's how they get you going. That's how 522 00:31:01,960 --> 00:31:05,440 Speaker 1: they get you hooked to begin with, that initial little trigger, 523 00:31:06,120 --> 00:31:08,520 Speaker 1: but then it goes from there. So... so I think 524 00:31:08,560 --> 00:31:12,520 Speaker 1: that boredom would constitute a motivation, something that will 525 00:31:12,600 --> 00:31:16,400 Speaker 1: motivate you. Right. The trigger is something like, um, if 526 00:31:16,440 --> 00:31:20,719 Speaker 1: you open up Facebook and there's the news feed, and 527 00:31:20,760 --> 00:31:24,120 Speaker 1: there's, like, all these different stories, or your friends liked something, 528 00:31:24,640 --> 00:31:29,840 Speaker 1: and so you are activated to... you're motivated by boredom 529 00:31:29,880 --> 00:31:32,040 Speaker 1: to go seek out the news feed.
The news feed 530 00:31:32,040 --> 00:31:35,320 Speaker 1: itself is full of triggers that you click, and then all 531 00:31:35,360 --> 00:31:38,920 Speaker 1: of a sudden you are immersed in your Facebook app. Yes, 532 00:31:39,200 --> 00:31:41,960 Speaker 1: and Fogg's actually come up with this kind of shorthand 533 00:31:42,040 --> 00:31:46,240 Speaker 1: formula for it. It's, um, B equals MAT. 534 00:31:46,560 --> 00:31:49,680 Speaker 1: So behavior, the behavior that they're after, which is interacting 535 00:31:49,720 --> 00:31:52,880 Speaker 1: with Facebook, is what Facebook wants you to do. Um, 536 00:31:52,880 --> 00:31:56,960 Speaker 1: it's where motivation... so you're motivated by boredom... ability... it's 537 00:31:57,040 --> 00:31:59,200 Speaker 1: very easy to open up the app on your phone, 538 00:31:59,280 --> 00:32:03,120 Speaker 1: you're able to, low-hanging fruit is what we call it... 539 00:32:03,960 --> 00:32:07,800 Speaker 1: um, and a trigger all come together. So the trigger, 540 00:32:07,800 --> 00:32:11,760 Speaker 1: say, is the news feed. The ability is just opening 541 00:32:11,840 --> 00:32:15,160 Speaker 1: up the Facebook app, and then the motivation is boredom. 542 00:32:15,200 --> 00:32:17,400 Speaker 1: But there are plenty of other motivations, there's plenty of 543 00:32:17,440 --> 00:32:20,040 Speaker 1: other abilities, there's plenty of other triggers. And what they 544 00:32:20,080 --> 00:32:23,120 Speaker 1: found out is that the key seems to be ability, 545 00:32:23,440 --> 00:32:26,280 Speaker 1: where if you make it as easy as possible for 546 00:32:26,320 --> 00:32:29,000 Speaker 1: a person to do something, they're likely to do it, 547 00:32:29,360 --> 00:32:31,400 Speaker 1: and once they do it, you can start getting them 548 00:32:31,400 --> 00:32:33,640 Speaker 1: to do it over and over, so a behavior becomes 549 00:32:33,640 --> 00:32:36,760 Speaker 1: a habit. Yeah, that's the key, that's the point.
That's 550 00:32:36,760 --> 00:32:39,120 Speaker 1: what they're after, and that's how they're designing apps these 551 00:32:39,200 --> 00:32:43,280 Speaker 1: days, to make them habitual. Yeah. Well, Facebook in particular, 552 00:32:43,360 --> 00:32:48,080 Speaker 1: like, it's no accident that the, um... what's it called, 553 00:32:48,120 --> 00:32:53,360 Speaker 1: not the alert, but the notification... is in red. Um, 554 00:32:53,400 --> 00:32:55,600 Speaker 1: that's a color that they found draws, like, a more 555 00:32:55,600 --> 00:32:59,240 Speaker 1: immediate reaction and response. That's why a stop sign is red, um, 556 00:32:59,240 --> 00:33:01,320 Speaker 1: that's why, like... that's why the 557 00:33:01,320 --> 00:33:06,120 Speaker 1: Facebook alerts are red. I'm red when I'm mad. Uh, 558 00:33:06,160 --> 00:33:10,640 Speaker 1: this other guy, um... what's his first name... something Harris. Uh, 559 00:33:10,720 --> 00:33:12,400 Speaker 1: this is sort of the depressing part. He says you 560 00:33:12,480 --> 00:33:16,200 Speaker 1: might say to yourself, it's my responsibility to exert self-control 561 00:33:16,200 --> 00:33:20,160 Speaker 1: when it comes to things like getting on Facebook. Uh, 562 00:33:20,160 --> 00:33:22,040 Speaker 1: he said, but that's not acknowledging the fact that there 563 00:33:22,040 --> 00:33:24,440 Speaker 1: are a thousand people on the other side of that 564 00:33:24,480 --> 00:33:28,160 Speaker 1: screen whose job it is to break down whatever responsibility 565 00:33:28,200 --> 00:33:31,200 Speaker 1: you can maintain. Yeah, which is, I mean, that's just 566 00:33:31,320 --> 00:33:33,920 Speaker 1: dead on. And this guy knows... his name is Tristan Harris. 567 00:33:34,400 --> 00:33:37,600 Speaker 1: He actually spent time studying under B.J. Fogg, 568 00:33:38,000 --> 00:33:42,320 Speaker 1: and he's kind of gone the opposite way.
Um, he's saying, hey, 569 00:33:42,400 --> 00:33:46,320 Speaker 1: we actually, um, are doing something kind of nefarious here. 570 00:33:46,360 --> 00:33:48,880 Speaker 1: We should really kind of cool it with the behavioral stuff. 571 00:33:48,960 --> 00:33:53,640 Speaker 1: So he's kind of launched a push for people to 572 00:33:53,680 --> 00:33:57,280 Speaker 1: rely on technology less, or to resist the use 573 00:33:57,320 --> 00:34:01,280 Speaker 1: of technology in their lives. Interesting. So basically, I mean, 574 00:34:01,560 --> 00:34:03,680 Speaker 1: some of it comes up in this article by Ed 575 00:34:03,800 --> 00:34:08,160 Speaker 1: too, um, like doing things like setting alarms, and when 576 00:34:08,160 --> 00:34:10,680 Speaker 1: the alarm goes off, your computer's off, you just turn 577 00:34:10,760 --> 00:34:14,520 Speaker 1: it off, um, or just allotting certain parts of the 578 00:34:14,640 --> 00:34:18,040 Speaker 1: day to, um, using your computer or your phone. But I 579 00:34:18,080 --> 00:34:20,680 Speaker 1: mean, it may have worked in two thousand seven; it's 580 00:34:20,680 --> 00:34:23,360 Speaker 1: just getting harder and harder today, like we were saying before. 581 00:34:24,040 --> 00:34:26,960 Speaker 1: Have you ever been on LinkedIn? Yeah, I've got, like, 582 00:34:27,239 --> 00:34:31,640 Speaker 1: an account. It's totally neglected, like all LinkedIn accounts. I 583 00:34:31,719 --> 00:34:33,920 Speaker 1: know some people are super into it. I think... 584 00:34:33,960 --> 00:34:36,640 Speaker 1: by the way, people, I'm not on LinkedIn, never 585 00:34:36,680 --> 00:34:39,440 Speaker 1: have been, nothing against it, don't even fully understand what 586 00:34:39,480 --> 00:34:42,760 Speaker 1: it is. But I don't need any more LinkedIn invites, 587 00:34:43,280 --> 00:34:45,040 Speaker 1: because I'm not on it. I get them all the time.
588 00:34:45,760 --> 00:34:49,320 Speaker 1: But when LinkedIn launched, um, apparently they had a hub 589 00:34:49,320 --> 00:34:53,160 Speaker 1: and spoke icon to represent visually what your network was, 590 00:34:53,239 --> 00:34:55,920 Speaker 1: how big it was, and they said that that was 591 00:34:55,960 --> 00:34:57,920 Speaker 1: a very intentional thing. When you look at it, that 592 00:34:57,960 --> 00:35:02,319 Speaker 1: triggers you to say, like, well, look at my wheel. It's lame. 593 00:35:02,920 --> 00:35:05,120 Speaker 1: I can't have people seeing a wheel like that. Yeah, 594 00:35:05,160 --> 00:35:08,640 Speaker 1: I gotta connect, connect, connect to people. Yeah, like Fogg says, 595 00:35:08,680 --> 00:35:12,080 Speaker 1: he's like, yeah, people couldn't... they couldn't have people thinking they 596 00:35:12,080 --> 00:35:15,319 Speaker 1: were losers. Yeah, so, yeah, they started using the site 597 00:35:15,360 --> 00:35:18,680 Speaker 1: like crazy. Well, I'm surprised... I mean, Facebook says how 598 00:35:18,680 --> 00:35:21,319 Speaker 1: many friends you have. I'm kind of surprised that's not 599 00:35:21,440 --> 00:35:27,080 Speaker 1: featured a little more prominently, like, you know, or 600 00:35:27,160 --> 00:35:30,960 Speaker 1: even represented in terms of popularity. I'm quite sure that 601 00:35:31,000 --> 00:35:34,800 Speaker 1: they studied that extensively and found that it actually caused, like, 602 00:35:34,960 --> 00:35:39,480 Speaker 1: a reduction in Facebook use. So I guarantee that 603 00:35:40,320 --> 00:35:43,000 Speaker 1: wasn't... yeah, that wasn't something they overlooked.
Yeah, because it 604 00:35:43,040 --> 00:35:44,919 Speaker 1: seems obvious that they would... like, you would click 605 00:35:44,920 --> 00:35:47,560 Speaker 1: on someone's profile and it would be like, this is 606 00:35:47,600 --> 00:35:50,279 Speaker 1: so-and-so, they're super popular, and he's 607 00:35:50,280 --> 00:35:55,359 Speaker 1: a hero, you're a zero. Uh, and then... sadly, I'm 608 00:35:55,360 --> 00:35:59,799 Speaker 1: not on Snapchat at all. But, um, dude, to use 609 00:36:00,120 --> 00:36:02,439 Speaker 1: Snapchat is one of the most difficult things you could 610 00:36:02,480 --> 00:36:04,440 Speaker 1: ever try to do. Well, they said in here... they 611 00:36:04,440 --> 00:36:09,200 Speaker 1: said that Facebook's, um, behavioral design is, like, cute 612 00:36:09,200 --> 00:36:12,000 Speaker 1: compared to what Snapchat is doing. Yeah, like, Facebook, if 613 00:36:12,080 --> 00:36:17,319 Speaker 1: you send someone, um, a note, right, you get some 614 00:36:17,360 --> 00:36:20,560 Speaker 1: sort of alert when they read your note. Yeah... well, 615 00:36:20,560 --> 00:36:22,480 Speaker 1: not an alert, but you can see, like, a little 616 00:36:22,560 --> 00:36:24,799 Speaker 1: check mark, like, so-and-so read this at this time. 617 00:36:24,920 --> 00:36:27,319 Speaker 1: So that sets it up as a social obligation for 618 00:36:27,360 --> 00:36:30,480 Speaker 1: the person who received the note to respond, because they 619 00:36:30,520 --> 00:36:32,680 Speaker 1: know that you know that they've read it now. Yeah, 620 00:36:32,760 --> 00:36:34,640 Speaker 1: or you see it as, you know, like, they 621 00:36:34,680 --> 00:36:38,120 Speaker 1: saw this thing three days ago and haven't responded. That motivates 622 00:36:38,120 --> 00:36:41,839 Speaker 1: the behavior. Yes, that's... that's built-in design.
Snapchat has 623 00:36:41,840 --> 00:36:44,000 Speaker 1: a feature that displays how many days in a row 624 00:36:44,560 --> 00:36:47,480 Speaker 1: two people have snapped each other, and then actually does 625 00:36:47,600 --> 00:36:52,319 Speaker 1: reward you with, like, emojis and things. Right, people love gamification. Yeah, 626 00:36:52,400 --> 00:36:55,719 Speaker 1: that's basically what it is. Right. Yeah. So they said 627 00:36:55,760 --> 00:36:58,880 Speaker 1: what Facebook is doing is just, like, kid's play compared to 628 00:36:58,880 --> 00:37:00,759 Speaker 1: what they're trying to do with Snapchat and other 629 00:37:00,800 --> 00:37:03,560 Speaker 1: apps in the future. Right. This, uh... it's, I 630 00:37:03,600 --> 00:37:07,520 Speaker 1: think, an Atlantic article that that pulls from. Um, 631 00:37:07,640 --> 00:37:10,319 Speaker 1: they were saying that there's reports of people who are 632 00:37:10,360 --> 00:37:15,239 Speaker 1: on these Snapchat streaks of, like, X number of consecutive days. 633 00:37:15,239 --> 00:37:17,319 Speaker 1: They don't want to break their streak, so if they 634 00:37:17,320 --> 00:37:19,720 Speaker 1: know they're going to be away from their device or whatever, 635 00:37:19,719 --> 00:37:22,520 Speaker 1: they'll give their password and login to a friend to 636 00:37:22,600 --> 00:37:26,000 Speaker 1: have them snapchat the other friends, so that the streak 637 00:37:26,040 --> 00:37:29,480 Speaker 1: will be unbroken. Which, I mean, if you step back... 638 00:37:29,520 --> 00:37:32,280 Speaker 1: and there's plenty of people who are like, so, who cares, 639 00:37:32,320 --> 00:37:34,200 Speaker 1: it's fine, this is the way the world is now. 640 00:37:34,680 --> 00:37:38,640 Speaker 1: Some teenagers are snapchatting each other so that 641 00:37:38,680 --> 00:37:41,960 Speaker 1: they can get emojis... is that really that wrong?
And 642 00:37:42,000 --> 00:37:46,160 Speaker 1: that's a legitimate response. And that is... I mean... 643 00:37:46,360 --> 00:37:49,760 Speaker 1: at the same time, though, I really feel 644 00:37:49,800 --> 00:37:55,320 Speaker 1: like there's, um, there's a lot of shirking of responsibility 645 00:37:55,440 --> 00:37:58,680 Speaker 1: for taking the human species in a certain direction without 646 00:37:58,920 --> 00:38:02,120 Speaker 1: the human species being largely aware of it. See, 647 00:38:02,160 --> 00:38:05,480 Speaker 1: that's... that exactly crystallizes my problem with it. It's 648 00:38:05,520 --> 00:38:07,400 Speaker 1: not that... sure, that is sort of the world now 649 00:38:07,440 --> 00:38:09,520 Speaker 1: and that's what people do, but it's the fact that 650 00:38:09,520 --> 00:38:13,960 Speaker 1: we're being manipulated into doing so behind the scenes. When 651 00:38:13,960 --> 00:38:16,560 Speaker 1: they have those meetings and they're like, hey, what if, 652 00:38:16,560 --> 00:38:19,040 Speaker 1: in the Facebook feed, what if they autoplayed these videos, 653 00:38:19,719 --> 00:38:22,560 Speaker 1: so before you know it, you're watching a video that 654 00:38:22,640 --> 00:38:24,960 Speaker 1: you didn't even want to watch, and then you're watching 655 00:38:24,960 --> 00:38:28,279 Speaker 1: another one? Like, so let's put in the autoplay feature. Uh, 656 00:38:28,320 --> 00:38:31,000 Speaker 1: and Harris called it the bottomless bowl, 657 00:38:31,560 --> 00:38:36,120 Speaker 1: that infinite stream that you get sucked into, because they 658 00:38:36,160 --> 00:38:39,920 Speaker 1: found... there was a study where people ate way 659 00:38:39,960 --> 00:38:44,319 Speaker 1: more soup out of a self-refilling bowl than a regular bowl 660 00:38:44,400 --> 00:38:47,560 Speaker 1: without even realizing they'd eaten more. I want to see 661 00:38:47,560 --> 00:38:50,759 Speaker 1: that bowl. They just keep filling it, too.
You just keep eating 662 00:38:50,760 --> 00:38:53,279 Speaker 1: the soup. And that's essentially what they're doing on Facebook 663 00:38:53,280 --> 00:38:59,200 Speaker 1: and your other social media feeds, is, um, 664 00:38:59,480 --> 00:39:01,880 Speaker 1: you get sucked in before you know it, and 665 00:39:01,920 --> 00:39:05,200 Speaker 1: then a half hour has gone by. Like, you rationalize to yourself, like, 666 00:39:05,800 --> 00:39:07,680 Speaker 1: you know, I can just go check, um, I sent 667 00:39:07,760 --> 00:39:10,080 Speaker 1: her a friend request or a message. It'll take 668 00:39:10,120 --> 00:39:12,240 Speaker 1: two seconds, let me just check and see if they responded. 669 00:39:12,719 --> 00:39:16,000 Speaker 1: Twenty five minutes later. They found that twenty five minutes 670 00:39:16,040 --> 00:39:17,800 Speaker 1: is the average time that it takes you to 671 00:39:17,840 --> 00:39:19,919 Speaker 1: get back to what you were doing, because you get 672 00:39:19,920 --> 00:39:22,600 Speaker 1: distracted because of that feed. You know, I've never been 673 00:39:22,600 --> 00:39:26,479 Speaker 1: more aware of how often I checked Twitter than I 674 00:39:26,560 --> 00:39:29,280 Speaker 1: was when I was checking Twitter while I was researching 675 00:39:29,280 --> 00:39:32,920 Speaker 1: this article. Nothing to do with anything. Randomly, 676 00:39:32,960 --> 00:39:37,000 Speaker 1: I'd just go open up Twitter and look. Nothing, no change, 677 00:39:37,040 --> 00:39:41,640 Speaker 1: nothing worth seeing. It is bizarre the habits you 678 00:39:41,719 --> 00:39:45,000 Speaker 1: can form from it. So what do you do, Chuck, 679 00:39:45,040 --> 00:39:49,000 Speaker 1: if you want out, besides having to completely fast or unplug 680 00:39:49,120 --> 00:39:52,279 Speaker 1: or whatever?
Well, like you were talking about, if 681 00:39:52,280 --> 00:39:53,759 Speaker 1: you are... if you, I should say, if you are 682 00:39:53,800 --> 00:39:57,279 Speaker 1: a bona fide computer addict. Oh, I mean, you can 683 00:39:57,320 --> 00:40:00,120 Speaker 1: go through a legit twelve step program, like you go 684 00:40:00,200 --> 00:40:03,400 Speaker 1: through rehab. There are people out there doing that. So 685 00:40:03,440 --> 00:40:06,839 Speaker 1: if you feel like you need that, or someone 686 00:40:06,880 --> 00:40:10,120 Speaker 1: in your family needs that, like, have an intervention. Like 687 00:40:10,160 --> 00:40:12,200 Speaker 1: these cases I was talking about, like this wrestler: 688 00:40:12,719 --> 00:40:16,160 Speaker 1: you're gaming fourteen hours a day, your life is suffering 689 00:40:17,160 --> 00:40:19,759 Speaker 1: in some ways, in many ways, like there's just no 690 00:40:19,800 --> 00:40:22,000 Speaker 1: way getting around it. No, there's not. Because again, like, 691 00:40:22,040 --> 00:40:24,800 Speaker 1: you're not getting exercise, you're not eating right, you're creating, 692 00:40:25,000 --> 00:40:31,839 Speaker 1: um, uh, blood clots in your legs, you're not 693 00:40:31,960 --> 00:40:34,560 Speaker 1: hanging out with the people who you're physically around. 694 00:40:34,640 --> 00:40:40,480 Speaker 1: Of course there's problems. Um, you could also, uh, this 695 00:40:40,520 --> 00:40:42,680 Speaker 1: one was good, I thought: you can put the computer 696 00:40:42,760 --> 00:40:45,239 Speaker 1: in a high traffic area of the house. That is 697 00:40:45,239 --> 00:40:47,400 Speaker 1: a good one, instead of being up there in your 698 00:40:47,440 --> 00:40:53,799 Speaker 1: bedroom in the closet looking like a guy from Reefer Madness.
Yeah, 699 00:40:53,840 --> 00:40:56,000 Speaker 1: sit out in the open where someone might distract you 700 00:40:56,600 --> 00:41:00,480 Speaker 1: into a human interaction, right, or being able to keep 701 00:41:00,520 --> 00:41:04,040 Speaker 1: tabs on, like, uh, you've been at the computer for 702 00:41:04,080 --> 00:41:07,120 Speaker 1: six hours now, what's your problem? Yeah, I'm working. Okay, 703 00:41:07,160 --> 00:41:10,000 Speaker 1: we'll keep going. Yeah. And I mean, I find that 704 00:41:10,600 --> 00:41:12,839 Speaker 1: our lives are, and I'm sure you're the same way, 705 00:41:12,880 --> 00:41:15,080 Speaker 1: they're busy enough to where, I mean, I don't have 706 00:41:15,120 --> 00:41:18,480 Speaker 1: time to do that. What, six hours at the... at 707 00:41:18,480 --> 00:41:22,319 Speaker 1: the... six and eight hours of, of funning? Oh yeah, 708 00:41:22,320 --> 00:41:24,759 Speaker 1: when we do our research and our work online. But 709 00:41:25,440 --> 00:41:28,400 Speaker 1: like, I can't play Fallout for eight hours, you know. 710 00:41:29,400 --> 00:41:35,480 Speaker 1: I have responsibilities. But the fewer responsibilities you have, I guess, 711 00:41:35,520 --> 00:41:37,600 Speaker 1: the more prone you are. I think there was... Yeah, 712 00:41:37,640 --> 00:41:39,879 Speaker 1: if you're born with a silver spoon in your mouth, 713 00:41:39,960 --> 00:41:45,440 Speaker 1: you're toast when it comes to gaming. Yeah, that's true. Um, 714 00:41:45,480 --> 00:41:49,320 Speaker 1: it says here that a study found people who are more anxious 715 00:41:49,360 --> 00:41:53,560 Speaker 1: and socially insecure appreciate the easy ways to communicate via 716 00:41:53,600 --> 00:41:56,600 Speaker 1: the social meds. But on the other hand, people who 717 00:41:56,600 --> 00:41:59,680 Speaker 1: are more organized and ambitious were at a decreased risk 718 00:42:00,239 --> 00:42:04,080 Speaker 1: of tech related addiction. Uh,
And they just use it, 719 00:42:04,120 --> 00:42:07,520 Speaker 1: you know, use it for the things they need it for. 720 00:42:08,680 --> 00:42:13,400 Speaker 1: I'd say that characterizes me. Yeah, it's a tool. Aside 721 00:42:13,480 --> 00:42:18,279 Speaker 1: from checking Twitter. What else, man, you got anything else? 722 00:42:18,400 --> 00:42:20,160 Speaker 1: I got one more thing. I just saw this good, 723 00:42:20,280 --> 00:42:23,720 Speaker 1: um, article... well, it was an article and a research 724 00:42:23,800 --> 00:42:27,120 Speaker 1: paper, Internet Addiction, colon, A Brief Summary of Research and 725 00:42:27,120 --> 00:42:34,320 Speaker 1: Practice, from Hilarie Cash, Cosette Rae, Ann Steel, and Alexander Winkler. Um, 726 00:42:34,360 --> 00:42:36,759 Speaker 1: and I just read the summation, but it's interesting. They said, 727 00:42:37,440 --> 00:42:40,200 Speaker 1: from our practical perspective, the different types of I A D, 728 00:42:41,080 --> 00:42:44,680 Speaker 1: that's Internet addiction disorder, uh, they fit into one 729 00:42:44,680 --> 00:42:48,960 Speaker 1: category due to various, uh, internet specific commonalities. So you 730 00:42:49,000 --> 00:42:53,120 Speaker 1: talk about porn addiction or gaming addiction, or any 731 00:42:53,160 --> 00:42:56,080 Speaker 1: of these various addictions, except probably social media in some 732 00:42:56,120 --> 00:43:01,359 Speaker 1: ways, because anonymity and riskless interaction are two of them. Uh, 733 00:43:01,360 --> 00:43:04,080 Speaker 1: and then commonalities in the underlying behavior, which is avoidance, 734 00:43:04,080 --> 00:43:08,320 Speaker 1: fear, pleasure and entertainment. And then the overlapping... I'm sorry, 735 00:43:08,320 --> 00:43:12,239 Speaker 1: symptoms: increased amount of time spent online, and preoccupation and 736 00:43:12,360 --> 00:43:15,200 Speaker 1: other signs of addiction.
But in the end they say, 737 00:43:15,440 --> 00:43:18,200 Speaker 1: you know, more research, more research, more research, that's what 738 00:43:18,239 --> 00:43:20,520 Speaker 1: we need. Which, I mean, of course it's coming, like 739 00:43:21,360 --> 00:43:25,600 Speaker 1: this is probably the premier addiction of the twenty first century. 740 00:43:26,719 --> 00:43:28,960 Speaker 1: The thing is, we seem to be looking at it 741 00:43:29,040 --> 00:43:30,920 Speaker 1: as less and less of an addiction and more and 742 00:43:30,960 --> 00:43:34,040 Speaker 1: more of normal life. So I don't know, maybe there 743 00:43:34,080 --> 00:43:36,560 Speaker 1: will be less study of it. I'm just gonna encourage 744 00:43:36,560 --> 00:43:39,759 Speaker 1: people to... you don't have to go out and 745 00:43:39,800 --> 00:43:42,319 Speaker 1: give up everything, but just try to spend a little 746 00:43:42,320 --> 00:43:44,640 Speaker 1: more time talking to people. Yeah, just go to 747 00:43:44,680 --> 00:43:47,960 Speaker 1: somebody and say, hey, make conversation with me, just a 748 00:43:47,960 --> 00:43:49,960 Speaker 1: little bit here and there. Or 749 00:43:50,040 --> 00:43:52,839 Speaker 1: let's get a conversation going. Pepper it into your 750 00:43:52,880 --> 00:43:55,080 Speaker 1: life here and there and see if it does not 751 00:43:55,160 --> 00:43:59,000 Speaker 1: provide reward. Yeah. Another good one that I've found at 752 00:43:59,040 --> 00:44:01,120 Speaker 1: least makes you cognizant of it is when 753 00:44:01,160 --> 00:44:04,360 Speaker 1: you are standing there waiting for that elevator or whatever 754 00:44:04,960 --> 00:44:08,279 Speaker 1: and you go to grab your phone, just think and 755 00:44:08,360 --> 00:44:11,600 Speaker 1: stop yourself. At least do it to just poke yourself 756 00:44:11,600 --> 00:44:14,640 Speaker 1: for fun. Think of it as a gun.
Yeah, and 757 00:44:14,680 --> 00:44:18,240 Speaker 1: you're gonna get tackled for waving it around in public. 758 00:44:19,640 --> 00:44:21,359 Speaker 1: I've tried to do some of this lately, too, where 759 00:44:21,360 --> 00:44:23,640 Speaker 1: I do just start talking to people, and it freaks 760 00:44:23,680 --> 00:44:26,239 Speaker 1: people out a little bit these days. Yeah, whereas 761 00:44:26,400 --> 00:44:28,879 Speaker 1: I definitely don't remember it freaking people out like when 762 00:44:28,920 --> 00:44:33,279 Speaker 1: I was in college. Yeah, I think you're right. Yeah, 763 00:44:33,360 --> 00:44:35,359 Speaker 1: it's definitely changed. And it's like, wait, what do you want? 764 00:44:35,360 --> 00:44:36,680 Speaker 1: Why are you talking to me? If you wanted to 765 00:44:36,719 --> 00:44:38,520 Speaker 1: talk to somebody who wasn't around, you had to go 766 00:44:38,600 --> 00:44:42,439 Speaker 1: to a pay phone back in your day. They still 767 00:44:42,440 --> 00:44:44,440 Speaker 1: have those. I see them from time to time. They're 768 00:44:44,480 --> 00:44:49,479 Speaker 1: neat. It's like being in a living museum. Uh, well, 769 00:44:49,520 --> 00:44:51,880 Speaker 1: I don't think either one of us has anything else. Instead, 770 00:44:52,000 --> 00:44:53,799 Speaker 1: we're going to suggest that you go on to how 771 00:44:53,800 --> 00:44:56,600 Speaker 1: stuff works dot com, type in the search bar computer 772 00:44:56,640 --> 00:44:59,040 Speaker 1: addiction if you want to know more about this. Um, 773 00:44:59,080 --> 00:45:01,879 Speaker 1: there's plenty of other stuff you can look up too. Uh, 774 00:45:01,920 --> 00:45:04,160 Speaker 1: and since I said search bar in there somewhere, it's 775 00:45:04,200 --> 00:45:09,320 Speaker 1: time for listener mail. Uh, yeah. And hey, sorry 776 00:45:09,320 --> 00:45:11,040 Speaker 1: if it was a little soapboxy on that one.
777 00:45:11,080 --> 00:45:13,800 Speaker 1: I didn't want to get too soapboxy, but I 778 00:45:13,880 --> 00:45:17,120 Speaker 1: kind of miss folks talking to folks, you know. We'll 779 00:45:17,160 --> 00:45:19,560 Speaker 1: talk to you. He just tapped me on the shoulder. 780 00:45:20,200 --> 00:45:24,400 Speaker 1: All right, I'm gonna call this, uh, sin tax, um, beef. 781 00:45:25,239 --> 00:45:27,840 Speaker 1: Hey guys, small issue I had with the sin tax episode. 782 00:45:28,080 --> 00:45:31,120 Speaker 1: In discussing the colonial American reaction to levies like the 783 00:45:31,120 --> 00:45:34,120 Speaker 1: sugar tax, you dismissed the purpose of the taxes as making 784 00:45:34,120 --> 00:45:36,840 Speaker 1: the king richer. It is a common misconception that the 785 00:45:36,880 --> 00:45:39,800 Speaker 1: taxes were imposed on the colonies arbitrarily, and this was 786 00:45:39,840 --> 00:45:44,719 Speaker 1: certainly the patriot narrative used to support independence (go Pats), 787 00:45:44,760 --> 00:45:47,400 Speaker 1: but in fact, the taxes were levied to cover the 788 00:45:47,440 --> 00:45:50,480 Speaker 1: cost of the devastating French and Indian War, which the 789 00:45:50,480 --> 00:45:55,640 Speaker 1: colonies survived only due to the British Army's resistance. Revisionist 790 00:45:55,719 --> 00:45:59,080 Speaker 1: history nowadays tends to focus on the without representation part 791 00:45:59,120 --> 00:46:01,560 Speaker 1: of the no taxation request, as well as the 792 00:46:01,560 --> 00:46:04,120 Speaker 1: effects of other laws, such as forbidding settlement in the 793 00:46:04,160 --> 00:46:07,080 Speaker 1: Appalachian regions and restriction of trade, rather than taxes alone.
794 00:46:07,280 --> 00:46:09,000 Speaker 1: But I want to clear it up because portraying King 795 00:46:09,360 --> 00:46:13,719 Speaker 1: George five is greedy ignores legitimate political motives on the 796 00:46:13,719 --> 00:46:17,040 Speaker 1: part of the British Empire, dropping ignored in the revolutionary narrative. 797 00:46:17,560 --> 00:46:22,880 Speaker 1: Did this be email come from the UK? No? Oklahoma? Okay, okay, 798 00:46:23,800 --> 00:46:26,719 Speaker 1: right so close. I love the show. Keep up a 799 00:46:26,719 --> 00:46:30,440 Speaker 1: good work, guy, Sincerely, Thomas from Oklahoma. Thanks a lot, Thomas, 800 00:46:30,440 --> 00:46:33,160 Speaker 1: thanks for setting us straight. Um. Yeah, I think we 801 00:46:33,239 --> 00:46:36,799 Speaker 1: kind of just did the Nickel sketch of the King. 802 00:46:37,160 --> 00:46:39,520 Speaker 1: I think it's pretty easy to fall into that trap. Sure, 803 00:46:39,840 --> 00:46:42,239 Speaker 1: that's what they teach us in school exactly. You want 804 00:46:42,239 --> 00:46:44,880 Speaker 1: to get pushed drawn by the King England. No, I 805 00:46:44,960 --> 00:46:50,520 Speaker 1: saw Schoolhouse Rock. That was a jerk Schoolhouse Rock with 806 00:46:50,760 --> 00:46:54,359 Speaker 1: Jack Black. No, that was school Love Rock. Oh, that's right. 807 00:46:55,680 --> 00:46:57,719 Speaker 1: If you want to get in touch with me or Chuck, 808 00:46:57,800 --> 00:46:59,960 Speaker 1: you can hang out with me on Twitter at John 809 00:47:00,040 --> 00:47:02,000 Speaker 1: Show Them Clark. You can also look us up at 810 00:47:02,160 --> 00:47:03,840 Speaker 1: s Y s K podcast. You can hang out with 811 00:47:03,960 --> 00:47:07,200 Speaker 1: Chuck on Facebook at Charles W. Chuck Bryant, or Facebook 812 00:47:07,239 --> 00:47:09,399 Speaker 1: dot com slash Stuff you Should Know. 
You can send 813 00:47:09,440 --> 00:47:11,880 Speaker 1: us an email to stuff podcast at how stuff works 814 00:47:11,920 --> 00:47:14,360 Speaker 1: dot com, and as always, hang out with us at 815 00:47:14,400 --> 00:47:16,359 Speaker 1: our home on the web, Stuff you Should Know dot 816 00:47:16,440 --> 00:47:25,520 Speaker 1: com. For more on this and thousands of other topics, 817 00:47:25,800 --> 00:47:35,920 Speaker 1: visit how stuff works dot com