1 00:00:11,840 --> 00:00:14,640 Speaker 1: Good morning, peeps, and welcome to Woke AF Daily with 2 00:00:14,720 --> 00:00:18,439 Speaker 1: me, your girl, Danielle Moody, recording not so live but 3 00:00:18,600 --> 00:00:21,960 Speaker 1: live-ish from my home in Brooklyn. See the plant babies. 4 00:00:21,960 --> 00:00:24,279 Speaker 1: Oh my god, look how gorgeous they look. The sun 5 00:00:24,280 --> 00:00:27,440 Speaker 1: is hitting them just right. Anyway, folks, I'm so excited 6 00:00:27,600 --> 00:00:30,880 Speaker 1: to be with you now five days a week on video. 7 00:00:31,000 --> 00:00:32,560 Speaker 1: I think that this is going to be a really 8 00:00:32,600 --> 00:00:35,919 Speaker 1: fun journey that we're going to take together. And I'm 9 00:00:35,960 --> 00:00:38,960 Speaker 1: excited that after riding with me on Patreon and on 10 00:00:39,000 --> 00:00:41,800 Speaker 1: audio for so long, that we can do something different. 11 00:00:42,600 --> 00:00:45,919 Speaker 1: So today, first, I want to start off with the 12 00:00:45,960 --> 00:00:49,199 Speaker 1: fact that I can't express to you how disappointed I 13 00:00:49,200 --> 00:00:53,640 Speaker 1: am in this administration. And I say that not because 14 00:00:53,880 --> 00:00:56,000 Speaker 1: I don't think that they have good intentions, but it's 15 00:00:56,040 --> 00:00:58,560 Speaker 1: because the road to hell is paved with good intentions. 16 00:00:58,880 --> 00:01:04,839 Speaker 1: It's because we recognize that, you know, having essentially Congress 17 00:01:06,000 --> 00:01:10,000 Speaker 1: and the White House isn't enough, that you actually have 18 00:01:10,040 --> 00:01:12,360 Speaker 1: to have the will, you actually have to have the 19 00:01:12,400 --> 00:01:15,520 Speaker 1: fight within you.
And I'm tired of us being fed 20 00:01:15,959 --> 00:01:18,880 Speaker 1: these lines like don't let the perfect get in the 21 00:01:18,920 --> 00:01:21,800 Speaker 1: way of the good, or something is better than nothing, 22 00:01:21,920 --> 00:01:24,319 Speaker 1: or this is historic even though it was gutted of 23 00:01:24,400 --> 00:01:26,920 Speaker 1: all of the things that people need, right? And I'm 24 00:01:26,959 --> 00:01:30,080 Speaker 1: talking about the Infrastructure Bill, and I'm talking about all 25 00:01:30,120 --> 00:01:33,160 Speaker 1: of the compromises, and by compromises, I actually mean the 26 00:01:33,200 --> 00:01:36,400 Speaker 1: way in which Joe Manchin and Kyrsten Sinema were able 27 00:01:36,400 --> 00:01:39,640 Speaker 1: to throw their weight around and essentially show us firsthand 28 00:01:39,640 --> 00:01:42,119 Speaker 1: that we no longer live in a democracy, that two 29 00:01:42,200 --> 00:01:45,319 Speaker 1: people can halt progress for the rest of the country 30 00:01:45,360 --> 00:01:48,840 Speaker 1: just because they want to. They love the attention, they 31 00:01:48,880 --> 00:01:51,120 Speaker 1: love the money that is being thrown at them, they 32 00:01:51,160 --> 00:01:53,639 Speaker 1: love all of the headlines. But what they don't love, friends, 33 00:01:53,800 --> 00:01:56,840 Speaker 1: is America. What they don't love is progress, and the 34 00:01:56,880 --> 00:01:59,760 Speaker 1: fact is that we continue to refer to them as moderates 35 00:02:00,040 --> 00:02:04,240 Speaker 1: when they are not. They are... they're not insurrectionists, so 36 00:02:04,480 --> 00:02:07,960 Speaker 1: I won't say that, even though we have insurrectionists sitting 37 00:02:08,040 --> 00:02:10,640 Speaker 1: in Congress right now, none of whom have been 38 00:02:10,720 --> 00:02:14,200 Speaker 1: arrested or indicted or called before the one-six Commission. 39 00:02:14,760 --> 00:02:17,200 Speaker 1: But they are obstructionists.
And that's what I used to 40 00:02:17,240 --> 00:02:20,600 Speaker 1: refer to Mitch McConnell as, as well as the Grim Reaper. 41 00:02:20,919 --> 00:02:24,440 Speaker 1: They are obstructionists. They don't want progress so long as 42 00:02:24,440 --> 00:02:27,000 Speaker 1: it interferes with their bag. And what do I mean 43 00:02:27,000 --> 00:02:30,399 Speaker 1: by their bag? Their money, right, their pockets. Like, 44 00:02:30,440 --> 00:02:34,480 Speaker 1: so long as they are fed, so long as their 45 00:02:34,520 --> 00:02:37,679 Speaker 1: families are bringing in millions, then they are good. The 46 00:02:37,840 --> 00:02:40,760 Speaker 1: disrespect that Kyrsten Sinema, and I'll keep saying it over 47 00:02:40,800 --> 00:02:43,799 Speaker 1: and over again, the disrespect that she showed the Senate 48 00:02:43,840 --> 00:02:46,799 Speaker 1: with her fucking denim jacket and the flipping of her 49 00:02:46,800 --> 00:02:50,440 Speaker 1: hair like she is some fucking high schooler was embarrassing. 50 00:02:50,760 --> 00:02:53,120 Speaker 1: And it was embarrassing for a couple of reasons, because 51 00:02:53,120 --> 00:02:56,560 Speaker 1: it showed her absolute and total privilege, right, as a 52 00:02:56,560 --> 00:02:59,040 Speaker 1: white woman that gets to do and say whatever it 53 00:02:59,120 --> 00:03:01,639 Speaker 1: is that she wants to do. Black people aren't even 54 00:03:01,680 --> 00:03:04,799 Speaker 1: allowed to walk down the fucking street, right? Like, everything 55 00:03:04,800 --> 00:03:07,000 Speaker 1: that we do is criticized, from the top of our 56 00:03:07,040 --> 00:03:09,240 Speaker 1: heads to the bottom of our toes, to the way 57 00:03:09,240 --> 00:03:12,000 Speaker 1: in which we straighten up our backs, or if we slouch, 58 00:03:12,240 --> 00:03:14,600 Speaker 1: or if we decide not to brush our hair that day, 59 00:03:14,800 --> 00:03:17,000 Speaker 1: or if we decide to do the most.
It is 60 00:03:17,040 --> 00:03:20,800 Speaker 1: always criticized, because everything that is on and comes from 61 00:03:20,840 --> 00:03:24,880 Speaker 1: a black body is politicized. So the very idea that 62 00:03:25,000 --> 00:03:29,160 Speaker 1: you can just show just such, you know, casualness, just 63 00:03:29,280 --> 00:03:33,800 Speaker 1: such disregard at this critical time. And I know, folks, 64 00:03:33,880 --> 00:03:36,640 Speaker 1: you're like, oh, the clothes are not the problem, like 65 00:03:36,680 --> 00:03:39,560 Speaker 1: she's the problem. But it's the entire package, to me, 66 00:03:39,800 --> 00:03:43,080 Speaker 1: that is the problem, and it's the package of white privilege. 67 00:03:44,040 --> 00:03:45,600 Speaker 1: The other thing that I want to discuss is that 68 00:03:45,680 --> 00:03:48,360 Speaker 1: tomorrow is one of the most consequential elections that we 69 00:03:48,400 --> 00:03:50,920 Speaker 1: will see in Virginia. And look, I'm tired of saying 70 00:03:50,960 --> 00:03:53,960 Speaker 1: the word consequential too. You know why? Because every fucking 71 00:03:53,960 --> 00:03:57,800 Speaker 1: election is consequential. And what we saw following twenty twenty 72 00:03:57,880 --> 00:04:02,720 Speaker 1: is that, guess what, elections matter, right, and how big 73 00:04:02,760 --> 00:04:06,960 Speaker 1: of a margin we have actually fucking matters. And so 74 00:04:07,280 --> 00:04:10,960 Speaker 1: here's the thing. You have Terry McAuliffe, and you have 75 00:04:11,280 --> 00:04:14,520 Speaker 1: Youngkin, right, who are running in Virginia. You have a 76 00:04:14,520 --> 00:04:18,080 Speaker 1: man that is essentially Trump lite. He is everything that 77 00:04:18,120 --> 00:04:21,480 Speaker 1: I said was going to happen. Right.
Once you had 78 00:04:21,720 --> 00:04:25,560 Speaker 1: somebody who could speak a little smoother, maybe understood the 79 00:04:25,600 --> 00:04:28,160 Speaker 1: system a little better, was an easier pill to swallow, 80 00:04:28,360 --> 00:04:31,560 Speaker 1: but still had all of the same ideology and sentiments 81 00:04:31,560 --> 00:04:34,279 Speaker 1: of Donald Trump. Right. Put him up in a suit, 82 00:04:34,400 --> 00:04:37,039 Speaker 1: smack a smile on his face, have him not call 83 00:04:37,160 --> 00:04:40,320 Speaker 1: people outside of their names, and guess what? You got yourself 84 00:04:40,360 --> 00:04:44,400 Speaker 1: a candidate. This man wants to ban books in school, 85 00:04:44,520 --> 00:04:48,120 Speaker 1: Toni Morrison, to be exact, right? He wants to ban 86 00:04:48,240 --> 00:04:51,000 Speaker 1: books in school. He doesn't want to have mask mandates, 87 00:04:51,160 --> 00:04:53,960 Speaker 1: he doesn't want abortion, he doesn't want all of the 88 00:04:53,960 --> 00:04:57,800 Speaker 1: things that Republicans are going after these days. If this 89 00:04:57,839 --> 00:05:01,960 Speaker 1: man wins, he is setting up his place, right, and 90 00:05:02,120 --> 00:05:05,760 Speaker 1: setting Trumpism in stone. Because, again, what do 91 00:05:05,960 --> 00:05:08,800 Speaker 1: people need to understand? So long as you are not 92 00:05:08,920 --> 00:05:12,240 Speaker 1: spinning hot shit or tweeting hot shit like Donald Trump 93 00:05:12,279 --> 00:05:16,320 Speaker 1: did for four years, then you'll be palatable, right. And 94 00:05:16,640 --> 00:05:18,760 Speaker 1: what we knew to be true is that Donald Trump 95 00:05:18,800 --> 00:05:22,520 Speaker 1: was palatable to seventy five million Americans, right, because that's 96 00:05:22,520 --> 00:05:26,080 Speaker 1: how many people voted for him.
So just imagine a smoother, 97 00:05:26,600 --> 00:05:30,960 Speaker 1: more refined person, but with the same ideology, running for 98 00:05:31,040 --> 00:05:34,279 Speaker 1: office, and them winning. The margins are closing in Virginia, 99 00:05:34,320 --> 00:05:37,599 Speaker 1: and frankly, I don't fucking understand why. I have lived 100 00:05:37,600 --> 00:05:40,160 Speaker 1: in Virginia. I lived in Arlington for many years. I've 101 00:05:40,160 --> 00:05:43,839 Speaker 1: lived in Washington, DC for well over a decade, and so 102 00:05:43,920 --> 00:05:46,719 Speaker 1: I don't understand what is going on in that state 103 00:05:46,800 --> 00:05:49,640 Speaker 1: right now. But I plead with you, if you have 104 00:05:49,839 --> 00:05:53,160 Speaker 1: friends that live in Virginia, if you live in Virginia, 105 00:05:53,400 --> 00:05:57,839 Speaker 1: get out and fucking vote. Get them to vote. Get 106 00:05:57,880 --> 00:06:01,200 Speaker 1: at least ten people with you to go to the polls. 107 00:06:01,360 --> 00:06:04,120 Speaker 1: Offer to shuttle people back and forth, do whatever it 108 00:06:04,240 --> 00:06:07,599 Speaker 1: is that you can right now, because we are running 109 00:06:07,600 --> 00:06:09,880 Speaker 1: out of time and this is going to set the 110 00:06:09,960 --> 00:06:12,720 Speaker 1: tone for what midterms will look like. And I already 111 00:06:12,720 --> 00:06:15,560 Speaker 1: can tell you Democrats don't have a whole lot to 112 00:06:15,600 --> 00:06:19,240 Speaker 1: fucking offer. They really don't. And that's the problem, is 113 00:06:19,279 --> 00:06:22,760 Speaker 1: that when you look over the last, you know, ten months, 114 00:06:22,760 --> 00:06:25,240 Speaker 1: we're getting ready to look at a year since 115 00:06:25,279 --> 00:06:28,320 Speaker 1: the election. When you look at that time, right, and 116 00:06:28,360 --> 00:06:30,960 Speaker 1: you realize, well, how is my life different?
What have 117 00:06:31,080 --> 00:06:34,640 Speaker 1: they done? They've done really nothing. And yes, they can 118 00:06:34,720 --> 00:06:38,000 Speaker 1: now say that we passed this infrastructure bill, but what 119 00:06:38,040 --> 00:06:40,800 Speaker 1: will it do? Is it going to provide the American 120 00:06:40,800 --> 00:06:42,920 Speaker 1: people with the things that they need, or the things 121 00:06:42,920 --> 00:06:45,400 Speaker 1: that Joe Manchin said that you should have, or what 122 00:06:45,560 --> 00:06:48,760 Speaker 1: is just good enough? Right? Because that's where we are. 123 00:06:48,960 --> 00:06:51,440 Speaker 1: We're at a time when we need robust change and 124 00:06:51,560 --> 00:06:55,240 Speaker 1: big thinkers and changemakers, right, and warriors that are 125 00:06:55,279 --> 00:06:57,520 Speaker 1: on the field and recognize that we are at war. 126 00:06:57,960 --> 00:07:00,280 Speaker 1: We are at war with a political party that is 127 00:07:00,279 --> 00:07:03,159 Speaker 1: a group of insurrectionists that want to overthrow our government. 128 00:07:03,320 --> 00:07:05,760 Speaker 1: And they are doing it, right? They are doing it 129 00:07:05,839 --> 00:07:08,680 Speaker 1: each and every day. Each and every day that Merrick 130 00:07:08,720 --> 00:07:11,760 Speaker 1: Garland decides to sit on his fucking hands and do 131 00:07:11,880 --> 00:07:17,160 Speaker 1: absolutely nothing, he is telling them, next time, make it better, 132 00:07:17,440 --> 00:07:20,440 Speaker 1: make it bigger, and make it stick, because there is 133 00:07:20,560 --> 00:07:23,720 Speaker 1: no responsibility, there is no accountability for any of the 134 00:07:23,760 --> 00:07:28,160 Speaker 1: actions that have been taken. We know, we know that 135 00:07:28,200 --> 00:07:31,280 Speaker 1: there are insurrectionists sitting inside of Congress right now.
We 136 00:07:31,400 --> 00:07:34,520 Speaker 1: know that there are people that gave these people 137 00:07:34,560 --> 00:07:37,280 Speaker 1: the blueprints to the Capitol. We know all of 138 00:07:37,320 --> 00:07:40,360 Speaker 1: those things. But none of those people are in jail. None. 139 00:07:41,160 --> 00:07:43,200 Speaker 1: And that should make every single one of us sick. 140 00:07:43,240 --> 00:07:46,520 Speaker 1: And I'm so fucking tired of being told, oh, well, 141 00:07:46,560 --> 00:07:49,440 Speaker 1: we got to follow the process. Republicans never follow the 142 00:07:49,480 --> 00:07:52,680 Speaker 1: fucking process. They barely ever follow the fucking law. And 143 00:07:52,720 --> 00:07:54,800 Speaker 1: we know that over the four years of Donald Trump, 144 00:07:54,800 --> 00:07:56,920 Speaker 1: they didn't follow the law at all. They thought the 145 00:07:56,960 --> 00:08:01,960 Speaker 1: Constitution was a fucking suggestion, right? And so here 146 00:08:02,000 --> 00:08:05,880 Speaker 1: we are with Democrats with all their hand-wringing, right, 147 00:08:06,160 --> 00:08:09,600 Speaker 1: with all of their "Oh, history will remember you." How? 148 00:08:09,920 --> 00:08:11,960 Speaker 1: When they're gutting the history books and you're not doing 149 00:08:12,000 --> 00:08:15,400 Speaker 1: dick about that? So what history book is going to 150 00:08:15,440 --> 00:08:20,160 Speaker 1: remember you? You'll be written out, right, or you'll be banned. 151 00:08:20,960 --> 00:08:24,400 Speaker 1: We already have, in schools across this country, curriculums that are 152 00:08:24,640 --> 00:08:27,800 Speaker 1: erasing the insurrection. Oh, that's too divisive, so we don't 153 00:08:27,800 --> 00:08:29,880 Speaker 1: want to talk about it.
Oh, it was just a 154 00:08:29,880 --> 00:08:32,440 Speaker 1: group of tourists that got out of control, so we're 155 00:08:32,480 --> 00:08:34,160 Speaker 1: not going to talk about it, so we won't write 156 00:08:34,200 --> 00:08:37,439 Speaker 1: it up. So what history exactly is going to remember 157 00:08:37,480 --> 00:08:43,360 Speaker 1: these people? Nothing. None. You know, I was in LA 158 00:08:43,400 --> 00:08:45,720 Speaker 1: at the end of last week, and I have to 159 00:08:45,760 --> 00:08:47,880 Speaker 1: tell you that I was sitting around with folks because, 160 00:08:47,880 --> 00:08:50,800 Speaker 1: as I mentioned to you all, I'm a member, a 161 00:08:50,880 --> 00:08:53,559 Speaker 1: board member of the Ms. Foundation for Women, that is 162 00:08:53,600 --> 00:08:59,120 Speaker 1: specifically working on investing in organizations that are focused on 163 00:08:59,200 --> 00:09:04,280 Speaker 1: lifting up black and brown girls, right, and women. So, 164 00:09:04,520 --> 00:09:07,760 Speaker 1: organizations that support this work doing a myriad of things. 165 00:09:08,280 --> 00:09:10,120 Speaker 1: And I'm sitting there and I'm talking to folks and 166 00:09:10,160 --> 00:09:12,800 Speaker 1: they're just like, yo, this country is going to hell 167 00:09:12,880 --> 00:09:16,720 Speaker 1: in a fucking handbasket. But I live in California, so 168 00:09:16,800 --> 00:09:20,040 Speaker 1: for right now, for right now, I feel safe. But 169 00:09:20,080 --> 00:09:22,280 Speaker 1: I got to tell you that as I look out 170 00:09:22,360 --> 00:09:24,720 Speaker 1: at the rest of the country, I realize that we 171 00:09:24,800 --> 00:09:28,880 Speaker 1: are in deep, deep trouble. And that's the thing, is 172 00:09:28,920 --> 00:09:31,960 Speaker 1: that I can say, right now, okay, I'm sitting in 173 00:09:32,000 --> 00:09:35,400 Speaker 1: New York and I can feel safe-ish, but for 174 00:09:35,440 --> 00:09:40,160 Speaker 1: how long?
Because the next time that a Donald Trump 175 00:09:40,280 --> 00:09:43,240 Speaker 1: or a Trump-like figure becomes president of the United States, 176 00:09:43,400 --> 00:09:45,400 Speaker 1: they are going to use every single bit of power 177 00:09:45,480 --> 00:09:48,839 Speaker 1: that they have to drain blue states, right, which is 178 00:09:48,880 --> 00:09:52,480 Speaker 1: funny, because we subsidize red states. Let's just be clear. 179 00:09:52,640 --> 00:09:55,400 Speaker 1: They don't make any fucking money or produce a goddamn thing, 180 00:09:55,760 --> 00:09:59,480 Speaker 1: right, except obesity and undereducation. Let me not. Let 181 00:09:59,520 --> 00:10:03,720 Speaker 1: me not. That was rude. Let me not. But I'm 182 00:10:03,760 --> 00:10:07,800 Speaker 1: just so angry these days. I'm just so angry and 183 00:10:08,000 --> 00:10:12,280 Speaker 1: frustrated at the fact that all we did was press pause. 184 00:10:13,120 --> 00:10:15,120 Speaker 1: All we did was press pause, while the rest 185 00:10:15,120 --> 00:10:17,959 Speaker 1: of us are looking around and realizing that we are 186 00:10:18,040 --> 00:10:25,360 Speaker 1: in mortal danger, right? Mortal danger. Right? Just imagine what 187 00:10:25,440 --> 00:10:28,360 Speaker 1: the slate of presidential candidates is going to look like 188 00:10:28,400 --> 00:10:30,880 Speaker 1: in twenty twenty four, if we make it there, right? 189 00:10:31,520 --> 00:10:39,120 Speaker 1: Ron DeSantis, Trump, Youngkin, right, Abbott. Like, these are 190 00:10:39,160 --> 00:10:41,320 Speaker 1: the people that are going to be running to be 191 00:10:41,360 --> 00:10:44,520 Speaker 1: president of the United States.
And frankly, although Joe Biden 192 00:10:44,640 --> 00:10:49,760 Speaker 1: is nice and he's comforting, right, when the people look 193 00:10:49,800 --> 00:10:53,640 Speaker 1: around and he hasn't offered them anything, and then, frankly, 194 00:10:53,679 --> 00:10:55,719 Speaker 1: when we lose the House and the Senate, then 195 00:10:55,800 --> 00:10:58,640 Speaker 1: you can't offer us anything. You can't offer us any 196 00:10:58,679 --> 00:11:01,440 Speaker 1: fucking judges that are going to combat the over three 197 00:11:01,480 --> 00:11:04,520 Speaker 1: hundred judges that are now across the country. And now, 198 00:11:04,760 --> 00:11:07,600 Speaker 1: when Republicans get hold of the White House again and 199 00:11:07,640 --> 00:11:10,000 Speaker 1: get hold of Congress again, what do you think is 200 00:11:10,000 --> 00:11:16,160 Speaker 1: going to happen? Every single law, every single law that 201 00:11:16,240 --> 00:11:21,360 Speaker 1: you think is laid down in cement, isn't. That's the 202 00:11:21,400 --> 00:11:23,480 Speaker 1: thing that I wish that Democrats would get on their 203 00:11:23,480 --> 00:11:27,600 Speaker 1: fucking bullhorn and be screaming about. But they're not, right? 204 00:11:27,840 --> 00:11:31,920 Speaker 1: We are in danger. Brown versus the Board of Education 205 00:11:32,080 --> 00:11:35,360 Speaker 1: is in danger. Plessy versus Ferguson is in danger. 206 00:11:35,800 --> 00:11:44,080 Speaker 1: Marriage equality is in danger. Why? Because we refuse to 207 00:11:44,080 --> 00:11:47,440 Speaker 1: take the action to end the filibuster, to strengthen our 208 00:11:47,520 --> 00:11:51,720 Speaker 1: voting rights, to strengthen our courts, to strengthen anything about 209 00:11:51,720 --> 00:11:55,439 Speaker 1: our democracy, and just hope, you know, that shaming Republicans 210 00:11:55,520 --> 00:11:59,600 Speaker 1: is going to do something. You cannot shame the devil.
211 00:12:00,240 --> 00:12:03,160 Speaker 1: That is the thing that folks need to understand. And 212 00:12:03,240 --> 00:12:06,520 Speaker 1: so we're going into midterm elections, folks, on a hope 213 00:12:06,559 --> 00:12:11,559 Speaker 1: and a fucking prayer, right, hoping that people still have 214 00:12:11,640 --> 00:12:15,040 Speaker 1: Trump residue. Because, you know, what, again, Joe Biden hasn't 215 00:12:15,040 --> 00:12:19,640 Speaker 1: been doing: he hasn't been taking every single administration member 216 00:12:19,679 --> 00:12:22,640 Speaker 1: from the Trump administration and bringing them out on a 217 00:12:22,679 --> 00:12:25,560 Speaker 1: perp walk for all of the fucking illegal and criminal-ass 218 00:12:25,640 --> 00:12:28,240 Speaker 1: shit that they have done. Right? He is not 219 00:12:28,360 --> 00:12:31,160 Speaker 1: reminding the American people each and every day about the 220 00:12:31,200 --> 00:12:34,800 Speaker 1: insurrectionists and that we are holding these people accountable, because 221 00:12:34,840 --> 00:12:39,720 Speaker 1: our country, right, needs to reflect strength, not only internally, 222 00:12:39,840 --> 00:12:43,400 Speaker 1: and know that people are held responsible, but externally, right, 223 00:12:43,840 --> 00:12:46,440 Speaker 1: to our allies, so that they can see that while 224 00:12:46,520 --> 00:12:51,600 Speaker 1: democracy may falter, it will not fall in America. But 225 00:12:51,720 --> 00:12:55,319 Speaker 1: those are just slogans, because we are taking absolutely no 226 00:12:55,440 --> 00:13:00,920 Speaker 1: action to make sure that that is certain. Folks, today, 227 00:13:01,040 --> 00:13:04,280 Speaker 1: I'm very excited about the conversation that is coming up 228 00:13:04,400 --> 00:13:08,400 Speaker 1: next with Alex J.
Packer, who is a psychologist and 229 00:13:08,440 --> 00:13:13,000 Speaker 1: wrote the book Slaying Digital Dragons about our kids' obsession, 230 00:13:13,040 --> 00:13:16,360 Speaker 1: and, dare I say, us as well, our obsession and, 231 00:13:17,000 --> 00:13:22,920 Speaker 1: you know, addiction, frankly, to the digital world. Right as 232 00:13:23,480 --> 00:13:28,080 Speaker 1: Mark Zuckerberg is announcing the metaverse, right, and essentially turning our 233 00:13:28,120 --> 00:13:33,320 Speaker 1: real lives into Sims, he has written a book that 234 00:13:33,440 --> 00:13:37,680 Speaker 1: is about how we need to decompress, that we need 235 00:13:37,720 --> 00:13:41,640 Speaker 1: to take breaks from social media. That it is not 236 00:13:41,760 --> 00:13:45,920 Speaker 1: only a danger to ourselves physically and emotionally, right, with 237 00:13:46,000 --> 00:13:50,280 Speaker 1: the slouching over and the thumbs, right, and the arthritis 238 00:13:50,280 --> 00:13:52,160 Speaker 1: and the eyes and all of the things that are 239 00:13:52,160 --> 00:13:56,920 Speaker 1: going wrong physically, but that our addiction to our devices 240 00:13:57,360 --> 00:14:02,199 Speaker 1: is also thwarting our democracy, right, our engagement, our ability 241 00:14:02,200 --> 00:14:05,400 Speaker 1: to have empathy for one another. I'll ask him a question, 242 00:14:05,440 --> 00:14:08,400 Speaker 1: and I will ask this of all of you. I 243 00:14:08,520 --> 00:14:11,520 Speaker 1: felt that with the advent of social media and the 244 00:14:11,559 --> 00:14:15,080 Speaker 1: ability to connect to people, that it would increase our 245 00:14:15,160 --> 00:14:18,280 Speaker 1: empathy, because we could see, right, we would see these 246 00:14:18,400 --> 00:14:22,560 Speaker 1: videos, we collectively watched George Floyd's murder, right? Like, we 247 00:14:22,600 --> 00:14:26,160 Speaker 1: could see these things happening, and that that would invoke empathy.
248 00:14:26,360 --> 00:14:30,560 Speaker 1: Alex says the absolute opposite is actually happening, and I 249 00:14:30,600 --> 00:14:33,280 Speaker 1: want to know what you think. What do you remember 250 00:14:33,320 --> 00:14:38,040 Speaker 1: your initial thoughts being about social media and, you know, connectivity 251 00:14:38,040 --> 00:14:41,040 Speaker 1: and building community in the digital space, what you thought 252 00:14:41,040 --> 00:14:43,920 Speaker 1: about it ten years ago and what you think about 253 00:14:43,960 --> 00:14:47,560 Speaker 1: it now? Do you force yourself to take breaks? Do 254 00:14:47,640 --> 00:14:52,160 Speaker 1: you have, you know, device-free days? Is that even possible? Right? 255 00:14:52,480 --> 00:14:55,200 Speaker 1: Is it possible to leave your home without your phone? 256 00:14:55,400 --> 00:14:57,800 Speaker 1: Or do you just shut off all of your apps 257 00:14:57,800 --> 00:15:00,040 Speaker 1: for the day? I want to know how you 258 00:15:00,200 --> 00:15:03,720 Speaker 1: are putting yourself, if at all, on any type of 259 00:15:04,160 --> 00:15:07,120 Speaker 1: device diet, so that you can take a break, so 260 00:15:07,160 --> 00:15:10,480 Speaker 1: that you don't have a breakdown. So that conversation is 261 00:15:10,520 --> 00:15:12,520 Speaker 1: coming up next, and I'd love to hear from you, 262 00:15:12,840 --> 00:15:16,680 Speaker 1: folks. Again, I am so very excited about making this 263 00:15:16,800 --> 00:15:20,280 Speaker 1: journey with all of you into video, and I know 264 00:15:20,360 --> 00:15:24,080 Speaker 1: that we will have much more exciting and engaging things 265 00:15:24,160 --> 00:15:26,920 Speaker 1: to produce for all of you patrons. And so I 266 00:15:26,960 --> 00:15:30,040 Speaker 1: just, again, want to say thank you, thank you, 267 00:15:30,120 --> 00:15:33,280 Speaker 1: thank you for being a part of this journey.
Coming 268 00:15:33,360 --> 00:15:43,120 Speaker 1: up next is my conversation with Alex J. Packer. Folks, 269 00:15:43,240 --> 00:15:46,200 Speaker 1: I am very excited to welcome to Woke AF for 270 00:15:46,400 --> 00:15:50,200 Speaker 1: the very first time psychologist Alex J. Packer, who is 271 00:15:50,240 --> 00:15:54,560 Speaker 1: the author of a new book, Slaying Digital Dragons: Tips 272 00:15:54,560 --> 00:15:58,600 Speaker 1: and Tools for Protecting Your Body, Brain, Psyche, and Thumbs 273 00:15:58,600 --> 00:16:02,880 Speaker 1: from the Digital Dark Side. Alex, you talk about the 274 00:16:02,920 --> 00:16:07,760 Speaker 1: fact that a year into pandemic life, or I guess 275 00:16:07,840 --> 00:16:10,760 Speaker 1: we are a year plus into that time, that 276 00:16:10,880 --> 00:16:14,360 Speaker 1: the amount of time that kids and teens have spent 277 00:16:14,640 --> 00:16:20,640 Speaker 1: on their devices, whether it's iPads, iPhones, you know, computers, 278 00:16:21,520 --> 00:16:24,880 Speaker 1: has reached what you were saying are shocking levels. 279 00:16:24,920 --> 00:16:29,080 Speaker 1: I think it's shocking as well that eight, ten, and 280 00:16:29,200 --> 00:16:33,960 Speaker 1: even twelve hours or more a day is no longer 281 00:16:34,040 --> 00:16:38,480 Speaker 1: considered unusual. I mean, when I think 282 00:16:38,520 --> 00:16:40,800 Speaker 1: eight, ten, and twelve hours, I think about the amount 283 00:16:40,800 --> 00:16:44,520 Speaker 1: of sleep that young people should be getting. How is 284 00:16:44,560 --> 00:16:47,400 Speaker 1: it that they can manage to be on devices eight, 285 00:16:47,840 --> 00:16:53,440 Speaker 1: ten, and twelve hours a day? Many of them don't manage. 286 00:16:53,960 --> 00:16:57,920 Speaker 1: And that's what motivated me to write the book.
You know, 287 00:16:58,000 --> 00:17:01,320 Speaker 1: I want to say up front, there are many teens who 288 00:17:01,480 --> 00:17:06,679 Speaker 1: use these devices as a tool in their lives and 289 00:17:06,800 --> 00:17:12,639 Speaker 1: they exercise control over them. And the digital world is 290 00:17:12,680 --> 00:17:17,719 Speaker 1: a miracle, but it's also a monster. And I am 291 00:17:17,760 --> 00:17:24,200 Speaker 1: increasingly concerned about the impact on growing bodies and brains 292 00:17:24,240 --> 00:17:30,800 Speaker 1: and psyches and social beings of technology, and particularly big tech, 293 00:17:31,320 --> 00:17:35,800 Speaker 1: and what it's doing to young people and the foundational 294 00:17:35,920 --> 00:17:40,119 Speaker 1: pillars of our society. So let's start out with the 295 00:17:40,160 --> 00:17:42,679 Speaker 1: fact that, you know, I don't have 296 00:17:42,800 --> 00:17:46,000 Speaker 1: kids myself, but I have many friends that do. And 297 00:17:46,280 --> 00:17:50,480 Speaker 1: I'm seeing children as young as one, two years old 298 00:17:51,000 --> 00:17:55,920 Speaker 1: that are on screen time. Parents hand them any 299 00:17:56,040 --> 00:18:00,359 Speaker 1: number of devices as a way to pacify them. Right, 300 00:18:00,440 --> 00:18:05,320 Speaker 1: it has become the new pacifier. At those very 301 00:18:05,359 --> 00:18:10,600 Speaker 1: young ages, the brain is rapidly developing. What makes 302 00:18:10,720 --> 00:18:15,679 Speaker 1: that particularly dangerous for the younger set? Well, for the 303 00:18:15,800 --> 00:18:21,360 Speaker 1: younger kids, they are experiencing life through a tiny screen, 304 00:18:22,359 --> 00:18:26,480 Speaker 1: and it doesn't have the same tactile or emotive 305 00:18:27,280 --> 00:18:33,639 Speaker 1: bonding elements that face-to-face interactions with their caregivers, 306 00:18:33,720 --> 00:18:40,000 Speaker 1: with siblings, would have.
It's a very narrow experience. And 307 00:18:40,440 --> 00:18:47,800 Speaker 1: they're not out in sandboxes, they're not playing with physical toys. 308 00:18:47,840 --> 00:18:51,800 Speaker 1: Their eyes are glued to a screen. And, you know, 309 00:18:51,920 --> 00:18:55,119 Speaker 1: I think of this a bit like, you know, the 310 00:18:56,000 --> 00:19:00,040 Speaker 1: story about the frogs? If you put them in a 311 00:19:00,040 --> 00:19:04,480 Speaker 1: pot of water. Yeah, yeah. They won't jump out when 312 00:19:04,560 --> 00:19:09,040 Speaker 1: it's turned up to boiling. And for kids today, they 313 00:19:09,040 --> 00:19:15,960 Speaker 1: are simply immersed in digital devices practically from birth, and 314 00:19:16,440 --> 00:19:20,840 Speaker 1: there's no conscious decision about it. And by the time 315 00:19:20,880 --> 00:19:28,200 Speaker 1: they're eight, ten, twelve, eighteen, these devices have taken over 316 00:19:28,320 --> 00:19:34,800 Speaker 1: their lives. Is it that not only are they seeing 317 00:19:34,920 --> 00:19:39,439 Speaker 1: life through this tiny screen, but, as a psychologist, is 318 00:19:39,480 --> 00:19:45,280 Speaker 1: it also really harming their ability to make connections? How 319 00:19:46,200 --> 00:19:50,280 Speaker 1: is this kind of amped-up amount of screen time? 320 00:19:50,800 --> 00:19:52,840 Speaker 1: Because, you know, on one hand, I'll say this: on 321 00:19:52,840 --> 00:19:58,840 Speaker 1: one hand, I was so thankful during quarantine that, my God, 322 00:19:58,840 --> 00:20:02,160 Speaker 1: this was happening at this point in time and 323 00:20:02,280 --> 00:20:07,480 Speaker 1: not fifteen, twenty years prior, right, where we had FaceTime 324 00:20:07,920 --> 00:20:12,840 Speaker 1: and Zoom and all of these other opportunities to connect 325 00:20:13,000 --> 00:20:16,199 Speaker 1: outside of physical touch.
I thought that that in and 326 00:20:16,240 --> 00:20:21,879 Speaker 1: of itself may have actually saved people's lives, because, you know, depression, anxiety, 327 00:20:21,920 --> 00:20:25,000 Speaker 1: and that kind of isolation, we know, was causing and 328 00:20:25,080 --> 00:20:28,359 Speaker 1: triggering a lot of mental health crises. So in that, 329 00:20:28,560 --> 00:20:31,760 Speaker 1: in that vein, I think, my goodness, to your point, 330 00:20:31,760 --> 00:20:37,000 Speaker 1: what a miracle, right? On the other hand, for those 331 00:20:37,119 --> 00:20:42,240 Speaker 1: of us, I came of age in a time 332 00:20:42,359 --> 00:20:46,320 Speaker 1: where I had both the sandbox, right, and then 333 00:20:46,680 --> 00:20:50,280 Speaker 1: was a child and a teenager in 334 00:20:50,320 --> 00:20:52,960 Speaker 1: the development of the Internet, so I was able to 335 00:20:53,160 --> 00:20:56,320 Speaker 1: really manage both of those things. As a psychologist, what 336 00:20:56,440 --> 00:21:00,800 Speaker 1: are we seeing in terms of how difficult it is, 337 00:21:00,920 --> 00:21:04,080 Speaker 1: or what the differences are, in being able to connect 338 00:21:04,400 --> 00:21:09,480 Speaker 1: if a screen is your first mode of contact? Well, 339 00:21:09,560 --> 00:21:15,840 Speaker 1: the main difference is that these devices are hijacking your being. 340 00:21:16,840 --> 00:21:21,280 Speaker 1: You know, people often say to me, well, you know, teenagers, 341 00:21:21,920 --> 00:21:26,280 Speaker 1: it's a time of passion and compulsive behavior. And how 342 00:21:26,280 --> 00:21:29,920 Speaker 1: are these devices? How is social media any different from 343 00:21:30,640 --> 00:21:35,040 Speaker 1: comic books or rock and roll, or, you know, spending 344 00:21:35,119 --> 00:21:41,879 Speaker 1: hours tying up the family landline? And the difference is:
Those 345 00:21:42,240 --> 00:21:47,120 Speaker 1: were tools, those were choices teens made. You choose to 346 00:21:47,119 --> 00:21:51,240 Speaker 1: pick up a comic book, you choose to spend three 347 00:21:51,280 --> 00:21:57,080 Speaker 1: hours a day playing drums or shooting hoops. These devices 348 00:21:57,240 --> 00:22:02,159 Speaker 1: now are controlling us. They're not our tools. We've become 349 00:22:02,240 --> 00:22:08,280 Speaker 1: their tools. And the developers of social media platforms and 350 00:22:08,400 --> 00:22:12,639 Speaker 1: many video games and shopping sites, they are using the 351 00:22:12,680 --> 00:22:19,800 Speaker 1: most sophisticated tools and knowledge of human nature and neurochemistry 352 00:22:19,840 --> 00:22:26,440 Speaker 1: and psychology to control us, manipulate us, influence our behavior, 353 00:22:26,640 --> 00:22:30,760 Speaker 1: and even predict our behavior. You know, I think that 354 00:22:30,840 --> 00:22:34,040 Speaker 1: one of the most 355 00:22:34,080 --> 00:22:38,920 Speaker 1: alarming things that came out of the Facebook whistleblower, right, 356 00:22:39,200 --> 00:22:43,679 Speaker 1: the multiple reports and data that have been dumped, is 357 00:22:43,920 --> 00:22:47,040 Speaker 1: how much they do know, how much they are aware 358 00:22:47,359 --> 00:22:52,679 Speaker 1: of how social media platforms are changing body and brain chemistry, 359 00:22:53,080 --> 00:22:56,440 Speaker 1: and how much they are aware of what will trigger 360 00:22:56,800 --> 00:22:59,560 Speaker 1: you to want to continue to stay on the device 361 00:23:00,080 --> 00:23:02,399 Speaker 1: versus to put it down, and the desire to have 362 00:23:02,520 --> 00:23:05,800 Speaker 1: you on there for as long as possible. Because, as, 363 00:23:05,840 --> 00:23:09,119 Speaker 1: you know, was brilliantly said, when the platform is free, 364 00:23:09,560 --> 00:23:13,280 Speaker 1: you're the product.
So if you're the product, then you 365 00:23:13,320 --> 00:23:15,920 Speaker 1: need to be consuming as much as possible. The more 366 00:23:15,960 --> 00:23:19,160 Speaker 1: time that you're on, the more I can advertise to you, right, 367 00:23:19,320 --> 00:23:23,520 Speaker 1: the more you will spend. What was the most shocking 368 00:23:23,600 --> 00:23:27,880 Speaker 1: thing that you learned or heard, you know, as 369 00:23:28,280 --> 00:23:32,400 Speaker 1: we were unpacking the information out of Facebook and Instagram as 370 00:23:32,440 --> 00:23:40,080 Speaker 1: it pertains to young people? I was not shocked at all. Yeah, 371 00:23:40,400 --> 00:23:46,520 Speaker 1: as a psychologist, as someone who's worked with teens, I 372 00:23:46,560 --> 00:23:50,399 Speaker 1: believed that that was all happening. And, you know, the 373 00:23:50,560 --> 00:23:54,280 Speaker 1: power of what happened with the whistleblower is now we 374 00:23:54,400 --> 00:23:58,440 Speaker 1: have the documents and the proof, and it's much harder 375 00:23:58,480 --> 00:24:08,040 Speaker 1: to deny the destructive effect that these platforms and devices 376 00:24:08,160 --> 00:24:13,680 Speaker 1: can have on individual teenagers and on society. So it 377 00:24:13,760 --> 00:24:18,240 Speaker 1: wasn't a surprise to me. And that was so much 378 00:24:18,440 --> 00:24:23,280 Speaker 1: the motivation for this book, because I don't think teens 379 00:24:23,520 --> 00:24:28,920 Speaker 1: realize the extent to which they are being tracked and manipulated. 380 00:24:29,400 --> 00:24:31,800 Speaker 1: You know, we all have a sense of, well, you 381 00:24:31,840 --> 00:24:35,359 Speaker 1: don't have that much privacy, and they're kind of putting 382 00:24:35,400 --> 00:24:40,960 Speaker 1: cookies on your device. But no, it's so much deeper 383 00:24:41,000 --> 00:24:47,000 Speaker 1: than that. And I wanted to empower kids to fight back.
384 00:24:47,040 --> 00:24:51,960 Speaker 1: I mean, teens don't like to be manipulated, and I 385 00:24:52,000 --> 00:24:57,360 Speaker 1: think that teens are ripe for this type of information. 386 00:24:57,880 --> 00:25:01,800 Speaker 1: I mean, between forty and fifty percent of all teens 387 00:25:02,480 --> 00:25:07,919 Speaker 1: say they feel addicted to their phones. And what 388 00:25:07,960 --> 00:25:10,359 Speaker 1: does that addiction mean when they say it? When you 389 00:25:10,359 --> 00:25:14,359 Speaker 1: say it, what do we mean when we 390 00:25:14,480 --> 00:25:18,600 Speaker 1: use the word addiction in that context? That they can't 391 00:25:18,720 --> 00:25:23,240 Speaker 1: control their use. Okay. You know, that they are aware 392 00:25:23,320 --> 00:25:28,880 Speaker 1: of negative or harmful consequences, but they're not able to 393 00:25:28,920 --> 00:25:34,760 Speaker 1: turn that awareness into a behavioral change. And ninety percent 394 00:25:34,920 --> 00:25:39,919 Speaker 1: of all teens say that too much screen time is 395 00:25:39,960 --> 00:25:45,000 Speaker 1: a problem for their peers. And, you know, again, four 396 00:25:45,000 --> 00:25:48,639 Speaker 1: out of ten teens want to be able to unplug 397 00:25:48,960 --> 00:25:52,240 Speaker 1: from time to time from their phones. So I think 398 00:25:52,280 --> 00:25:57,080 Speaker 1: there is a receptivity there, and I want to awaken 399 00:25:57,760 --> 00:26:02,280 Speaker 1: their mindfulness so they understand their screen scene: what 400 00:26:02,600 --> 00:26:06,639 Speaker 1: is great about it and helpful to their growth and 401 00:26:06,880 --> 00:26:11,280 Speaker 1: learning and connection with other people, and what is harmful 402 00:26:11,520 --> 00:26:16,359 Speaker 1: and could actually be interfering with the development of their brain, 403 00:26:16,720 --> 00:26:22,520 Speaker 1: their social skills, their autonomy, their street smarts.
How does 404 00:26:22,520 --> 00:26:25,399 Speaker 1: it interfere? And I love the fact that you listed 405 00:26:25,440 --> 00:26:28,360 Speaker 1: all of those things. Let's go to the street smarts 406 00:26:29,160 --> 00:26:33,080 Speaker 1: piece of that. How does screen time interfere with 407 00:26:33,160 --> 00:26:37,040 Speaker 1: that development of just understanding, right, and being socialized in 408 00:26:37,080 --> 00:26:43,480 Speaker 1: the world around you? Because the phones let teens connect 409 00:26:43,480 --> 00:26:49,000 Speaker 1: to their friends and their world from the bedroom, they 410 00:26:49,000 --> 00:26:53,480 Speaker 1: don't go out as much, and they're not getting 411 00:26:53,560 --> 00:26:59,680 Speaker 1: driver's licenses, they're not going out in the city or 412 00:27:00,200 --> 00:27:03,879 Speaker 1: wherever they live on their own. They're not dating as much, 413 00:27:03,920 --> 00:27:09,120 Speaker 1: they're not volunteering as much. So there is a retraction 414 00:27:09,760 --> 00:27:14,439 Speaker 1: of their independence, and adolescence is the primary time for 415 00:27:14,560 --> 00:27:21,359 Speaker 1: becoming independent and developing those types of street smarts or 416 00:27:21,480 --> 00:27:26,200 Speaker 1: skills for going out and about in the world. For teens, 417 00:27:27,000 --> 00:27:31,160 Speaker 1: imagine if the GPS on the phone doesn't work, they'll 418 00:27:31,200 --> 00:27:34,240 Speaker 1: be absolutely lost. How do they get from point A 419 00:27:34,280 --> 00:27:38,240 Speaker 1: to point B? So in that sense, it's making kids 420 00:27:38,760 --> 00:27:43,439 Speaker 1: more dependent on their parents and more sheltered. They're essentially 421 00:27:43,520 --> 00:27:50,280 Speaker 1: growing up more slowly. They're not developing empathy to the 422 00:27:50,359 --> 00:27:55,680 Speaker 1: same degree as previous generations.
So you put all this together, 423 00:27:55,880 --> 00:27:59,800 Speaker 1: and I do see it having a major impact on 424 00:28:00,320 --> 00:28:04,679 Speaker 1: the way kids are going to develop and their self- 425 00:28:04,720 --> 00:28:09,000 Speaker 1: confidence and their social skills. You know, it's so interesting 426 00:28:09,600 --> 00:28:12,520 Speaker 1: that you say that they're not dating as 427 00:28:12,600 --> 00:28:15,119 Speaker 1: much and not getting driver's licenses, you know, because we 428 00:28:15,240 --> 00:28:19,800 Speaker 1: have these conversations too about the millennial and the, you know, 429 00:28:19,960 --> 00:28:26,080 Speaker 1: zillennial set, where, one, we're saying, oh, well, they're living 430 00:28:26,119 --> 00:28:28,639 Speaker 1: at home longer because of student loan debt and the 431 00:28:28,680 --> 00:28:32,119 Speaker 1: economy and all of these things. But I wonder too 432 00:28:32,560 --> 00:28:35,600 Speaker 1: if what you are saying with regard to how they 433 00:28:35,600 --> 00:28:39,240 Speaker 1: are being socialized or not is also playing a part 434 00:28:39,320 --> 00:28:42,680 Speaker 1: in the larger picture of things, right, that, you know, 435 00:28:42,760 --> 00:28:47,959 Speaker 1: you aren't having those same types of experiences and markers 436 00:28:48,000 --> 00:28:54,320 Speaker 1: that young people, teenagers, had twenty, thirty, forty years ago.
Right, 437 00:28:55,000 --> 00:28:59,480 Speaker 1: it's an interesting way of looking at it, that more 438 00:28:59,520 --> 00:29:04,480 Speaker 1: people may be staying at home because of economic and societal factors, 439 00:29:04,560 --> 00:29:08,480 Speaker 1: but also they may be more fearful, like you're saying, 440 00:29:09,000 --> 00:29:13,320 Speaker 1: of going out on their own and being independent if, 441 00:29:13,480 --> 00:29:17,480 Speaker 1: deep down, they don't feel confident about their ability to 442 00:29:18,120 --> 00:29:21,280 Speaker 1: navigate the world. Yeah. I can tell you that I 443 00:29:21,320 --> 00:29:26,440 Speaker 1: have younger cousins, one that just turned twenty-two and 444 00:29:26,560 --> 00:29:29,280 Speaker 1: another one that is also in their mid-twenties. But 445 00:29:29,360 --> 00:29:33,200 Speaker 1: I will tell you that their childhood was so completely, 446 00:29:33,280 --> 00:29:36,280 Speaker 1: radically different from mine. Um, you know, I used to 447 00:29:36,280 --> 00:29:38,560 Speaker 1: say to them, why aren't you going to go out 448 00:29:38,640 --> 00:29:40,680 Speaker 1: and go see your friends, go ride a bike? And 449 00:29:40,720 --> 00:29:43,200 Speaker 1: this is when they were in middle school and high school. 450 00:29:43,240 --> 00:29:45,960 Speaker 1: You're not, you know, you're not going to sleepovers, you're 451 00:29:45,960 --> 00:29:48,560 Speaker 1: not doing... I don't, you know, I don't understand. They 452 00:29:48,600 --> 00:29:52,400 Speaker 1: would go to the mall, see somebody from across, you know, 453 00:29:52,560 --> 00:29:57,480 Speaker 1: from across the store, that they would go to school with, 454 00:29:58,080 --> 00:30:01,760 Speaker 1: and they would text them instead of actually going up 455 00:30:01,840 --> 00:30:04,440 Speaker 1: to speak with them or saying, hey, I'm going to 456 00:30:04,480 --> 00:30:06,360 Speaker 1: go and hang out with this person.
It was 457 00:30:06,480 --> 00:30:10,240 Speaker 1: so bizarre for me to see. For their 458 00:30:09,920 --> 00:30:13,000 Speaker 1: parents, it was just like, I don't understand, 459 00:30:13,080 --> 00:30:17,560 Speaker 1: why aren't you connecting? And that was the only way 460 00:30:17,560 --> 00:30:21,280 Speaker 1: they knew: to shoot that "Hey, I see you," right, 461 00:30:21,720 --> 00:30:26,200 Speaker 1: send an emoji. In a recent survey, and this did 462 00:30:26,320 --> 00:30:31,640 Speaker 1: shock me, the number one preferred method for communicating amongst 463 00:30:31,680 --> 00:30:37,280 Speaker 1: teens is texting. Yeah. And the survey was given right 464 00:30:37,320 --> 00:30:41,440 Speaker 1: before the pandemic. So maybe the experience of the pandemic, 465 00:30:41,520 --> 00:30:46,400 Speaker 1: where kids, you know, felt and experienced the absence of 466 00:30:46,440 --> 00:30:50,720 Speaker 1: face-to-face communication, maybe if the survey were given again, 467 00:30:51,520 --> 00:30:54,840 Speaker 1: face-to-face would take the number one spot. But 468 00:30:55,000 --> 00:31:00,400 Speaker 1: I was shocked and very concerned about that. So much of 469 00:31:00,960 --> 00:31:07,160 Speaker 1: human communication is nonverbal, right? If it's happening through snaps 470 00:31:07,240 --> 00:31:14,120 Speaker 1: and texting and, you know, quick videos, you're missing 471 00:31:14,120 --> 00:31:16,840 Speaker 1: out on so much. And I think that is one 472 00:31:16,920 --> 00:31:20,680 Speaker 1: reason that we've seen the drop in empathy. People aren't 473 00:31:20,680 --> 00:31:25,480 Speaker 1: getting the same cues in their social interactions.
And so 474 00:31:25,880 --> 00:31:29,280 Speaker 1: it's so interesting, because I would think that, because you're 475 00:31:29,320 --> 00:31:33,120 Speaker 1: on a screen, and because essentially every device that we 476 00:31:33,200 --> 00:31:37,560 Speaker 1: have is a small computer, that you are seeing more, right? 477 00:31:37,560 --> 00:31:40,560 Speaker 1: You're seeing more not just in your own neighborhood, but 478 00:31:40,600 --> 00:31:43,800 Speaker 1: you're seeing more of the world, right, and the ills 479 00:31:43,840 --> 00:31:48,120 Speaker 1: and the pains and the societal, you know, imbalances and injustices. 480 00:31:48,160 --> 00:31:56,800 Speaker 1: Why isn't that provoking empathy? I think because empathy develops 481 00:31:56,920 --> 00:32:02,720 Speaker 1: through human interaction, primarily early on, and a lot of what 482 00:32:02,880 --> 00:32:07,800 Speaker 1: you're seeing on your screen is what they, you know, 483 00:32:08,000 --> 00:32:13,959 Speaker 1: big tech, want you to see. Like, seventy percent of 484 00:32:14,000 --> 00:32:18,680 Speaker 1: the time people spend on YouTube is spent looking at 485 00:32:18,800 --> 00:32:24,240 Speaker 1: YouTube's recommendations. You may have gone there to see a 486 00:32:24,320 --> 00:32:29,200 Speaker 1: particular video yourself, but once you're there and you stay 487 00:32:29,400 --> 00:32:33,480 Speaker 1: for hours, it's YouTube in the driver's seat. And that's why, 488 00:32:33,640 --> 00:32:39,920 Speaker 1: you know, YouTube is viewed as a principal radicalization force 489 00:32:40,000 --> 00:32:44,760 Speaker 1: in our culture. Wow, I never even thought about that, 490 00:32:44,760 --> 00:32:46,600 Speaker 1: that you're not even going there to see what you 491 00:32:46,640 --> 00:32:49,680 Speaker 1: want to see.
You're going there and being shown what 492 00:32:50,480 --> 00:32:55,320 Speaker 1: they want you to see. Right, they're really... Yeah, it's 493 00:32:55,480 --> 00:33:00,000 Speaker 1: all their choices. I mean, the menus you see 494 00:33:00,080 --> 00:33:03,400 Speaker 1: on your screen when you go to a website, that's 495 00:33:03,800 --> 00:33:10,280 Speaker 1: what the platform wants you to experience. Many times, you're 496 00:33:10,320 --> 00:33:13,320 Speaker 1: probably on a site and you're looking for a different 497 00:33:13,680 --> 00:33:17,240 Speaker 1: choice and it doesn't exist. They don't want to give 498 00:33:17,280 --> 00:33:21,080 Speaker 1: you that choice. Or it may exist, but you have 499 00:33:21,120 --> 00:33:23,920 Speaker 1: to look in the tiny, tiny fine print at the 500 00:33:23,960 --> 00:33:29,320 Speaker 1: bottom of the screen to click on it and cancel 501 00:33:29,600 --> 00:33:32,520 Speaker 1: or do whatever it is you want to do. So, 502 00:33:32,560 --> 00:33:35,200 Speaker 1: what are some of the ways that 503 00:33:35,360 --> 00:33:38,920 Speaker 1: teens, and all of us who are in fact addicted 504 00:33:39,000 --> 00:33:44,000 Speaker 1: to our devices in a number of ways, Alex, 505 00:33:44,080 --> 00:33:47,400 Speaker 1: how do we take our power back? How do we 506 00:33:47,440 --> 00:33:54,240 Speaker 1: start making mindful, thoughtful choices about what we're consuming? That's 507 00:33:54,320 --> 00:33:59,080 Speaker 1: the key question, because I'm not hopeful that the tech 508 00:33:59,160 --> 00:34:03,720 Speaker 1: industry or a dithering Congress is going to pave the 509 00:34:03,800 --> 00:34:08,839 Speaker 1: way for solutions. So the first and most important thing 510 00:34:09,040 --> 00:34:14,160 Speaker 1: is to become aware, to become mindful.
And the first 511 00:34:14,200 --> 00:34:21,640 Speaker 1: section in Slaying Digital Dragons presents readers with nine wacky 512 00:34:22,120 --> 00:34:28,640 Speaker 1: but science-based challenges for understanding their screen scene. So 513 00:34:28,719 --> 00:34:32,759 Speaker 1: I present them with warning signs to see if they 514 00:34:32,880 --> 00:34:37,879 Speaker 1: might recognize in themselves any indications that their use may 515 00:34:37,920 --> 00:34:42,239 Speaker 1: be problematic. I ask them to think about how much 516 00:34:42,360 --> 00:34:48,840 Speaker 1: time they're spending, how that time is apportioned, what triggers 517 00:34:49,120 --> 00:34:54,879 Speaker 1: their use, what their screen habits are, and especially how 518 00:34:54,920 --> 00:34:59,439 Speaker 1: they feel after going to certain apps. You know, we're 519 00:34:59,480 --> 00:35:03,239 Speaker 1: just not thinking about these things. So that's like a 520 00:35:03,400 --> 00:35:11,000 Speaker 1: first step in taking charge, taking control back. And then 521 00:35:11,320 --> 00:35:15,400 Speaker 1: there's learning. So the second section of the book talks 522 00:35:15,480 --> 00:35:19,760 Speaker 1: all about the specific ways in which big tech may 523 00:35:19,800 --> 00:35:23,880 Speaker 1: harm your body and brain and privacy and psyche and 524 00:35:23,960 --> 00:35:28,840 Speaker 1: relationships and future and life balance. And then the final 525 00:35:28,960 --> 00:35:31,799 Speaker 1: section of the book is where you bring it all 526 00:35:31,840 --> 00:35:35,680 Speaker 1: together and you can reset your digital life. It's a 527 00:35:35,719 --> 00:35:39,880 Speaker 1: process I call giving yourself an app-endectomy. 528 00:35:41,840 --> 00:35:48,040 Speaker 1: What you do is you cut out the unhealthy aspects 529 00:35:48,080 --> 00:35:52,560 Speaker 1: of your digital life and reset it so that it 530 00:35:52,600 --> 00:35:56,840 Speaker 1: works for you and you have a healthier online-offline balance. 531 00:35:58,280 --> 00:36:03,279 Speaker 1: What is that healthy balance? Is there just 532 00:36:03,360 --> 00:36:07,360 Speaker 1: a general rule: if you're on your device for an hour, 533 00:36:07,800 --> 00:36:10,000 Speaker 1: how long should you be off of it? Is there 534 00:36:10,200 --> 00:36:14,080 Speaker 1: any scientific makeup to it? Or is it really about 535 00:36:14,120 --> 00:36:16,799 Speaker 1: how we are feeling, and that we should begin by 536 00:36:16,880 --> 00:36:22,560 Speaker 1: checking in with how we are feeling pre- and post-usage? Well, 537 00:36:22,600 --> 00:36:28,120 Speaker 1: those are good clues, and the time you spend isn't 538 00:36:28,239 --> 00:36:33,440 Speaker 1: necessarily the key factor. You know, many families argue about 539 00:36:33,520 --> 00:36:38,399 Speaker 1: how much time kids spend on their devices, and some 540 00:36:38,520 --> 00:36:42,480 Speaker 1: parents think it's too much time, and other parents think 541 00:36:42,520 --> 00:36:48,839 Speaker 1: it's way too much time. The key, though, is 542 00:36:49,200 --> 00:36:53,760 Speaker 1: how are you spending the time? You know, six hours 543 00:36:53,840 --> 00:36:57,200 Speaker 1: on a screen can be spent in so many different ways. 544 00:36:57,280 --> 00:37:04,280 Speaker 1: Are you creating or are you vegetating? Are you passively, 545 00:37:05,000 --> 00:37:09,200 Speaker 1: you know, responding to material that's thrown at you on 546 00:37:09,239 --> 00:37:13,399 Speaker 1: the screen, or are you actively engaged in learning and 547 00:37:13,480 --> 00:37:17,600 Speaker 1: connecting and creating?
Is there a healthy balance of the 548 00:37:17,680 --> 00:37:22,200 Speaker 1: activities you're doing? Are the things you're doing making you 549 00:37:22,280 --> 00:37:26,920 Speaker 1: feel depressed and lonely and insecure, or are they making 550 00:37:26,960 --> 00:37:31,840 Speaker 1: you feel proud and connected and productive? So you really 551 00:37:31,880 --> 00:37:35,520 Speaker 1: have to look at the two factors together: the total 552 00:37:35,600 --> 00:37:41,160 Speaker 1: amount of screen time and how it's being spent. I mean, 553 00:37:41,200 --> 00:37:44,200 Speaker 1: that makes sense. It makes sense, you know. I can 554 00:37:44,280 --> 00:37:51,040 Speaker 1: say that I catch myself more now in how much 555 00:37:51,600 --> 00:37:55,279 Speaker 1: swiping I'm doing, how much looking and scrolling, just mindlessly 556 00:37:55,320 --> 00:37:58,440 Speaker 1: scrolling with a television on in the background, but still 557 00:37:58,480 --> 00:38:01,759 Speaker 1: holding the phone in my hand, and just, 558 00:38:02,200 --> 00:38:04,560 Speaker 1: you know, mindlessly scrolling. And I'm saying, I don't even 559 00:38:04,560 --> 00:38:07,040 Speaker 1: know what I'm looking at, right? So then I put it 560 00:38:07,080 --> 00:38:12,239 Speaker 1: down, trying to make those small changes. 561 00:38:13,160 --> 00:38:17,160 Speaker 1: But there is not a time that I take public transportation, right, 562 00:38:17,200 --> 00:38:19,640 Speaker 1: I'm on the subway in New York and I'm looking around, 563 00:38:19,920 --> 00:38:22,239 Speaker 1: no one is looking up, not one, you know, 564 00:38:22,360 --> 00:38:25,960 Speaker 1: not one person. Everyone is looking down like this, walking 565 00:38:25,960 --> 00:38:28,959 Speaker 1: along the street. Everyone is looking down like this.
So 566 00:38:29,400 --> 00:38:34,959 Speaker 1: it is about really re-establishing mindfulness at a time 567 00:38:34,960 --> 00:38:40,120 Speaker 1: when we've been able to be so mindless. Right. Sherry Turkle, 568 00:38:40,520 --> 00:38:46,480 Speaker 1: who's an icon in this field, coined the phrase "alone together," 569 00:38:47,719 --> 00:38:50,240 Speaker 1: and I think it's so apt because, as you're saying, 570 00:38:50,320 --> 00:38:55,080 Speaker 1: you can't be on any public transportation, or in an airport 571 00:38:55,320 --> 00:38:59,799 Speaker 1: lounge, or any public area, without everyone being, you know, 572 00:39:00,120 --> 00:39:04,600 Speaker 1: buried in their phone. And I can't help but connect 573 00:39:04,760 --> 00:39:10,600 Speaker 1: that behavior, that, you know, disconnection from other people, with 574 00:39:10,960 --> 00:39:16,759 Speaker 1: the polarization and the hatred and the silos we're all 575 00:39:16,840 --> 00:39:21,400 Speaker 1: living in today. I think, you know, these issues and 576 00:39:21,480 --> 00:39:25,799 Speaker 1: this book are so timely, because you really can't come 577 00:39:25,880 --> 00:39:30,359 Speaker 1: up with one aspect of being alive that isn't being 578 00:39:30,400 --> 00:39:36,280 Speaker 1: affected by digital devices and social media and the fake 579 00:39:36,440 --> 00:39:43,440 Speaker 1: news and disinformation that we're all exposed to online. I 580 00:39:44,239 --> 00:39:48,160 Speaker 1: couldn't agree more. This is a fantastic book. Thank you 581 00:39:48,280 --> 00:39:52,440 Speaker 1: so much for coming on. Folks, the title of the 582 00:39:52,440 --> 00:39:56,600 Speaker 1: book is Slaying Digital Dragons: Tips and Tools for Protecting 583 00:39:56,600 --> 00:40:01,120 Speaker 1: Your Body, Brain, Psyche, and Thumbs from the Digital Dark Side. 584 00:40:01,560 --> 00:40:04,200 Speaker 1: Alex J.
Packer, thank you so much for making the 585 00:40:04,239 --> 00:40:07,359 Speaker 1: time to join Woke AF and have this really important conversation. 586 00:40:07,840 --> 00:40:11,400 Speaker 1: We work on this show to bring people on that 587 00:40:11,440 --> 00:40:14,879 Speaker 1: are about connecting us to our mindfulness in so many 588 00:40:14,920 --> 00:40:17,640 Speaker 1: different ways, and this is definitely an important one. So 589 00:40:17,680 --> 00:40:20,879 Speaker 1: we appreciate you. Thank you for inviting me. It's been 590 00:40:20,880 --> 00:40:29,280 Speaker 1: a pleasure. That is it for me today on Woke 591 00:40:29,520 --> 00:40:32,839 Speaker 1: AF. As always, power to the people, and to all 592 00:40:33,000 --> 00:40:36,360 Speaker 1: the people, power. Get woke and stay woke as fuck.