Hello the Internet, and welcome to Season two eighty-one, Episode two of The Daily Zeitgeist. This is still a production of iHeartRadio. It's still a podcast where we still take a deep dive into America's shared consciousness. It's Tuesday, March twenty-eighth, twenty twenty-three. I feel like Jack wasn't really telling you what national day it is, because sometimes it doesn't matter, but I'll just tell you what it is. It's National Black Forest Cake Day, National Weed Appreciation Day (not cannabis, just the weeds that grow in your yard), American Diabetes Association Alert Day, and National Triglycerides Day. Wow. And Something on a Stick Day, for those of you that indulge in things on sticks, like marshmallows or carrots or whatever, what have you. My name is Miles Gray, aka the heartbroken Bruin, as I watched UCLA crash out against Gonzaga on Thursday. But I am back, and I am still Hideo NoHo, and I am thrilled to be joined by my colleague today: a hilarious comedian, someone who's doing good whenever they possibly can, and a wonderful host. Uh oh— oh my god, forgive me, what's the stand-up show again? I've been on leave too long. I'll introduce you first. It's plain— thank you.

First of all, that just reminds me that my triglycerides are high. And second of all, carrots on a stick? Bro, have more fun. What are you doing with a stick?

Oh wow. So you would— okay, so if you're going to improve upon the form, you would put something upon the carrot.

Upon the carrot— that's what I'm saying. Okay, but engineer mind, you know what I mean?

Yes, yes. What was the stand-up show again?

So I have A City Council— it's a podcast.

Oh yeah, yeah, yeah, yeah.

But I've been real lazy about it, so nobody get mad at me.

I get it, I get it, I get it. Hey, we have— we got one of those guests that come on
where we're like: oh, we got someone smart, we got someone that's an expert, someone who I can ask questions to, like, "Why is technology making me sad?" Because that's exactly what we're going to be doing today. We have to introduce our guest. She is the Technology, Privacy and Policy Professor of Law, director of the Institute for Privacy Protection, and co-director of the Gibbons Institute of Law, Science and Technology at Seton Hall University, and the author of the book Unwired: Gaining Control over Addictive Technologies. Please welcome Gaia Bernstein. Gaia, welcome.

Thank you. Thank you for having me.

Oh, thank you for blessing us on this second-rate podcast with your first-rate acumen. But yes, we really appreciate you stopping by.

I like that she comes with her own aka. She had so many titles.

I know, yeah, yeah, yeah. And, like, those are legit. Over here, I'm saying, like, I'm the Lord of Lankershim Boulevard. That's not a real thing, but for the people in North Hollywood, California, they know it is. But Gaia, welcome to the show. Where are you coming to us from today?

From New York City, all the way.

Okay, okay. Um, and yeah, we're gonna talk a little bit about your book and some other technology things as we get into it. But how has it been, you know, doing a few interviews, getting out there, talking with interesting people? Are we the most interesting people you've spoken with so far, without really being ten minutes into a conversation? Can we—?

For sure. You're the funniest.

Nobody—? Okay, hey, that's all I ever wanted. Well, you're the funniest for sure. Okay. Well, Gaia, we are going to get to know you a little bit better, but first we're gonna tell people some of the things we're gonna talk about. First, the Pope's puffy jacket. No, he is not— his style is not that lit. Although I guess many people would love it if the Pontifex was doing a pontiflex upon us mortals on planet Earth. But we'll talk about that AI-generated image.
I just remember some people— the evolution of people being like, "I can't believe it," to then, like, "Oh yeah, I knew that was AI the whole time. Yeah, I knew." The slippery slope that we all are finding ourselves on in the last couple of weeks. With that, we'll also talk about Apple, because it seems like they're also ignoring the total lack of enthusiasm for virtual reality headsets with their upcoming, you know, mixed reality headset that I can't believe they're still insisting is something we might want. But we'll talk a little bit about that and the, you know, lack of confidence that some of the people working on it are experiencing.

And then, gotta talk to Gaia about— look, I'm a new parent. I'm very aware of, you know, just how many other parents, too, talk about screen time. I was raised by my mother, like, saying, "Don't watch the TV, it's going to rot your brain," and all of these things have evolved. So I'm really interested in talking to you about sort of the evolution of our addictive technologies, and kind of how we're, you know, hurtling towards this very isolated way of living, because that's something we talk about a lot on the show. But first, Gaia, we gotta ask you— Professor Gaia Bernstein— what is something from your search history that might reveal something about who you are, what you're into right now?

So, you know, I like this question, because I remember going to a conference and seeing a slide that had the Google search bar, but instead it said, "I confess." So basically, whatever is in our search is, like, what's going through our brains, and there's nothing special in my search. Basically, it shows how my brain just jumps from one thing to another. So I would be researching for a privacy class, so I would look for a movie about drones. Then I remember— I know this person,
they wrote an article— so I will switch to googling their name. And then a minute later I'll remember that I have to feed my kids, and there's this fish in the fridge, and I think of a recipe, so I look for the recipe. And then I recall that I never got groceries, so I switch to googling FreshDirect. And all of this happens in ten minutes. So really, you can see how my brain operates from this.

Sure, sure. And it is funny when you can kind of look at your own stream of consciousness via Google searches and you're like, wow, I had an interesting two hours right there. But wait— so what kind of drone stuff were you looking up?

Well, I was looking it up because I teach about privacy and drones. And actually, the first time I showed my students drones, it was when COVID-19 broke out in China, and it was the only video I found online, so I started to show it to them— never realizing that a month later this was going to be here. And this was just showing them how drones can basically fly over an empty city, and how it looks, and how it takes pictures. And it was shocking to know what happened next.

Yeah. What kind of privacy laws are there with drones? Because I feel like— I get that, like, you're not supposed to fly drones in certain areas, but, like, in LA I see so many people, like, in residential areas— like, I'll hear the whir of, like, a drone. And it's like— what I saw on, like, the murder trial documentary— they had, like, drone footage that was watching as people removed funds from a place. Like, it was legit evidence, and it was just a dude with a drone.

Yeah, there's not enough. The main thing you have to do is register with the FAA, but otherwise— basically, you can do a lot with a drone right now.

Yeah, the kid with a drone, right? Exactly.
And I see, like— but then I see videos where people are like, "I took a drone down that was above my home." And part of me is like, I guess that's fair, if you're using a bolo to take a drone down. But yeah, it's good to know. Starting early on the robot wars, you know.

Yeah, right. I mean— it's funny when you see, like, people sometimes help the delivery robots get across the street, and then you see, like, the Luddite-type people who are like, "Man, this robot— kick it over." And you're just like, just decide on one.

Miles, you should know that Luddites were a name for revolutionary workers. Come on.

Yes, I know. But look— just like we should be smashing ChatGPT, we should probably have been smashing them printing presses. I get that, man. That's a whole other box of worms, too. I've spent the last couple of days talking, like, really earnestly with people about AI, and some people really being on the side of, like, it's a great tool that could make things so much easier. And as somebody who has been laid off before— or, like, look how budget cuts hit, man— I'm like, that looks like a way to condense, like, fifteen jobs into two. But are we then grappling with the complexity of that after?

I'm, like, torn, because I did work in AI in my PhD that I quit. But I think, like, everything gets, like, automated, and job security just, like, constantly changes as technology evolves. So I think that responsibility— like, definitely, like, the people making the tech need to be talking to policymakers more, to prepare, like, the masses for, like, the skill changes. Like— I mean, think about computer scientists now versus fifteen years ago. Like, the job market— that's completely changed, you know. And I think AI is scary because of what we're going to talk about next, but there are good uses for it.
It's just, like, people are— I'm like, I'm not going to be impressed with AI until someone makes, like, a single printer that works reliably, you know what I mean? Like, I don't care about all of these advances. I just want a printer to work, you know.

I want a PowerPoint to not defeat a professor.

Okay, Gaia— what are you— how are you— like, how does AI sort of intersect with your work right now?

Well, I feel like the writing has been on the wall for a while. People are so shocked that ChatGPT came in, and what are they going to do with students, and how are they going to learn anything? But seriously, we've just been incorporating technology into the classroom, like, without even thinking. And suddenly this thing is there, and it's the first time we are stopping to think and realizing: maybe not all technology is good. Maybe a kid would not learn if they have their essays written by ChatGPT. So even though I'm concerned about what's going to happen, I'm sort of happy there's some kind of wake-up call here.

Right, yeah. I talked to some professors about this. I talked to a professor from Germany, and he was saying that students do use, like, AI to write essays, but the way to get around that is they have, like, rival technology to detect when AI is being used to write the essay. So that's what's happening in Europe right now. And I was like, how much is this going to go back and forth? This is like a kid battling their parent over parental controls, you know what I mean— like, try to lock them out with the password, right? And that never goes anywhere, because there's always a new technology.

And it's just— yeah. But the fact is, we're faced with this and it's out there. There was no warning. It just came, the ChatGPT, and we're supposed to deal with the consequences, and there's no time to even think.

Yeah, out here.
Yeah. And because, like— I have friends, too, who are, like, in grant writing. They're not born writers, you know— they're just, like, motivated people in their field, and the grant writing they do is difficult. And they're like, "It's taken a lot of work off of my plate. I'm not a born writer, or, like, have these literary skills to be writing, like, compelling things. It's much easier to use that." And I'm like, I completely understand where these tools— like, how they're used. And then I have friends who work in advertising who are like, "Yeah, company clients are now coming to us with, like, half-baked decks that AI generated, and it's, like, sort of diminishing the kinds of work that we're able to do, even as advertising agencies, along with them." And I'm like, you see— you see where the sort of squeezing and expansion is beginning to happen.

But yeah, this is why robots need to take over all of the labor, and then we just get to frolic in fields all day. You know what I mean? Let us— let us frolic. Let us have easier times.

Or we just consume and sit on our screens and nothing else, because— well—

Okay. Gaia, what is something that you think is overrated?

What I think is overrated— I think it's, um, basically texting progress. So I'll give you two examples of what I mean by that. I am supposed to meet a friend in a restaurant, and she arrives five minutes early, so she texts me, "I just got here." And then she texts me two minutes later, "I'm sitting at the corner table." And then, you know, I come in, and instead of looking for her, I text her, "I'm here." And why all this extra texting? It's not really necessary. Or— I live in New York City; we order food all the time, we order delivery. So, you know, you get a confirmation when you order food. I guess that's good— you know your order got in.
But then you get another text: your food is being prepared. And then you get another one: your delivery is on the way. And then, usually, like, two minutes after it's delivered, you get a text saying, congratulations, your food is delivered.

Right, right.

Do we really need all of this?

That makes me think, you know— like, the constant progress alerts. It has to be something that— I know, like, in your book you touch on, like, the idea that a lot of technology stuff is informed by, like, neuropsychologists, too, and just the idea of, like, what's going to get someone— like, is this actually a future that people want? Even though we're like, "Do we—?" But I also get it: in the age of such instant gratification, it's probably for the kinds of people who are like, "Well, what's going on with my order? Like, where is it?" And then you can at least have: oh, okay, they're preparing it. Oh, okay, they have another stop along the way. Right?

We pay a little on top, because every time I get one of these texts, I don't just say, "Okay, the food is coming, my kids will get dinner." I start looking at my emails, and I start checking Facebook, and I've spent ten minutes just because I got this extra text.

Wow. Also, it takes the responsibility off of the person to be, like, patient, you know what I mean? And that makes, like, more face-to-face conversations and demands, like, much more terrifying, because people expect efficiency at their demand and at their service, you know.

Right, right. I was just thinking of— recently I got, like, a, like, a delivery thing, and it was, like, telling me— they're like, "Oh yeah, this package will arrive in, like, five days from now... four days from now..." I'm like, yeah, I knew that when you gave me the, like, you know, expected delivery date. And to that point, I'm like, maybe this is for a very specific kind of psychology that it's catering to, because it certainly was not me.
Gaia, what's something you think is underrated?

Well, two things, which are related, I think: alarm clocks and wristwatches. So, if I had an alarm clock, instead of using the alarm on my phone— instead of getting up in the morning in bed and picking up my phone and force-checking my email (three email accounts), my social networks, and WhatsApp, whatever texts— I could maybe just, you know, wake up in a more normal way. But I don't— I don't have a separate alarm. I'm dreaming about getting one. And the other thing is the wristwatch. I don't have one either— I actually lost it three months ago and haven't gotten a chance to buy a new one. So every time I want to know the time, again I pick up my phone, and— I just described what happens. So I think these are things that, if they came back into fashion, would be great.

I know. That's why, like, the Apple Watch was, like, so insidious. Like, I got it in the beginning of the pandemic, like, as, like, a heart rate monitor, too, because I was, like, running more, and I was like, oh, this is great— I don't have to bring an iPod, I can, like, listen with my, like, with my wristwatch, with less bulk. But to your point, now when I look at my wrist, it's for so much more than the time. And it's— yeah, the amount of information on offer from such a small device— it can definitely take your attention.

When I was a kid, I used to wear a wristwatch, and, like, they're great. And I was the only one who did, so I was, like, a little baby businessman, and I always needed it before I went to school. I was like, "I'm gonna be late, Mother! Let's get there!" Right, right. That's all I have to offer.

You're like— you're like, "Well, no, we have plenty of time." "Well, is your watch set to the atomic clock like mine? Because we are behind. We have to go."

That was— but it's funny.
I also, like, wore a wristwatch, too. Like, I was one of those, like, nerdy kids— just, like, to have, like, a watch. Like, yeah— or be, like, an "um, actually"-type kid, who was like, "Oh, we should actually go in from recess, because it's already, like, one thirty." They didn't—

Are you a roller-backpack kid?

No— rollie bags came out my junior year of high school. That's, like, when we hit peak rollie bag. I'm— I'm an elder. So the earliest people that I remember— there was this one girl at my school, Libby. I remember she hurt her back, like, you know, playing in the yard or whatever, and the doctor told her she couldn't wear a backpack, so she had, like, a travel suitcase that she would bring around. So funny to us— like, the OG rollie bag was just bringing a travel suitcase around.

I have thought about doing that at the grocery store. Before I got a car, I was, like, this close to bringing just a suitcase to the grocery store.

I love the efficiency. I love the efficiency. So, Gaia— wait, so I love that you're like, "It would be great to have an alarm clock. Me, myself, I don't have one, but I think it would be fantastic." I love, like, the sort of paradoxical relationship we all have with these kinds of things, where it's like, "That's what I need. Don't have one yet, but I feel it's something that could definitely help me." Yeah, Miss Gaia, Mother Earth— she wakes up with the sun.

Okay, calm down. I wake with the sun, but not because I want to— because I have to.

Okay, cool. Well, let's take a quick break, and we'll be right back to talk about some— some really cool, uh, Catholic fashion trends coming out of Rome, right after this. Over the weekend—

Oh, we're back. Sorry, I'm rusty, folks. I've been on paternal leave— parental leave— and this is my first time speaking with people over the age of, like, seven weeks old.
So this is helping me a lot as I slowly get back into this. We're back, and I want to get into our first story. So, over the weekend there was a huge splash on social media when an image of the Pope wearing a puffy white jacket came out, and it was the talk of many text threads and conversations between people I know who are Catholic and their Catholic parents, who were like, "See, he's pretty cool. Check him out." But it turns out it was an AI-generated image, which left many people sort of, like, in genuine shock. And the sort of journey of this image was: it was first posted to the Midjourney AI subreddit, and from there— you know, I think if anyone knew where it was posted, you'd go, oh, this is in the place where people post AI-generated images. But there is a Reddit-to-Twitter pipeline that most people maybe aren't aware of, where a lot of the context just gets ripped and then put on Twitter immediately, sometimes with no context or whatever. And then many people were like, "What is going on here? The Pope looks fantastic!" or, "Is he really wearing all that?" But I think, as with anything, if you weren't just looking on Twitter and you kind of looked at a bigger version of the image, you could kind of see that there were some inconsistencies about, just, you know, physics, or, like, light, or what kind of crucifix even the Pope would wear. But I was curious if you all saw this picture, and what your own evolution of thoughts was with this.

I saw it, and I didn't think any thoughts. I just retweeted my friend's tweet that said "Dope Francis," because I thought that was cool— and that was at Beth Borden. And then I also want to add some important context: she responded to her tweet with a comment from Reddit that was like, "Psalms are sweaty, knees weak, crosses heavy, Last Supper is ready, it's Lord Spaghetti," by LeBron James Johnson. So I didn't think any thoughts.
I guess I don't have enough Catholic friends. I was just like, that's a funny joke, and I retweeted it, because, like, what he's wearing has no effect on any policies or any real thing. So I was like, this is dumb. But then later I was like, oh yeah, I do need to look at images more closely to know if they're AI. But eventually it's going to get to a point where we won't know.

Yeah. I mean, this kind of goes along with— just, like, in the last week there was, like, Elon Musk holding hands with AOC, or Trump getting arrested, and, like, there's one of Macron. Also, there's a lot of Trump makeouts. Yeah, yeah. And, like— and with Trump, we know, like, that man is not dynamic enough to ever cause a motion blur in a still image. So I was like, this is AI— my man is not that spry, okay? But again, like, a lot of people were saying that there was maybe a reason why some people were quick to, at first, quite possibly believe that this was real: because apparently there's a group of people that think that he— like, the Pope— is, like, very stylish, and there were rumors going around that he was, like, wearing designer loafers, that the Vatican had to debunk. Because people were like, well, it's Rome— it's, like, fashion is everywhere, and, like, why wouldn't he? And then, while others were just like— I think it just goes part and parcel with, like, people's celebrity worship, or the idea of, just, like, "Oh yeah, yeah, that's— I'd believe that, I buy that." Or, because of how oversaturated even images of celebrities look, these AI sort of imitations are just sort of ticking all these boxes visually, so that sort of— our guards come down. Gaia, what are your thoughts on Dope Francis?

I missed it— so sometimes I'm not inconsistent: I actually follow what I preach, so I try not to spend too much time on Twitter. So I didn't see it.
But I agree with what you said, basically. I mean, this is going to get perfect. I mean, right now people can tell it's AI, but it's just a matter of time. And then the question is, how do you decide what's reality— what's real and what's not? Which has been an issue for quite a while with words. But once you get into pictures— when the visual stuff gets perfect, like ChatGPT will get perfect— that's when we really, really have a problem.

Right. This is the image, Gaia— just him in this, like, puffy jacket.

Oh, wow.

So, because it was so fashionable, everyone was like, "Oh, we love this, we love this." And again, like you're saying, Gaia, it's imperfect right now, and many people pointed out these imperfections. Like— like, hands. We've seen that, like, all these AIs have real trouble rendering, like, hands and legs. Like, in the Trump arrest one— like, he was, like, a quadruped in that image. And then a lot of, like, other religious people were like, "That is not the kind of crucifix the Pope would wear," if they looked at that. Or even, like, the way the glasses frames were blending into the shadow— that's what, like, very, you know, keen-eyed people noticed. But for most people who are just looking at an image like this on Twitter, or, like, on a passive scroll— like, to your point, when they really dial things in, the amount of, like, reconciliation that has to happen in your brain to be like, "What am I looking at?" is definitely going to increase.

Yeah. And I guess, when you think about it— like, what is the damage?
I know there are some laws to regulate, you know, photographic fakes in some states— not many of them— but there are all kinds of other harms that can be caused by just, you know, showing somebody in a place they never were, or where they're not supposed to be, or with somebody they're not supposed to be with. I'm not sure— it might, again, take years until something is done about that.

Right. I wanted to ask your thoughts, because you are in law and so well versed in it: what about, like, the ramifications for, like, court-based evidence? Like, already the justice system is so flawed. How are we gonna— you know, voicemails, voice notes, calls, images— all of these are used as evidence. How do we know what's going to be real and what's going to be presented to a jury?

Right. So, with every kind of evidence, there's always an analysis, you know. For example, when we started getting DNA evidence, it took a while until it was accepted. The problem here is photographs are already accepted as evidence.

Exactly.

So you have to decide how you treat it— sort of, do you start treating it as less reliable? And that's confusing, right?

Yeah. But then suddenly it's like, "I'm sorry, this video evidence is just not reliable," because of our ability to fake everything. And it could absolutely be something that would, you know, absolve someone of guilt, or, you know, convict someone. "Here's this voicemail of Joe Biden saying the N-word on a call." No— like all the deepfake voices of the presidents. They had him, like, singing, like, rap lyrics, and it sounds like him, you know. Like, all of the AI voice stuff— they have, like, a lot of, like, gamer videos of Trump and Biden and Obama playing video games together, using their voices, you know what I mean? Like, yeah, to talk shit to each other during gameplay. Content I never wanted. But when I did hear one of a deepfaked Trump,
the one thing I was like: they can't get his cadence right. Like, the tone of voice was right, but his way— his manner of speaking— like, good luck to the person who has to, like, program the nuances of that speech into an algorithm, because— yeah. The Biden one was, like— there were TikToks of boomers being scared by their, like, millennial and Gen Z kids, like, hearing Biden, like, say the N-word and say all this, like, crazy stuff, and they were like, "Oh my goodness."

Like, the use of this as propaganda for election campaigns— like, with this image, for example: everybody saw it, and then no one saw the follow-up story. You know, that happens all the time, when even, you know, human-generated inaccurate headlines are produced and then people don't see the corrections— like, the damage is already done. So what do we do then? Yeah, slippery slope. But I mean, it sounds like— yeah, like, as of now, you can kind of tell, because it still looks like a trippy oil painting if you look real close. The skin's always shinier than it has to be.

But yeah— that's how I look in pictures!

It's just, like, a—

And we know algorithms teach themselves, so they will learn eventually how to look perfect. It's really a matter of time— and not a lot of time— until it will be perfect.

Right. It's just— yeah, it really is a very interesting time, because things just feel like they're accelerating now, to a speed that— like, in the nineties, you're like, "Yeah, man, you heard of CD-ROM?" And then it's like, "You don't need the caddy anymore— you can insert the CD disc straight in." That was, like, three years of time.
And now we're going from, like, "Hey, you can swap faces on Snapchat," to, like, you hear Joe Biden say the N-word on this phone call, and you're like, what? How?

It's kind of interesting how we, like, will be reverting to, like, non-technical stuff. Because, talking to that professor about the AI writing the essays— the only foolproof way of making sure people don't cheat is, like— because you can try to get more localized and specific with the topics, but eventually it's going to get to a point where you have to just, like, watch them write the essay in front of you, on paper, you know what I mean? And that's, like, reverting back to when we didn't have it. Like, we used the technology so much, and so irresponsibly, that now we don't get to use it anymore, you know.

I think there would be some advantages, you know. I've been sitting in classrooms teaching students on Wi-Fi for a while, and I never understood why I have to compete against the Internet— why, while I'm talking, everybody can be shopping on Instagram. And actually, if you cut down the Wi-Fi— which is how people have to do exams, so they won't be able to cheat, and they'll have secure systems on the computers— that might be a place we should have gone earlier.

Right. Yeah, it is true. I mean, like, I remember when I got to college— like, this was the beginning of laptops being just, you know, ubiquitous— like, they were everywhere. And I remember, like, the first time I popped my laptop open in my Spanish and Iberian history class, I was looking at nonsense and absolutely just missed the entire lesson, because I was so amazed. I was like, yeah, I can "multitask," quote-unquote. But really, I was absolutely just distracting myself.

Yeah. And we were doing visits to schools here in New York City— I mean, kids were saying, "How am I supposed to study? He's playing games on the computer. I'm just watching the kid next to me on the game."

Right.
Well, this next story I think is really interesting, because we're talking about just sort of the speed at which we're moving towards, like, not really needing human interaction anymore. You know, virtual reality— or VR, as I remember it in the nineties— was a real hook for people that were really interested in technology. You know, like, it was in film and TV and video games, and, like, it sort of gave us this idea of, like, a world where all you had to put on was a goofy-looking helmet, and now you are experiencing a new reality. And I get that, from that time— we were like, "Wow, the ability of computers is fantastic. This could really be something." Anyway, fast-forward to now, where no one is interested in wearing a helmet to use Google Docs or living on a beach digitally or whatever. Yet a lot of the big tech companies are insisting that it's the wave of the future. Specifically, you know, Mark Zuckerberg completely took a big swing with the metaverse, and it ended up being not what he thought it was going to be. Because, again, this was, like, a weird thing, where the way he saw it was, like, "This is a new way for people to work"— and toil. And I think most people who are on that side of the equation were like, "No. Don't want that at all, actually. I'm fine with the way we're doing it. I'm actually less distracted without having to put on a VR headset and be in some, like, emoji or, like, avatar-based meeting or something like that."

And now Apple is just— they're going full steam ahead with their mixed reality headset, and people that have worked on the product are giving some insight now into how the internal design team thinks it's going to be, and it's not great. Quote: "Some company insiders have been wondering if the upcoming headset is a solution in search of a problem," sources told the media outlet, and quote:
585 00:30:15,880 --> 00:30:18,880 Speaker 1: Unlike the iPod, which put digital songs in people's pockets, 586 00:30:18,920 --> 00:30:21,240 Speaker 1: and the iPhone, which combined the abilities of a music 587 00:30:21,280 --> 00:30:24,240 Speaker 1: player and a phone, the headset hasn't been driven by 588 00:30:24,240 --> 00:30:28,920 Speaker 1: the same clarity. And I can totally see that. I mean, 589 00:30:28,920 --> 00:30:30,840 Speaker 1: not only that, it sounds like there are numerous like 590 00:30:30,960 --> 00:30:33,440 Speaker 1: deserters of the project who were working on it, like, 591 00:30:33,480 --> 00:30:35,959 Speaker 1: at Apple, because they just felt that the end product 592 00:30:36,040 --> 00:30:38,440 Speaker 1: wasn't going to live up to like what they think 593 00:30:38,440 --> 00:30:40,360 Speaker 1: it's going to be, while like other people have been 594 00:30:40,400 --> 00:30:43,160 Speaker 1: fired because, you know, they failed to make certain features work. 595 00:30:43,440 --> 00:30:45,600 Speaker 1: It's just, when you look at it right now, we're 596 00:30:45,600 --> 00:30:49,000 Speaker 1: in a landscape where this is not necessarily a booming market, 597 00:30:49,080 --> 00:30:51,840 Speaker 1: like Meta has had to slash the price of their 598 00:30:51,920 --> 00:30:55,720 Speaker 1: top tier fifteen hundred dollar headset by a third to 599 00:30:55,760 --> 00:30:59,480 Speaker 1: try and entice people. This Apple headset, you know how much 600 00:30:59,520 --> 00:31:04,479 Speaker 1: it's rumored to be? Three thousand dollars. This is 601 00:31:04,520 --> 00:31:08,680 Speaker 1: going to go the way of Google Glass. I am sorry, that is not happening. 602 00:31:08,920 --> 00:31:11,040 Speaker 1: I get that the Apple fans out there, they might 603 00:31:11,080 --> 00:31:13,480 Speaker 1: be like, oh yeah, I got it yesterday. But again, 604 00:31:14,120 --> 00:31:16,160 Speaker 1: this does feel like this weird thing where I'm not 605 00:31:16,320 --> 00:31:19,520 Speaker 1: really sure what this, how this is making anything easier, 606 00:31:19,680 --> 00:31:22,720 Speaker 1: or what the novelty is going to be. And it 607 00:31:22,800 --> 00:31:25,680 Speaker 1: just feels like, again, like with all of our technology, 608 00:31:25,760 --> 00:31:28,120 Speaker 1: if we're like, if in this world where maybe people 609 00:31:28,120 --> 00:31:30,760 Speaker 1: were using VR more, I'd imagine we're just going to 610 00:31:30,880 --> 00:31:34,920 Speaker 1: increasingly move to more one-dimensional forms of communication without much 611 00:31:35,000 --> 00:31:38,120 Speaker 1: real human contact. You know, it's hard to say, it's 612 00:31:38,160 --> 00:31:41,320 Speaker 1: hard to say if what's happening is like what happened with video. 613 00:31:41,400 --> 00:31:43,400 Speaker 1: Like for decades people were saying they're going to have 614 00:31:43,520 --> 00:31:46,360 Speaker 1: video conferencing and everything will be on video, and nothing 615 00:31:46,400 --> 00:31:49,960 Speaker 1: happened until quite recently, really until the pandemic, I would say. 616 00:31:50,000 --> 00:31:53,440 Speaker 1: Nobody was really doing it. And I wonder if this 617 00:31:53,560 --> 00:31:56,440 Speaker 1: is just some kind of delay and eventually they will 618 00:31:56,480 --> 00:31:59,880 Speaker 1: find a way to do it. I mean, I was 619 00:32:00,240 --> 00:32:02,400 Speaker 1: worried about that.
I was thinking, I mean, not only 620 00:32:02,400 --> 00:32:04,720 Speaker 1: are we on our screens all the time, if we're completely in 621 00:32:04,760 --> 00:32:08,360 Speaker 1: a different reality, there's no interaction between us whatsoever. 622 00:32:09,360 --> 00:32:13,360 Speaker 1: But I guess I'm relieved that this is delayed, but 623 00:32:13,440 --> 00:32:16,479 Speaker 1: I'm not sure it's delayed forever, right? So do you 624 00:32:16,480 --> 00:32:19,280 Speaker 1: think, in a way, right, if like the very cynical 625 00:32:19,360 --> 00:32:23,240 Speaker 1: version is like, we're on this very increasingly intense path 626 00:32:23,400 --> 00:32:27,240 Speaker 1: where technology is going to make things easier or whatever, 627 00:32:27,320 --> 00:32:31,560 Speaker 1: or communication streamlined, which really means less human interaction, and 628 00:32:31,680 --> 00:32:34,800 Speaker 1: because of maybe that feeling of isolation, these headsets are 629 00:32:34,800 --> 00:32:37,560 Speaker 1: going to offer us some feeling of like humanity again? 630 00:32:37,720 --> 00:32:39,640 Speaker 1: Is that like maybe where it's gonna, like, the hook 631 00:32:39,640 --> 00:32:41,560 Speaker 1: point we'll return for? They're like, hey, remember we 632 00:32:41,680 --> 00:32:45,760 Speaker 1: used to see people, put this headset on. I feel 633 00:32:45,760 --> 00:32:48,200 Speaker 1: like this is, I mean, that's why it was invented, 634 00:32:48,320 --> 00:32:50,840 Speaker 1: is because of like rich people's playgrounds, right? Like they 635 00:32:50,840 --> 00:32:53,480 Speaker 1: are isolated and they feel weird, and so they're like, 636 00:32:53,840 --> 00:32:56,120 Speaker 1: how are we going to perform our manhunts on 637 00:32:56,200 --> 00:32:58,200 Speaker 1: islands during the next pandemic? You know what I mean, 638 00:32:58,280 --> 00:33:01,360 Speaker 1: let's go VR with it. But if you've ever used 639 00:33:01,680 --> 00:33:04,440 Speaker 1: a VR headset, like, I've played, I've done like games 640 00:33:04,520 --> 00:33:07,520 Speaker 1: like at arcades and stuff with them, and it, like, it 641 00:33:07,560 --> 00:33:09,600 Speaker 1: literally makes me throw up. Like some of them have 642 00:33:09,640 --> 00:33:14,760 Speaker 1: been such unpleasant experiences, like fake roller coasters and stuff, 643 00:33:14,800 --> 00:33:17,560 Speaker 1: that I got like really sick after, in a way 644 00:33:17,600 --> 00:33:20,880 Speaker 1: that I never had gotten on like real roller coasters, 645 00:33:20,960 --> 00:33:23,520 Speaker 1: you know what I mean. I'm like, I'm like, some 646 00:33:23,600 --> 00:33:25,040 Speaker 1: of it just feels, I feel like some of it 647 00:33:25,040 --> 00:33:27,120 Speaker 1: could be used in Guantanamo, is what I'm saying. Like, 648 00:33:27,160 --> 00:33:29,640 Speaker 1: I feel like that's where this is gonna be weaponized 649 00:33:29,760 --> 00:33:35,880 Speaker 1: eventually. For true, yeah, for true psychological terror ops. Yeah. 650 00:33:35,880 --> 00:33:37,800 Speaker 1: But Gaia, like, so, how do you sort of see 651 00:33:37,840 --> 00:33:40,280 Speaker 1: it evolving? Because, like I think, from our perspective right now, 652 00:33:40,400 --> 00:33:44,400 Speaker 1: is we see it coming from like, like an employer 653 00:33:44,480 --> 00:33:46,880 Speaker 1: class that goes like, this is the future of work, 654 00:33:47,040 --> 00:33:49,240 Speaker 1: you know.
That's how Meta was sort of 655 00:33:49,240 --> 00:33:51,960 Speaker 1: centering the metaverse in the beginning, was like this is 656 00:33:52,000 --> 00:33:54,840 Speaker 1: going to replace how you meet and how teams interact 657 00:33:54,920 --> 00:33:57,520 Speaker 1: and things like that. And most workers were like, no, 658 00:33:58,520 --> 00:34:01,959 Speaker 1: that, that sounds like added, it's sort of like added 659 00:34:02,000 --> 00:34:04,960 Speaker 1: intensity or a level of connection that isn't necessary given 660 00:34:05,200 --> 00:34:08,400 Speaker 1: that we're able to work together online. But how do 661 00:34:08,440 --> 00:34:11,960 Speaker 1: you sort of see this evolution happening where, like, 662 00:34:12,000 --> 00:34:13,680 Speaker 1: we're like, yeah, guess what, everybody's got one of 663 00:34:13,719 --> 00:34:16,600 Speaker 1: these now? So, you know, I thought this was heading 664 00:34:16,640 --> 00:34:18,880 Speaker 1: this way. Actually, if you'd asked me, you know, 665 00:34:19,000 --> 00:34:21,759 Speaker 1: six months ago, a year ago, I was sure this, 666 00:34:22,480 --> 00:34:26,359 Speaker 1: together with smart cities where we're connected everywhere and the phone 667 00:34:26,400 --> 00:34:29,799 Speaker 1: is used for everything, is where we're headed. So I 668 00:34:29,800 --> 00:34:33,120 Speaker 1: am surprised, and I'm always wondering if what happened here 669 00:34:33,320 --> 00:34:36,480 Speaker 1: is that people were somehow struck by the pandemic, because 670 00:34:36,480 --> 00:34:39,520 Speaker 1: they sort of felt, you know, there were lockdowns, they 671 00:34:39,560 --> 00:34:41,840 Speaker 1: were at home, they felt what it means to be 672 00:34:41,880 --> 00:34:45,920 Speaker 1: on the screen all the time not seeing people. They 673 00:34:46,000 --> 00:34:49,000 Speaker 1: felt how their bodies felt, they felt how their minds felt. 674 00:34:49,360 --> 00:34:54,160 Speaker 1: And maybe they're realizing more than before, because when I 675 00:34:54,200 --> 00:34:56,880 Speaker 1: was speaking before the pandemic to people, it was mostly 676 00:34:56,920 --> 00:35:00,359 Speaker 1: parents who were worried, and something has shifted. So 677 00:35:00,400 --> 00:35:05,320 Speaker 1: I'm wondering if that's, I'm wondering if it's technological issues 678 00:35:05,440 --> 00:35:08,520 Speaker 1: or it's partly how people feel about it. And I 679 00:35:08,560 --> 00:35:11,800 Speaker 1: really really hope it's the latter. Hm. I feel like, 680 00:35:11,840 --> 00:35:13,239 Speaker 1: at the end of the day, we are people and 681 00:35:13,320 --> 00:35:15,880 Speaker 1: we need human connection, and our brains will explode if 682 00:35:15,880 --> 00:35:19,680 Speaker 1: we don't have that. You know, we have access to 683 00:35:19,719 --> 00:35:21,880 Speaker 1: all of these technologies, but at the same time, people 684 00:35:21,880 --> 00:35:24,120 Speaker 1: really wanted to just go to restaurants during the pandemic 685 00:35:24,160 --> 00:35:27,200 Speaker 1: and eat together.
So yeah, and hear like human laughter, 686 00:35:27,440 --> 00:35:29,880 Speaker 1: and I remember so much how I longed to like 687 00:35:30,480 --> 00:35:33,000 Speaker 1: eat with a group of like, you know, like eat 688 00:35:33,040 --> 00:35:36,160 Speaker 1: off someone's plate, like, let me try that, you know. Yeah, 689 00:35:36,200 --> 00:35:39,479 Speaker 1: to like get a hug from someone, you know. Oh man, 690 00:35:39,719 --> 00:35:42,640 Speaker 1: in those early days I'd just burst into tears. I was doing 691 00:35:42,760 --> 00:35:45,880 Speaker 1: like turn-away hugs from like my grandfather and stuff, like, 692 00:35:46,000 --> 00:35:47,680 Speaker 1: wearing all this PPE. It was like, all right, you 693 00:35:47,640 --> 00:35:50,960 Speaker 1: gotta go, man, you're almost ninety, yeah, can't mess around. But 694 00:35:51,040 --> 00:35:53,560 Speaker 1: again it's true, like, and we'll get into this um 695 00:35:53,719 --> 00:35:55,839 Speaker 1: after the break, we'll talk a little bit about your book, Gaia, 696 00:35:55,880 --> 00:35:58,440 Speaker 1: because I do feel like we are in this experience 697 00:35:58,480 --> 00:36:01,960 Speaker 1: as people where the technology is making our lives easier. 698 00:36:01,960 --> 00:36:04,520 Speaker 1: And for the record, everyone, like, we're all, no one 699 00:36:04,560 --> 00:36:07,799 Speaker 1: here like hates technology. But the thing that we're up 700 00:36:07,840 --> 00:36:09,799 Speaker 1: against now is like, we're starting to see how it's 701 00:36:09,800 --> 00:36:13,080 Speaker 1: eroding at these like little things and increasing this feeling 702 00:36:13,080 --> 00:36:18,080 Speaker 1: of alienation, and yet we still are using it, like, 703 00:36:18,160 --> 00:36:21,239 Speaker 1: we just can't quit it. And your book sort of 704 00:36:21,400 --> 00:36:23,800 Speaker 1: gets to the crux of that feeling. We'll talk about 705 00:36:23,840 --> 00:36:37,320 Speaker 1: that right after the break. We'll be right back. So, 706 00:36:38,000 --> 00:36:41,680 Speaker 1: you know, talking about your book, Gaia, Unwired, you know, 707 00:36:42,360 --> 00:36:46,440 Speaker 1: basically gaining control over addictive technologies. It really struck me 708 00:36:46,920 --> 00:36:50,120 Speaker 1: just because again, for many younger people, like the allure 709 00:36:50,120 --> 00:36:53,279 Speaker 1: of the smartphone isn't an obscure phenomenon. Like we talk 710 00:36:53,360 --> 00:36:56,719 Speaker 1: constantly about unplugging and the benefits it's had on our 711 00:36:56,760 --> 00:37:00,319 Speaker 1: own mental health, like just anecdotally amongst ourselves. Your book 712 00:37:00,360 --> 00:37:02,080 Speaker 1: addresses sort of one of the main, I feel 713 00:37:02,080 --> 00:37:05,320 Speaker 1: like, cycles of emotions many of us have in relation 714 00:37:05,400 --> 00:37:08,719 Speaker 1: to using screens, where like we get motivated to use 715 00:37:08,760 --> 00:37:11,640 Speaker 1: the screen less and then we're like, oh, this feels great. 716 00:37:11,680 --> 00:37:14,080 Speaker 1: Then we are sucked right back in and we feel 717 00:37:14,080 --> 00:37:16,759 Speaker 1: like shit. And it feels like, like we kind of 718 00:37:16,760 --> 00:37:18,520 Speaker 1: come down to, like, I don't know, it's, wow, like 719 00:37:18,560 --> 00:37:21,319 Speaker 1: I just lack the self-control.
It's sort of like 720 00:37:21,360 --> 00:37:25,319 Speaker 1: the final sort of sentiment people land on in like 721 00:37:25,400 --> 00:37:28,640 Speaker 1: grappling with technology. As a new parent too, I'm just 722 00:37:28,840 --> 00:37:32,080 Speaker 1: again very aware about screen time, and I also feel 723 00:37:32,080 --> 00:37:34,239 Speaker 1: that like there's a certain futility around it too with 724 00:37:34,280 --> 00:37:36,279 Speaker 1: a lot of parents, where they're like, I don't know, 725 00:37:36,400 --> 00:37:38,520 Speaker 1: you know, sooner or later it's just going to be 726 00:37:38,560 --> 00:37:41,799 Speaker 1: normal for them, and maybe I'm trying to delay the inevitable. 727 00:37:41,800 --> 00:37:44,719 Speaker 1: But also with apps, I've seen how much it can help, 728 00:37:44,880 --> 00:37:46,920 Speaker 1: you know, put a kid at ease and allow somebody 729 00:37:46,920 --> 00:37:50,080 Speaker 1: to do something else. Where do you feel like parents 730 00:37:50,080 --> 00:37:52,560 Speaker 1: and people kind of fit into this mix, where we're 731 00:37:52,640 --> 00:37:54,960 Speaker 1: like the world is spinning around us, and we're like, 732 00:37:55,320 --> 00:37:58,839 Speaker 1: am I bad? Or what is happening? Like, are our 733 00:37:59,160 --> 00:38:02,160 Speaker 1: brains meant for this? Or are we up against something a little 734 00:38:02,160 --> 00:38:05,160 Speaker 1: bit more intense than we realized? I think we're in an 735 00:38:05,160 --> 00:38:09,319 Speaker 1: interesting place right now, because we have a lot of information. 736 00:38:09,760 --> 00:38:12,040 Speaker 1: We have lots of information from whistleblowers from the 737 00:38:12,080 --> 00:38:15,759 Speaker 1: tech companies telling us how tech companies are addicting us 738 00:38:15,760 --> 00:38:20,400 Speaker 1: to keep us online for longer. But still we keep 739 00:38:20,440 --> 00:38:24,360 Speaker 1: blaming ourselves. We keep thinking it's our fault and we 740 00:38:24,400 --> 00:38:28,760 Speaker 1: are unable to stop spending time online. We blame our kids, 741 00:38:29,000 --> 00:38:33,560 Speaker 1: we blame our families. The problem is that that's exactly 742 00:38:33,600 --> 00:38:36,399 Speaker 1: what the tech companies want us to do, and that's 743 00:38:36,400 --> 00:38:39,279 Speaker 1: why they're giving us all these tools to make us 744 00:38:39,360 --> 00:38:42,919 Speaker 1: feel like we're in control. So once the evidence came 745 00:38:42,920 --> 00:38:46,399 Speaker 1: out that, that they're trying to addict us, they 746 00:38:46,600 --> 00:38:49,680 Speaker 1: gave us these digital well-being tools. You know, everybody 747 00:38:49,719 --> 00:38:53,759 Speaker 1: who has an iPhone has this Screen Time, so you 748 00:38:53,800 --> 00:38:56,120 Speaker 1: know how much time you are online, or you can 749 00:38:56,200 --> 00:38:58,440 Speaker 1: even limit the time on your apps, or you can 750 00:38:58,520 --> 00:39:03,360 Speaker 1: make your screen gray, or warnings, you could put warnings 751 00:39:03,360 --> 00:39:07,359 Speaker 1: on Instagram, not to mention parental controls, which 752 00:39:07,360 --> 00:39:12,560 Speaker 1: are getting more and more complicated.
But the thing 753 00:39:12,719 --> 00:39:15,719 Speaker 1: is, these are just there so we all feel like 754 00:39:15,800 --> 00:39:18,080 Speaker 1: we're doing something, but they're not really there to make 755 00:39:18,160 --> 00:39:21,319 Speaker 1: us succeed, because they do not really change the most 756 00:39:21,320 --> 00:39:26,000 Speaker 1: addictive features in our devices or in our apps. They 757 00:39:26,040 --> 00:39:30,160 Speaker 1: are just there so we will think that it is 758 00:39:30,200 --> 00:39:34,480 Speaker 1: our fault, exactly like you were describing. And again, how, 759 00:39:35,719 --> 00:39:38,759 Speaker 1: in this world, right, if we're pivoting from, it's not 760 00:39:38,920 --> 00:39:41,080 Speaker 1: us, because it's like, you know, you've likened it to 761 00:39:41,120 --> 00:39:44,160 Speaker 1: like the tobacco industry, where like they know, they know it, 762 00:39:44,360 --> 00:39:47,760 Speaker 1: they know it's bad for everybody, but then the gas- 763 00:39:47,800 --> 00:39:50,719 Speaker 1: lighting starts, and it's like, I don't know about them. 764 00:39:50,760 --> 00:39:52,560 Speaker 1: I mean, we know what's going on, but we're not 765 00:39:52,640 --> 00:39:54,960 Speaker 1: going to actually cop to it. Do you see like 766 00:39:54,960 --> 00:39:58,160 Speaker 1: a similar evolution where, on some level, I mean, because 767 00:39:58,160 --> 00:40:00,640 Speaker 1: I feel like if anything, like you're saying, like, the 768 00:40:00,760 --> 00:40:03,680 Speaker 1: markets and capitalism have a very good way of shielding themselves 769 00:40:03,800 --> 00:40:06,200 Speaker 1: from like having the profits go down, so they'll find 770 00:40:06,239 --> 00:40:07,600 Speaker 1: a way, like you're saying, to be like, no, no, 771 00:40:07,640 --> 00:40:09,480 Speaker 1: you actually have control. It's not the, it's not the 772 00:40:09,560 --> 00:40:12,080 Speaker 1: other things that were just identified clear as day by 773 00:40:12,120 --> 00:40:15,080 Speaker 1: someone working on it. You have control. Now, you know, 774 00:40:15,120 --> 00:40:16,920 Speaker 1: what, what does that sort of battle look like? 775 00:40:17,719 --> 00:40:20,839 Speaker 1: I think we're fighting the battle in the wrong place. 776 00:40:20,880 --> 00:40:23,800 Speaker 1: Now we're fighting with ourselves, we're fighting with our families. 777 00:40:23,800 --> 00:40:26,960 Speaker 1: I think we have to shift to the public sphere. 778 00:40:27,120 --> 00:40:31,040 Speaker 1: And it is already happening. There's lots of action already 779 00:40:31,080 --> 00:40:35,680 Speaker 1: taking place. There are parents suing social media for addicting 780 00:40:35,680 --> 00:40:40,279 Speaker 1: their kids, causing mental harm, parents suing game manufacturers. There 781 00:40:40,280 --> 00:40:45,200 Speaker 1: are things happening. But I think people have to understand 782 00:40:45,239 --> 00:40:49,520 Speaker 1: it's not just for lawyers, it's for everybody, because everybody can 783 00:40:49,640 --> 00:40:54,640 Speaker 1: shift what they're doing to the collective sphere. Parents can 784 00:40:54,719 --> 00:40:58,720 Speaker 1: go to schools. They have an influence on what schools 785 00:40:58,719 --> 00:41:01,600 Speaker 1: are doing. Schools are now maximizing technology in the classroom 786 00:41:01,640 --> 00:41:06,440 Speaker 1: because that's the federal policy. Well, this could be changed.
787 00:41:06,480 --> 00:41:10,080 Speaker 1: You can decide whether something is useful in the classroom, 788 00:41:10,200 --> 00:41:13,200 Speaker 1: whether certain technology is useful or not. You can decide whether 789 00:41:13,239 --> 00:41:16,040 Speaker 1: you want the kids to be on their cell phones 790 00:41:16,120 --> 00:41:19,200 Speaker 1: during recess instead of talking to each other. So those are 791 00:41:19,400 --> 00:41:22,719 Speaker 1: the spaces where you can change things, and you can 792 00:41:22,840 --> 00:41:26,200 Speaker 1: change norms. Something you're saying, you know, maybe it's already happening, 793 00:41:26,239 --> 00:41:29,120 Speaker 1: but there are new norms evolving every day which are making 794 00:41:29,160 --> 00:41:32,680 Speaker 1: it worse. I was on vacation with my kids and 795 00:41:32,760 --> 00:41:35,600 Speaker 1: there was a family in the pool, and my son 796 00:41:35,760 --> 00:41:38,360 Speaker 1: was calling me to look at this. There were two girls, 797 00:41:38,360 --> 00:41:41,279 Speaker 1: I think nine and eleven, and their parents gave them 798 00:41:41,360 --> 00:41:44,279 Speaker 1: these plastic pads to put their iPhones inside so they 799 00:41:44,320 --> 00:41:49,360 Speaker 1: can use the iPhones in the pool instead of playing. Now, 800 00:41:50,239 --> 00:41:53,759 Speaker 1: you know, this is an evolving trend. Just like a few 801 00:41:53,840 --> 00:41:57,359 Speaker 1: years ago it started to evolve to take the kids out 802 00:41:57,400 --> 00:42:00,280 Speaker 1: with iPads to the restaurant, now so many people are 803 00:42:00,360 --> 00:42:03,879 Speaker 1: doing it. So there are ways, too. I think things 804 00:42:03,920 --> 00:42:05,440 Speaker 1: are going to change. I think there are going to be 805 00:42:05,520 --> 00:42:09,719 Speaker 1: lots of legal action and tech companies will be restricted 806 00:42:09,719 --> 00:42:11,760 Speaker 1: in what they can do, but it will take some years. 807 00:42:12,239 --> 00:42:16,600 Speaker 1: So people can do things as business owners. I 808 00:42:16,600 --> 00:42:19,320 Speaker 1: mean, in New York City, all the airports. If you 809 00:42:19,400 --> 00:42:23,000 Speaker 1: go to an airport, there are four iPads at every table. 810 00:42:23,480 --> 00:42:26,439 Speaker 1: There is no way you can have a conversation there. 811 00:42:27,200 --> 00:42:31,680 Speaker 1: So this is architecture for overuse. You can change it. If 812 00:42:31,719 --> 00:42:34,200 Speaker 1: you own a restaurant, you can change it. You don't 813 00:42:34,239 --> 00:42:37,080 Speaker 1: have to use iPads, you don't have to use QR codes, 814 00:42:37,440 --> 00:42:39,520 Speaker 1: so people will take out their phones the moment they 815 00:42:39,520 --> 00:42:43,040 Speaker 1: sit down. So I think there are, once 816 00:42:43,239 --> 00:42:45,360 Speaker 1: people are aware, a lot of things they can do 817 00:42:46,520 --> 00:42:49,800 Speaker 1: until things change, and I do believe they're already 818 00:42:49,880 --> 00:42:53,160 Speaker 1: starting to change. Have you heard of like the third 819 00:42:53,239 --> 00:42:57,000 Speaker 1: spaces concept or theory, about how there needs to be 820 00:42:57,080 --> 00:43:00,560 Speaker 1: a place for people outside of like work and the home, 821 00:43:00,680 --> 00:43:04,560 Speaker 1: for them to like gather and to exchange information and, 822 00:43:04,880 --> 00:43:07,600 Speaker 1: you know, basically develop culture.
I think a lot of 823 00:43:07,640 --> 00:43:11,239 Speaker 1: people are saying that the phones are now teenagers' and 824 00:43:11,400 --> 00:43:14,560 Speaker 1: kids' third spaces, because a lot of other third spaces 825 00:43:14,600 --> 00:43:19,280 Speaker 1: have become unsafe and inaccessible to them. For example, 826 00:43:19,760 --> 00:43:21,880 Speaker 1: my friend posted about how in New York, when she 827 00:43:22,000 --> 00:43:25,600 Speaker 1: was growing up, it became illegal for kids under eighteen 828 00:43:25,680 --> 00:43:27,879 Speaker 1: to go hang out at the mall. And apparently that's 829 00:43:27,880 --> 00:43:30,400 Speaker 1: a thing that's been happening a lot. Whereas when I 830 00:43:30,440 --> 00:43:32,200 Speaker 1: was growing up, that's where we would go and hang 831 00:43:32,200 --> 00:43:33,839 Speaker 1: out with our friends. We'd go to the mall, we'd 832 00:43:33,880 --> 00:43:37,000 Speaker 1: hang out and go to Jamba Juice. You'd, like, I'd 833 00:43:37,440 --> 00:43:39,759 Speaker 1: try not to leave the bookstore with too many purchases. 834 00:43:40,200 --> 00:43:43,640 Speaker 1: You know, like, you'd hang out. But now it's 835 00:43:43,640 --> 00:43:46,880 Speaker 1: considered like loitering or whatever. Like a lot of these 836 00:43:47,040 --> 00:43:51,799 Speaker 1: external places that are meant for cultural exchange and, you know, 837 00:43:51,920 --> 00:43:56,840 Speaker 1: kids to grow up, are becoming unavailable to them. And also, honestly, 838 00:43:56,960 --> 00:44:00,759 Speaker 1: with like mass shootings and all of that, people get 839 00:44:00,800 --> 00:44:03,120 Speaker 1: more scared of going out in public, and it seems 840 00:44:03,160 --> 00:44:05,560 Speaker 1: to be safer to have them just like inside on 841 00:44:05,600 --> 00:44:08,080 Speaker 1: their phones, which, it doesn't necessarily, you know, there are 842 00:44:08,080 --> 00:44:10,399 Speaker 1: other risks with that. So I think it would be, 843 00:44:10,560 --> 00:44:13,840 Speaker 1: I like how you highlighted it, that it's going to 844 00:44:13,880 --> 00:44:17,040 Speaker 1: be an effort on these places and these people who 845 00:44:17,080 --> 00:44:20,239 Speaker 1: are in charge of those areas, because it really does 846 00:44:20,360 --> 00:44:24,640 Speaker 1: require cooperation between them and between the companies that are 847 00:44:24,760 --> 00:44:28,239 Speaker 1: like forcing their technology on people. Yeah, and I think 848 00:44:28,280 --> 00:44:31,520 Speaker 1: a lot. Municipalities can do a lot, because they can 849 00:44:31,600 --> 00:44:35,040 Speaker 1: create spaces for people to hang out and for kids 850 00:44:35,120 --> 00:44:39,640 Speaker 1: to walk to. If you have places to be together, 851 00:44:40,360 --> 00:44:43,280 Speaker 1: it's very different than if you go home after school 852 00:44:43,320 --> 00:44:45,920 Speaker 1: and sit in your bedroom with your phone. Because the 853 00:44:46,000 --> 00:44:49,920 Speaker 1: statistics are shocking. I mean, kids are meeting, I 854 00:44:49,960 --> 00:44:52,759 Speaker 1: think, fifty percent less than they used to in 855 00:44:52,800 --> 00:44:59,359 Speaker 1: the eighties, and partying, I think, thirty-three percent less. 856 00:45:00,200 --> 00:45:04,200 Speaker 1: They're not getting together. So you can, by design, create 857 00:45:04,320 --> 00:45:07,799 Speaker 1: spaces for people to get together.
So I mean, if 858 00:45:07,800 --> 00:45:10,360 Speaker 1: you think about bars not having cigarettes today, this 859 00:45:10,920 --> 00:45:15,720 Speaker 1: seemed so implausible, you know, before it happened, and things 860 00:45:15,840 --> 00:45:19,120 Speaker 1: look different now. Can you think about a bar without 861 00:45:19,160 --> 00:45:22,759 Speaker 1: every person having their phone next to them? Yeah, it's, 862 00:45:22,880 --> 00:45:27,480 Speaker 1: it's, it feels like, in a way, like it's almost 863 00:45:27,560 --> 00:45:30,759 Speaker 1: futile to try and reverse things, like, in a way, 864 00:45:30,760 --> 00:45:34,319 Speaker 1: because, like, for example, one of the last concerts I 865 00:45:34,360 --> 00:45:39,359 Speaker 1: went to, like, amazing show, and there are people experiencing 866 00:45:39,360 --> 00:45:43,279 Speaker 1: the concert through their cell phone, like, so much, like, 867 00:45:43,560 --> 00:45:47,520 Speaker 1: watching it through the phone. Such a pet peeve of mine. I saw 868 00:45:47,600 --> 00:45:50,520 Speaker 1: Tom Petty in person, and I was in like 869 00:45:50,560 --> 00:45:52,960 Speaker 1: the first or second row, and this girl next to 870 00:45:53,000 --> 00:45:55,000 Speaker 1: me literally was on her phone and she was like, 871 00:45:55,239 --> 00:45:57,040 Speaker 1: oh my god, this is such a great song to 872 00:45:57,160 --> 00:45:59,920 Speaker 1: delete pictures to. And I was like, Tom Petty is 873 00:46:00,160 --> 00:46:03,160 Speaker 1: on stage right now! Right, or that we've lost it a 874 00:46:03,320 --> 00:46:05,200 Speaker 1: bit, like even like with the example of people, like, 875 00:46:05,239 --> 00:46:09,239 Speaker 1: in a pool, right, like that swimming isn't enough on 876 00:46:09,239 --> 00:46:11,960 Speaker 1: some level, that like the just being able to play 877 00:46:11,960 --> 00:46:15,000 Speaker 1: in the water is not like stimulating enough, that we're 878 00:46:15,000 --> 00:46:17,480 Speaker 1: now adding, like, well, what if we augmented that with 879 00:46:17,560 --> 00:46:21,000 Speaker 1: some like audio visual stuff too? That, I'm like, because again, 880 00:46:21,080 --> 00:46:24,279 Speaker 1: I think this is what feels difficult for people, like, 881 00:46:24,440 --> 00:46:27,160 Speaker 1: even myself. I was, I remember when I got my 882 00:46:27,239 --> 00:46:30,600 Speaker 1: last vaccination, I forgot my phone in the car, and 883 00:46:30,640 --> 00:46:32,279 Speaker 1: then when you go in there, you gotta wait like 884 00:46:32,320 --> 00:46:35,000 Speaker 1: twenty minutes after, like, just to chill out. 885 00:46:35,000 --> 00:46:37,520 Speaker 1: And I was like, first, I was panicked, because I'm like, 886 00:46:37,680 --> 00:46:41,800 Speaker 1: I haven't had to wait without my phone in ages, 887 00:46:42,160 --> 00:46:45,719 Speaker 1: and there was a moment of sincere, like, fear, 888 00:46:45,920 --> 00:46:48,360 Speaker 1: not fear, but like, I was, I became uneasy, and 889 00:46:48,400 --> 00:46:51,719 Speaker 1: I didn't like that I felt so disarmed, to just 890 00:46:51,840 --> 00:46:54,839 Speaker 1: exist in a space without a fucking screen to look at. 891 00:46:55,239 --> 00:46:56,920 Speaker 1: And it was funny, because I sat down in the 892 00:46:57,000 --> 00:46:59,880 Speaker 1: chair, like, in the, you know, pain relief medicine aisle, 893 00:47:00,160 --> 00:47:02,040 Speaker 1: and I was just like, I felt like a kid again.
894 00:47:02,040 --> 00:47:04,440 Speaker 1: It was like, I'm like reading all the labels. No, 895 00:47:04,520 --> 00:47:06,120 Speaker 1: I'm just like, I'm gonna read all the labels. Yeah, 896 00:47:06,200 --> 00:47:07,799 Speaker 1: that's it, I go. If I got twenty minutes to kill, 897 00:47:07,840 --> 00:47:10,080 Speaker 1: I'm gonna start reading labels and just start being in 898 00:47:10,160 --> 00:47:13,320 Speaker 1: my own thoughts again. And it was interesting how foreign 899 00:47:13,480 --> 00:47:15,960 Speaker 1: that felt to me, even though I was, you know, 900 00:47:15,960 --> 00:47:18,080 Speaker 1: I was born in the eighties, like, I'm an older millennial. 901 00:47:18,160 --> 00:47:20,879 Speaker 1: I grew up in the pre-internet time too, which 902 00:47:20,920 --> 00:47:23,960 Speaker 1: felt like the most... Like, all humans are probably wired 903 00:47:24,000 --> 00:47:25,759 Speaker 1: to want to do this and connect to other people, 904 00:47:25,800 --> 00:47:29,080 Speaker 1: but we've definitely, it's become so normal, to the 905 00:47:29,120 --> 00:47:32,680 Speaker 1: point where feeling human feels foreign. And that's what's really 906 00:47:32,719 --> 00:47:34,840 Speaker 1: scary to me. Well, I don't think we get to 907 00:47:34,880 --> 00:47:37,320 Speaker 1: go back in time, but I think we can balance 908 00:47:37,360 --> 00:47:40,040 Speaker 1: things better. Imagine if you went to a concert and the 909 00:47:40,080 --> 00:47:43,400 Speaker 1: concert hall said no phones, so nobody could take out 910 00:47:43,440 --> 00:47:45,880 Speaker 1: their phones to take pictures. Maybe the phones would be 911 00:47:45,880 --> 00:47:48,920 Speaker 1: in their bags. That is changing the norms in a 912 00:47:49,000 --> 00:47:53,239 Speaker 1: way. It could be done, and that will affect everybody. 913 00:47:53,840 --> 00:47:55,840 Speaker 1: But I guess, like, inherently, right, there's an argument 914 00:47:55,840 --> 00:47:58,239 Speaker 1: to say, like, well, if someone actually needed to contact 915 00:47:58,320 --> 00:48:01,080 Speaker 1: me during it, then that would be, like, why would, 916 00:48:01,160 --> 00:48:02,759 Speaker 1: how would, like, how do you find a way that 917 00:48:02,800 --> 00:48:04,640 Speaker 1: makes it so it's not just sort of like across 918 00:48:04,680 --> 00:48:08,080 Speaker 1: the board no phones, but we're able to... Like, I 919 00:48:08,080 --> 00:48:10,600 Speaker 1: guess that's the hard part of it. Like, there are pouches 920 00:48:10,640 --> 00:48:13,799 Speaker 1: for the phones, no phones for taking pictures of the show, 921 00:48:14,080 --> 00:48:19,760 Speaker 1: but you can keep it with you. But yeah, that's museums. 922 00:48:19,760 --> 00:48:21,520 Speaker 1: They do that too. Like I tried to take a 923 00:48:21,560 --> 00:48:24,720 Speaker 1: picture of a painting recently and it was like, no, no, no. Yeah, 924 00:48:24,719 --> 00:48:27,040 Speaker 1: but it's also like, that painting is gonna be online. 925 00:48:27,080 --> 00:48:29,120 Speaker 1: Nobody's gonna, you know... like, I don't need to take 926 00:48:29,120 --> 00:48:32,120 Speaker 1: a picture of that painting. Yeah. One thing that helped 927 00:48:32,480 --> 00:48:35,759 Speaker 1: me is like literally spending more time with people and 928 00:48:35,800 --> 00:48:38,200 Speaker 1: like making an effort to do that, to like leave 929 00:48:38,239 --> 00:48:40,359 Speaker 1: my home and go spend time with people.
I'm trying 930 00:48:40,360 --> 00:48:42,680 Speaker 1: to do that like once a day because I work remotely. 931 00:48:42,960 --> 00:48:44,840 Speaker 1: And then the other thing is, like, I have dogs, 932 00:48:44,920 --> 00:48:46,680 Speaker 1: and when I walk them, one of them's a little 933 00:48:46,760 --> 00:48:48,920 Speaker 1: monster and he will try to eat stuff, and so 934 00:48:48,960 --> 00:48:50,960 Speaker 1: I have to like pay attention. And now I'm like, 935 00:48:51,160 --> 00:48:53,400 Speaker 1: I know where all the good sticks are, you know, 936 00:48:53,600 --> 00:48:56,120 Speaker 1: I know where all the great grasses are. I'm like going 937 00:48:56,160 --> 00:48:57,440 Speaker 1: back to when I was a kid and I was 938 00:48:57,480 --> 00:49:00,480 Speaker 1: just outside playing with my dogs. And it's so nice 939 00:49:00,560 --> 00:49:03,200 Speaker 1: to take a walk outside when it's sunny here, lay 940 00:49:03,719 --> 00:49:06,200 Speaker 1: out and be with my pets, you know, and talk to 941 00:49:06,239 --> 00:49:08,359 Speaker 1: people who have pets and connect with them that way, 942 00:49:08,520 --> 00:49:12,680 Speaker 1: you know. So having things and people around you that 943 00:49:12,800 --> 00:49:14,600 Speaker 1: take you out of your head and like give you 944 00:49:14,640 --> 00:49:18,520 Speaker 1: an external, like, grounding too. Community is so important. That's 945 00:49:18,520 --> 00:49:22,160 Speaker 1: why I love like mutual aid and like physical activities 946 00:49:22,200 --> 00:49:25,320 Speaker 1: that like help with community things, because that really nourishes 947 00:49:25,320 --> 00:49:27,719 Speaker 1: a part of you that cannot be nourished in the 948 00:49:27,760 --> 00:49:30,799 Speaker 1: same way through a screen. Yeah, I've, like, in the 949 00:49:30,840 --> 00:49:33,880 Speaker 1: same way, like walking around my neighborhood and doing the 950 00:49:34,000 --> 00:49:37,920 Speaker 1: unthinkable of talking to a stranger has been the one 951 00:49:38,320 --> 00:49:41,200 Speaker 1: thing that I've felt really balances things out, because there 952 00:49:41,239 --> 00:49:43,840 Speaker 1: was, I saw a recent study that like people 953 00:49:43,840 --> 00:49:48,040 Speaker 1: have a sincere fear of small talk falling apart, and 954 00:49:48,440 --> 00:49:51,319 Speaker 1: like, they, like people just have an innate sense, like, 955 00:49:51,440 --> 00:49:54,720 Speaker 1: that if people begin small talk and the conversation goes south, 956 00:49:54,800 --> 00:49:56,920 Speaker 1: that it's suddenly on them, and like people get their 957 00:49:56,920 --> 00:49:59,400 Speaker 1: own anxiety of not being able to like keep up 958 00:49:59,440 --> 00:50:02,120 Speaker 1: small talk, which is wild, because sometimes you're just 959 00:50:02,160 --> 00:50:05,000 Speaker 1: exchanging pleasantries and it doesn't have to be more than that. 960 00:50:05,040 --> 00:50:08,560 Speaker 1: But I feel like there's like these certain small things 961 00:50:08,600 --> 00:50:10,719 Speaker 1: you can be doing, but at the same time we're 962 00:50:10,760 --> 00:50:13,759 Speaker 1: developing like also bad habits around how we communicate too. 963 00:50:14,080 --> 00:50:16,799 Speaker 1: It's also crazy because everybody has a podcast, so how 964 00:50:16,840 --> 00:50:22,040 Speaker 1: are they scared? Us, of all people.
Yes, all right. Gaia, I interrupted, 965 00:50:22,880 --> 00:50:25,400 Speaker 1: I'm saying, it's also bad for people's well-being, because 966 00:50:25,440 --> 00:50:28,200 Speaker 1: there are studies which are showing that people's happiness, it's 967 00:50:28,239 --> 00:50:31,400 Speaker 1: not just about a long-term relationship, but also about 968 00:50:31,400 --> 00:50:35,480 Speaker 1: these small interactions, exactly the small talk, this eye 969 00:50:35,520 --> 00:50:40,200 Speaker 1: contact and a smile, that really change the way your 970 00:50:40,400 --> 00:50:43,319 Speaker 1: brain works and make you feel much better. And if 971 00:50:43,360 --> 00:50:45,759 Speaker 1: you're not doing that, and if you're just walking in 972 00:50:45,840 --> 00:50:50,280 Speaker 1: the street with your phone, looking at your pictures, answering texts, 973 00:50:50,320 --> 00:50:53,840 Speaker 1: then you're missing all these opportunities, which somehow just makes 974 00:50:53,880 --> 00:50:57,920 Speaker 1: you feel drained and tired and not good at the 975 00:50:58,000 --> 00:51:01,520 Speaker 1: end of the day. Just when it comes to policy, 976 00:51:01,880 --> 00:51:05,719 Speaker 1: I feel like there's a difficult path ahead, you know. 977 00:51:05,800 --> 00:51:09,000 Speaker 1: I mean, like we saw how seriously they took privacy 978 00:51:09,080 --> 00:51:12,279 Speaker 1: in Europe. The US is a little bit behind, a 979 00:51:12,360 --> 00:51:15,440 Speaker 1: lot behind. Save for maybe like California and 980 00:51:15,719 --> 00:51:18,040 Speaker 1: a couple other places. But like, just in the 981 00:51:18,440 --> 00:51:21,920 Speaker 1: recent hearing on TikTok, it's so clear, at least in 982 00:51:21,920 --> 00:51:25,160 Speaker 1: the narrow context of that hearing, like, they just 983 00:51:25,200 --> 00:51:29,360 Speaker 1: weren't even able or willing to discuss the broader problems 984 00:51:29,360 --> 00:51:31,480 Speaker 1: with social media, and it became sort of this like 985 00:51:31,640 --> 00:51:34,680 Speaker 1: very TikTok-specific thing. While we've seen, to your point, 986 00:51:34,719 --> 00:51:38,720 Speaker 1: whistleblowers at like Facebook, etc. say like these are real issues, 987 00:51:38,760 --> 00:51:41,680 Speaker 1: they've had hearings, but then we're not quite seeing the 988 00:51:41,800 --> 00:51:45,320 Speaker 1: follow-through. What, you know, what kind of like policy 989 00:51:45,360 --> 00:51:48,200 Speaker 1: proposals are out there do you think that would actually 990 00:51:48,239 --> 00:51:50,600 Speaker 1: benefit people in a way that sort of gets to 991 00:51:50,640 --> 00:51:54,200 Speaker 1: the heart of, you know, like, our overuse of technology, 992 00:51:54,200 --> 00:51:57,320 Speaker 1: obviously knowing the parts where it's helped make things easier 993 00:51:57,320 --> 00:52:00,719 Speaker 1: for us, but also addressing like the bigger issues of, 994 00:52:01,239 --> 00:52:05,280 Speaker 1: you know, feeling increasingly isolated and things like that? Yeah, 995 00:52:05,320 --> 00:52:07,720 Speaker 1: so I think, first of all, the issue with TikTok 996 00:52:07,840 --> 00:52:11,600 Speaker 1: is complicated, because it just brings out completely different issues 997 00:52:11,680 --> 00:52:14,360 Speaker 1: related to China, and it just, it's sort of marring 998 00:52:14,400 --> 00:52:18,160 Speaker 1: the whole debate.
Yeah, but there have been a lot 999 00:52:18,480 --> 00:52:22,680 Speaker 1: of bills, both federal and state, trying to target, 1000 00:52:23,120 --> 00:52:25,919 Speaker 1: first of all, the addictive features of the phones, because 1001 00:52:25,920 --> 00:52:28,880 Speaker 1: there are some features on our phones which are really up 1002 00:52:28,960 --> 00:52:36,960 Speaker 1: to no good. What are those, apart from the flashlight? The 1003 00:52:37,040 --> 00:52:45,080 Speaker 1: flashlight is useful, but, you know, for example, streaks on Snapchat. Yeah, 1004 00:52:45,160 --> 00:52:48,040 Speaker 1: they are there for nothing but to get you to 1005 00:52:48,080 --> 00:52:51,480 Speaker 1: go back to the platform. So kids have to send, it's kids, 1006 00:52:51,480 --> 00:52:53,600 Speaker 1: I don't think many adults use it, but you know, 1007 00:52:53,680 --> 00:52:56,399 Speaker 1: they have to send a streak to their friend, and 1008 00:52:56,760 --> 00:52:59,560 Speaker 1: if they get one within twenty-four hours, they've established 1009 00:52:59,560 --> 00:53:02,319 Speaker 1: a streak, and they keep accumulating them, and then they 1010 00:53:02,360 --> 00:53:04,879 Speaker 1: have a number, let's say one hundred and thirty-four. 1011 00:53:04,960 --> 00:53:07,920 Speaker 1: They have special badges, and they order friends by 1012 00:53:07,960 --> 00:53:11,040 Speaker 1: the number of streaks. Now, there's no requirement for any 1013 00:53:11,840 --> 00:53:14,319 Speaker 1: content in these streaks. You just have to make sure 1014 00:53:14,360 --> 00:53:17,120 Speaker 1: you send it. Why? Because you go to Snapchat and 1015 00:53:17,120 --> 00:53:21,439 Speaker 1: you see the ads. And if the kids miss a day, 1016 00:53:21,760 --> 00:53:26,280 Speaker 1: they lose everything and they lose all their friends. And 1017 00:53:26,320 --> 00:53:28,799 Speaker 1: that's why they get so upset when the parents take 1018 00:53:28,800 --> 00:53:31,799 Speaker 1: away the phone, because for them it's a huge thing. 1019 00:53:32,480 --> 00:53:36,840 Speaker 1: So all these kinds of features like Snapstreaks, which are 1020 00:53:37,000 --> 00:53:39,640 Speaker 1: just there to make you go back to the app 1021 00:53:39,680 --> 00:53:45,280 Speaker 1: or the device, are not needed. And there are bills 1022 00:53:45,280 --> 00:53:49,080 Speaker 1: which are trying to outlaw these kinds of features. Of course, 1023 00:53:49,400 --> 00:53:52,800 Speaker 1: the problem is they'll always come up with new ones, because 1024 00:53:52,840 --> 00:53:55,279 Speaker 1: the whole business model is based on our time and 1025 00:53:55,320 --> 00:53:58,160 Speaker 1: our data. They need us to be there for as 1026 00:53:58,160 --> 00:54:00,600 Speaker 1: long as possible so they can collect more data on us, 1027 00:54:00,640 --> 00:54:03,520 Speaker 1: so they can target advertising at us, and again we 1028 00:54:03,600 --> 00:54:05,200 Speaker 1: have to be there for longer so we can see 1029 00:54:05,239 --> 00:54:07,440 Speaker 1: the ads. So that's why I think it's not just 1030 00:54:07,640 --> 00:54:11,920 Speaker 1: one thing, not just one law, not just one wonderful 1031 00:54:11,960 --> 00:54:14,040 Speaker 1: Supreme Court case. It's not going to happen like that.
1032 00:54:14,480 --> 00:54:17,000 Speaker 1: It's going to happen from a mixture of things that 1033 00:54:17,040 --> 00:54:19,600 Speaker 1: are going to happen. Like, if you have the antitrust 1034 00:54:19,680 --> 00:54:24,360 Speaker 1: lawsuits against big tech, if, for example, the mergers 1035 00:54:24,840 --> 00:54:28,680 Speaker 1: between them... right now Meta owns Facebook, owns WhatsApp, 1036 00:54:28,800 --> 00:54:33,480 Speaker 1: owns Instagram. If they're broken up and there's more innovation, 1037 00:54:33,760 --> 00:54:37,640 Speaker 1: more competition, we might see different business models which 1038 00:54:37,640 --> 00:54:41,680 Speaker 1: are not based on our time. So, so that's another 1039 00:54:41,719 --> 00:54:44,239 Speaker 1: thing that's happening. I think, as I said earlier, I 1040 00:54:44,239 --> 00:54:47,920 Speaker 1: think the policy about maximizing the technology in the schools 1041 00:54:47,960 --> 00:54:53,440 Speaker 1: has to change, because if Minecraft is homework, then how 1042 00:54:53,480 --> 00:54:55,680 Speaker 1: can you prevent your kid from playing Minecraft at home? 1043 00:54:56,840 --> 00:55:01,200 Speaker 1: There are, so there are a lot. And then there are 1044 00:55:01,239 --> 00:55:04,000 Speaker 1: class actions. And if you look back at cigarettes, you know, 1045 00:55:04,200 --> 00:55:08,319 Speaker 1: we know cigarettes are bad, but it took decades to 1046 00:55:08,480 --> 00:55:12,759 Speaker 1: change things. It took class actions, and it took advertising, 1047 00:55:12,960 --> 00:55:18,120 Speaker 1: and it took warnings. And, and this is, this is 1048 00:55:18,160 --> 00:55:21,200 Speaker 1: going to be the same. It will take a lot 1049 00:55:21,239 --> 00:55:24,759 Speaker 1: of things at the same time. For example, let's say 1050 00:55:24,800 --> 00:55:28,600 Speaker 1: we have ratings for addictiveness. You know, so many parents 1051 00:55:28,680 --> 00:55:32,040 Speaker 1: download games for the kids, thinking, oh, Minecraft is an 1052 00:55:32,120 --> 00:55:36,640 Speaker 1: educational game. If they could see, before they download, this 1053 00:55:36,760 --> 00:55:40,520 Speaker 1: has a high rating for addictiveness, they may not do that. 1054 00:55:40,560 --> 00:55:44,160 Speaker 1: But not only that, the games company may change the 1055 00:55:44,200 --> 00:55:47,360 Speaker 1: game, because they want people to download the game, so 1056 00:55:47,400 --> 00:55:50,359 Speaker 1: they might take the addictive features out by themselves. So 1057 00:55:50,400 --> 00:55:54,680 Speaker 1: it's a matter of pressuring from many directions to move 1058 00:55:54,760 --> 00:55:58,880 Speaker 1: things. Right. And like, when you talk about cigarettes, I know, 1059 00:55:59,000 --> 00:56:01,080 Speaker 1: like, the earlier days, like maybe in like the 1060 00:56:01,160 --> 00:56:03,920 Speaker 1: fifties, where they knew. Where do you think we are? 1061 00:56:03,920 --> 00:56:06,839 Speaker 1: Like, have the first studies come out, they're like, oh, it's bad? 1062 00:56:07,480 --> 00:56:10,359 Speaker 1: Are we close to like the Truth dot com era 1063 00:56:10,480 --> 00:56:13,360 Speaker 1: of like anti-smoking ads? Or are we like a 1064 00:56:13,440 --> 00:56:16,279 Speaker 1: decade away? How, I mean, I guess now everything is 1065 00:56:16,320 --> 00:56:19,120 Speaker 1: moving faster. So maybe what took, you know, decades before, 1066 00:56:19,120 --> 00:56:22,200 Speaker 1: it might take seven years. I don't know.
Well, I 1067 00:56:22,239 --> 00:56:25,600 Speaker 1: think, what changed with cigarettes? Yeah, you're right, the first 1068 00:56:25,640 --> 00:56:28,560 Speaker 1: studies came out in the fifties. Nineteen sixty-four, the 1069 00:56:28,680 --> 00:56:31,799 Speaker 1: Surgeon General announces the health hazards. It's amazing it took 1070 00:56:31,800 --> 00:56:35,440 Speaker 1: so long, considering how bad cigarettes are, right? But from 1071 00:56:35,480 --> 00:56:39,640 Speaker 1: then on we saw, you know, advertising, we saw warnings. 1072 00:56:39,960 --> 00:56:43,280 Speaker 1: Things took a while, but they started shifting. Our problem 1073 00:56:43,360 --> 00:56:46,560 Speaker 1: is, right now we are still in the science wars. 1074 00:56:47,080 --> 00:56:50,000 Speaker 1: We do not have, you know, big medical 1075 00:56:50,040 --> 00:56:52,719 Speaker 1: organizations saying this is bad, especially for children, the 1076 00:56:52,760 --> 00:56:57,240 Speaker 1: evidence is in. We just have partial recommendations for small 1077 00:56:57,320 --> 00:57:01,680 Speaker 1: kids about screens. I think we have so much data 1078 00:57:01,760 --> 00:57:04,560 Speaker 1: over the last two or three years. But I think 1079 00:57:04,280 --> 00:57:08,440 Speaker 1: we're at a place where medical organizations, governmental entities can 1080 00:57:08,480 --> 00:57:12,120 Speaker 1: make these proclamations, and from that moment on, policy can 1081 00:57:12,200 --> 00:57:15,960 Speaker 1: proceed faster. And we already have a lot of action 1082 00:57:16,040 --> 00:57:20,280 Speaker 1: in place, so I hope that, yeah, things will move faster 1083 00:57:20,400 --> 00:57:26,200 Speaker 1: than with cigarettes. I think it will take some years. 1084 00:57:26,200 --> 00:57:29,080 Speaker 1: And that's why I think it's so important what people 1085 00:57:29,160 --> 00:57:31,760 Speaker 1: do in their communities, how they change their business norms, 1086 00:57:31,800 --> 00:57:35,520 Speaker 1: how they change their schools, because things have to happen 1087 00:57:35,840 --> 00:57:39,680 Speaker 1: at the same time. Otherwise it affects all of us, 1088 00:57:39,960 --> 00:57:43,320 Speaker 1: not to mention a whole generation of kids already in 1089 00:57:43,400 --> 00:57:46,959 Speaker 1: front of screens for a decade, plus a pandemic. Right. 1090 00:57:47,160 --> 00:57:50,840 Speaker 1: It's interesting that you were talking about this now, because 1091 00:57:51,120 --> 00:57:53,120 Speaker 1: ten years ago, when I was, a little bit 1092 00:57:53,160 --> 00:57:55,320 Speaker 1: more than ten years ago, when I was in college, people 1093 00:57:55,360 --> 00:57:59,520 Speaker 1: were failing out of a very great college for Minecraft. 1094 00:57:59,640 --> 00:58:02,120 Speaker 1: Like, it was a joke about how many students would 1095 00:58:02,160 --> 00:58:05,640 Speaker 1: fail because of their addiction to Minecraft specifically. So it's 1096 00:58:05,800 --> 00:58:08,960 Speaker 1: interesting that you use that example and that they haven't really 1097 00:58:09,080 --> 00:58:12,919 Speaker 1: changed much, it seems like, in the last, you know, decade. Yeah, 1098 00:58:13,000 --> 00:58:16,120 Speaker 1: why change? It's a winning formula. And what do you 1099 00:58:16,200 --> 00:58:18,960 Speaker 1: kind of say, because I'd like to, you know, people 1100 00:58:18,960 --> 00:58:21,720 Speaker 1: that are frustrated, parents that are frustrated, who are like, 1101 00:58:21,760 --> 00:58:25,120 Speaker 1: am I fucking up? Like,
am I bad? Because, what would 1102 00:58:25,120 --> 00:58:28,960 Speaker 1: you say to... Miles, my kid is too young, although 1103 00:58:29,000 --> 00:58:32,040 Speaker 1: he loves The Sopranos, I'm gonna say that. Whenever he 1104 00:58:32,080 --> 00:58:33,720 Speaker 1: turns his head, I'm like, I don't know, it must be the 1105 00:58:33,800 --> 00:58:36,880 Speaker 1: lights or the mob. But like, what do you, what 1106 00:58:37,000 --> 00:58:39,400 Speaker 1: I mean? Because I think, again, there is this feeling 1107 00:58:39,440 --> 00:58:42,080 Speaker 1: of like it feels so personal, that too, when you 1108 00:58:42,080 --> 00:58:44,120 Speaker 1: talk to other parents about screen time. Like, hey, what 1109 00:58:44,120 --> 00:58:45,280 Speaker 1: the fuck do you want me to do, man? Like, 1110 00:58:45,320 --> 00:58:47,080 Speaker 1: it's, I got a lot going on. This is, this, 1111 00:58:47,080 --> 00:58:50,040 Speaker 1: this works. And I get that there is this internal 1112 00:58:50,080 --> 00:58:53,520 Speaker 1: sense of like responsibility, but then feeling helpless, because there 1113 00:58:53,600 --> 00:58:54,920 Speaker 1: is like a "what am I going to do" thing. 1114 00:58:55,320 --> 00:58:56,760 Speaker 1: What do you say to people who are sort of 1115 00:58:56,800 --> 00:58:59,680 Speaker 1: like in that mental space, and like how to sort 1116 00:58:59,720 --> 00:59:01,840 Speaker 1: of emerge from that, or at least to begin to 1117 00:59:01,840 --> 00:59:04,640 Speaker 1: look at the situation with a little more, like, context? 1118 00:59:04,960 --> 00:59:06,800 Speaker 1: So I'll start with you, Miles, since you have a 1119 00:59:06,920 --> 00:59:10,040 Speaker 1: very small baby, so I think for you it's easy, 1120 00:59:10,160 --> 00:59:14,920 Speaker 1: because you can just decide not to give your baby screens. 1121 00:59:15,280 --> 00:59:19,080 Speaker 1: The studies are in: the smaller the child or the baby 1122 00:59:19,200 --> 00:59:22,560 Speaker 1: is, the worse it is, and you have control. So, 1123 00:59:22,840 --> 00:59:24,960 Speaker 1: and I think people have a lot of control all 1124 00:59:25,000 --> 00:59:29,479 Speaker 1: the way through elementary school. So I think 1125 00:59:29,720 --> 00:59:34,440 Speaker 1: parents can really limit kids' screen time. The issue becomes 1126 00:59:34,440 --> 00:59:37,120 Speaker 1: when they get to middle school, because social life is 1127 00:59:37,160 --> 00:59:39,440 Speaker 1: in social networks and you can't really isolate 1128 00:59:39,480 --> 00:59:42,480 Speaker 1: your child, and that's when it becomes a problem. And 1129 00:59:42,560 --> 00:59:45,440 Speaker 1: you can do things in the meantime. You can model. 1130 00:59:45,440 --> 00:59:46,840 Speaker 1: I mean, I try, when I'm home and all my 1131 00:59:46,920 --> 00:59:48,520 Speaker 1: kids are here, so I don't have to worry. I 1132 00:59:48,600 --> 00:59:52,280 Speaker 1: try to put the phones somewhere away from me, and 1133 00:59:52,640 --> 00:59:55,200 Speaker 1: so you can do things, and you can do small 1134 00:59:55,320 --> 00:59:58,440 Speaker 1: things for yourself. You know, again, when I work, because 1135 00:59:58,520 --> 01:00:01,840 Speaker 1: I am, as I mentioned, I'm as addicted as all 1136 01:00:01,880 --> 01:00:04,840 Speaker 1: of us, despite everything I know.
I always put my 1137 01:00:05,560 --> 01:00:08,520 Speaker 1: phone on a twenty-minute timer, and I write for 1138 01:00:08,560 --> 01:00:11,440 Speaker 1: twenty minutes, and then I check my emails, and I 1139 01:00:11,560 --> 01:00:15,240 Speaker 1: do this again. There are things you can do. And 1140 01:00:15,520 --> 01:00:19,160 Speaker 1: you can also, kids remember the pandemic. They remember how 1141 01:00:19,200 --> 01:00:21,480 Speaker 1: they felt, so you can talk to them, remind them 1142 01:00:21,560 --> 01:00:24,080 Speaker 1: how they felt horrible at that time and how they 1143 01:00:24,240 --> 01:00:28,120 Speaker 1: felt much better when they saw people. So you can 1144 01:00:28,160 --> 01:00:31,560 Speaker 1: make a difference. You cannot force an older kid not 1145 01:00:31,680 --> 01:00:33,920 Speaker 1: to do that. It's not going to work. I mean, 1146 01:00:33,960 --> 01:00:37,080 Speaker 1: and also, they're smarter than us with technology, they'll always 1147 01:00:37,080 --> 01:00:41,439 Speaker 1: beat us. It's not going to happen. So I think 1148 01:00:41,440 --> 01:00:46,000 Speaker 1: it's a combination of doing what you can while 1149 01:00:46,040 --> 01:00:49,600 Speaker 1: also realizing the broader situation too, and not blaming yourself. 1150 01:00:49,640 --> 01:00:52,720 Speaker 1: That's the most important thing, because that, that is the 1151 01:00:52,800 --> 01:00:55,640 Speaker 1: problem now, that people are sitting there and thinking it's 1152 01:00:55,680 --> 01:00:59,120 Speaker 1: all their fault. Right. Yeah, we're like in the plastic 1153 01:00:59,200 --> 01:01:02,760 Speaker 1: straws debate where we're like, actually, what about, no, it 1154 01:01:02,800 --> 01:01:05,520 Speaker 1: can't be down to my level. What about the companies 1155 01:01:05,560 --> 01:01:08,040 Speaker 1: that are actually the ones that are steering all of this? 1156 01:01:08,200 --> 01:01:10,320 Speaker 1: And I think that is an important thing to sort 1157 01:01:10,320 --> 01:01:13,640 Speaker 1: of recenter, like, in the, in the conversation. I have 1158 01:01:14,000 --> 01:01:15,480 Speaker 1: a couple of comments to make on that. I do 1159 01:01:15,600 --> 01:01:18,400 Speaker 1: think that, like, watching kids, because I tutor a lot 1160 01:01:18,400 --> 01:01:22,000 Speaker 1: of kids and, like, watching them, I feel like, overall, 1161 01:01:22,080 --> 01:01:25,440 Speaker 1: with technology, just like with life, like, you, you really 1162 01:01:25,520 --> 01:01:29,280 Speaker 1: do have to raise them the way that you think 1163 01:01:29,320 --> 01:01:31,920 Speaker 1: you should raise them, and then try to be as 1164 01:01:31,960 --> 01:01:35,040 Speaker 1: involved as you can without being overbearing, and allow them 1165 01:01:35,120 --> 01:01:38,040 Speaker 1: to like make their mistakes, and then you kind of 1166 01:01:38,120 --> 01:01:41,400 Speaker 1: have to hope that like those values that you passed 1167 01:01:41,440 --> 01:01:44,080 Speaker 1: on to them guide their use of technology as well, 1168 01:01:44,120 --> 01:01:45,960 Speaker 1: and that they come to you when they're scared or 1169 01:01:46,000 --> 01:01:48,680 Speaker 1: they like need help with something. It seems like, to me, 1170 01:01:48,760 --> 01:01:50,480 Speaker 1: I'm a, I'm a parent to dogs, so I don't 1171 01:01:50,520 --> 01:01:52,440 Speaker 1: have to worry about this. But we did watch the 1172 01:01:52,440 --> 01:01:55,800 Speaker 1: Puppy Bowl and they were addicted.
But also the other 1173 01:01:55,840 --> 01:01:58,200 Speaker 1: thing is, you're talking about how you don't have to 1174 01:01:58,200 --> 01:02:01,720 Speaker 1: worry about it with smaller children, but FaceTime is how 1175 01:02:01,840 --> 01:02:04,120 Speaker 1: I stay connected with my nephew, and I know plenty 1176 01:02:04,120 --> 01:02:09,160 Speaker 1: of people that purposefully, like, FaceTime family members' 1177 01:02:09,400 --> 01:02:12,120 Speaker 1: infant children, just so that they hear the voice, they 1178 01:02:12,160 --> 01:02:15,800 Speaker 1: see the face. Then they start associating that face with 1179 01:02:15,840 --> 01:02:19,000 Speaker 1: the screen, you know, the good feelings with the screen. 1180 01:02:19,200 --> 01:02:21,080 Speaker 1: But that's the only way, like, I can keep in 1181 01:02:21,120 --> 01:02:23,920 Speaker 1: touch with him, because it's such a long distance, 1182 01:02:23,960 --> 01:02:26,200 Speaker 1: and I know that's slightly different than games and stuff, 1183 01:02:26,240 --> 01:02:28,360 Speaker 1: but I also worry about that, you know. 1184 01:02:29,040 --> 01:02:31,280 Speaker 1: I think it's a great example, because it's important to 1185 01:02:31,520 --> 01:02:35,400 Speaker 1: also remember that not all screen time is alike. I think, 1186 01:02:35,560 --> 01:02:40,000 Speaker 1: you know, connecting with people over FaceTime is a great thing, 1187 01:02:40,200 --> 01:02:43,760 Speaker 1: especially with relatives who live far away. You know, being able to 1188 01:02:43,800 --> 01:02:47,840 Speaker 1: read The New York Times or any news is different. The 1189 01:02:47,880 --> 01:02:52,360 Speaker 1: problem becomes when you're selling things as educational games and 1190 01:02:52,480 --> 01:02:55,600 Speaker 1: people are playing them and they get all these dopamine 1191 01:02:55,600 --> 01:02:59,480 Speaker 1: bursts from playing, or social networks, where you get the dopamine 1192 01:02:59,480 --> 01:03:02,160 Speaker 1: burst from the comments and the likes. So there's a 1193 01:03:02,200 --> 01:03:06,480 Speaker 1: big difference between games and social networks, or YouTube, 1194 01:03:06,520 --> 01:03:10,160 Speaker 1: where, you know, one short video ends and the 1195 01:03:10,200 --> 01:03:15,840 Speaker 1: next one starts, and talking to Grandma on FaceTime. Yeah, okay, 1196 01:03:15,920 --> 01:03:19,360 Speaker 1: that makes me feel better, thank you.
Yeah, I mean, 1197 01:03:19,400 --> 01:03:21,480 Speaker 1: I mean, these are all things, like, like 1198 01:03:21,520 --> 01:03:23,880 Speaker 1: you said, like, Pallavi, like you have concerns. Like, 1199 01:03:24,000 --> 01:03:26,440 Speaker 1: I was, like, talking to my dad on FaceTime, who 1200 01:03:26,520 --> 01:03:28,360 Speaker 1: was, like, not able to see my son, and 1201 01:03:28,400 --> 01:03:30,960 Speaker 1: I mean, he's not really able, like, 1202 01:03:31,000 --> 01:03:33,000 Speaker 1: he can't... my son can't really see or, like, 1203 01:03:33,040 --> 01:03:36,360 Speaker 1: associate all this yet, but to your point, like, how 1204 01:03:37,320 --> 01:03:40,920 Speaker 1: the beginnings of your relationship to the screen begin. And, 1205 01:03:40,920 --> 01:03:43,640 Speaker 1: and, Gaia, you also mentioned this too, like, 1206 01:03:43,760 --> 01:03:46,080 Speaker 1: not using the phone in front of the kids too, 1207 01:03:46,120 --> 01:03:48,160 Speaker 1: because I've seen my other friends do that, where they're 1208 01:03:48,160 --> 01:03:50,160 Speaker 1: trying to say, like, I don't want my kid to 1209 01:03:50,240 --> 01:03:52,760 Speaker 1: think that when you're not doing something, you look at 1210 01:03:52,800 --> 01:03:55,880 Speaker 1: your phone, or that that's what is normal. Like, 1211 01:03:55,920 --> 01:03:58,040 Speaker 1: you can be active, or you can do other things, 1212 01:03:58,120 --> 01:04:00,240 Speaker 1: or you can, like, read a physical book. But 1213 01:04:00,320 --> 01:04:02,000 Speaker 1: I've heard people say, like, I don't want my kid's 1214 01:04:02,080 --> 01:04:05,080 Speaker 1: early memories to be of, like, me looking 1215 01:04:05,120 --> 01:04:08,800 Speaker 1: down at this, like, glowing rectangle, and baby, you see 1216 01:04:08,840 --> 01:04:11,960 Speaker 1: that as just, like, sort of the most normal thing. Obviously, 1217 01:04:12,160 --> 01:04:13,640 Speaker 1: you know, we have to use our phones. But I 1218 01:04:13,680 --> 01:04:16,680 Speaker 1: get that, like, it's all very subtle, 1219 01:04:16,800 --> 01:04:19,480 Speaker 1: how kids begin to, like, see what's normal or 1220 01:04:19,520 --> 01:04:21,600 Speaker 1: not normal. Yeah, and I think, you know, that. Yeah, 1221 01:04:21,600 --> 01:04:24,160 Speaker 1: the studies show that parents who are heavy users, their 1222 01:04:24,280 --> 01:04:27,080 Speaker 1: kids are also heavy users of the phone. On the other hand, 1223 01:04:27,120 --> 01:04:29,720 Speaker 1: it's hard. You know, you're there, but you're using a 1224 01:04:29,760 --> 01:04:32,520 Speaker 1: phone because you're texting your babysitter, and you can't find 1225 01:04:32,520 --> 01:04:36,040 Speaker 1: a babysitter, and you're texting another. So it's, it's not, 1226 01:04:36,320 --> 01:04:39,520 Speaker 1: it's not... there's no perfect solution. We're not living in 1227 01:04:39,520 --> 01:04:42,240 Speaker 1: a perfect world for this, so we can just try 1228 01:04:42,240 --> 01:04:47,560 Speaker 1: our best, but there's no easy way out 1229 01:04:47,720 --> 01:04:51,000 Speaker 1: right now. Right, I think that sums up so much 1230 01:04:51,040 --> 01:04:55,840 Speaker 1: of what we're experiencing in this present moment. Can I 1231 01:04:55,880 --> 01:04:59,280 Speaker 1: just say something that might help this story come full circle? Yes?
Um, 1232 01:04:59,480 --> 01:05:03,040 Speaker 1: I just read an article that said Pablo Xavier, a thirty 1233 01:05:03,040 --> 01:05:05,520 Speaker 1: one year old construction worker from the Chicago area, said 1234 01:05:05,560 --> 01:05:07,240 Speaker 1: he was tripping on shrooms last week when he came 1235 01:05:07,320 --> 01:05:11,160 Speaker 1: up with the idea for Pope Francis's puffy jacket image. 1236 01:05:12,520 --> 01:05:16,240 Speaker 1: So he was out in the world, on drugs, yes, 1237 01:05:16,440 --> 01:05:20,280 Speaker 1: experiencing community and nature and stuff, when he came up 1238 01:05:20,320 --> 01:05:24,640 Speaker 1: with it. The things that'll do to your brain. You know, 1239 01:05:24,840 --> 01:05:28,200 Speaker 1: just, yeah, small talk with people. But again, I think, yeah, it's 1240 01:05:28,320 --> 01:05:31,480 Speaker 1: important that we understand that it's, like, this very 1241 01:05:31,520 --> 01:05:34,320 Speaker 1: complex issue, where it's such a double-edged sword, where 1242 01:05:34,360 --> 01:05:36,720 Speaker 1: it's given us things like being able to connect with 1243 01:05:36,760 --> 01:05:39,080 Speaker 1: people when we need to, and in ways that are 1244 01:05:39,280 --> 01:05:41,880 Speaker 1: much better than just, like, talking over the phone or 1245 01:05:41,920 --> 01:05:44,840 Speaker 1: writing something. But at the same time, there's also this, 1246 01:05:44,880 --> 01:05:49,000 Speaker 1: like, commodified, monetized, you know, use of technology that is 1247 01:05:49,200 --> 01:05:53,160 Speaker 1: purely built on extracting as much eyeball time from you 1248 01:05:53,200 --> 01:05:55,960 Speaker 1: as possible, and, you know, not reckoning with that is 1249 01:05:56,400 --> 01:05:58,840 Speaker 1: creating a bit of a slippery slope. But that's why 1250 01:05:58,920 --> 01:06:01,120 Speaker 1: I thank you so much, Professor Gaia Bernstein, for 1251 01:06:01,160 --> 01:06:05,160 Speaker 1: stopping by The Daily Zeitgeist. Where can people find you 1252 01:06:05,240 --> 01:06:07,760 Speaker 1: and your work and read more from you if they 1253 01:06:07,800 --> 01:06:11,479 Speaker 1: would like to? So my website is gaiabernstein 1254 01:06:11,560 --> 01:06:15,760 Speaker 1: dot com, and my book is available on Amazon, Barnes 1255 01:06:15,760 --> 01:06:19,240 Speaker 1: and Noble, every place you would normally purchase 1256 01:06:19,360 --> 01:06:23,160 Speaker 1: your book. And thank you so much for having me. 1257 01:06:23,320 --> 01:06:25,840 Speaker 1: Oh no, no, no, you classed up the joint with 1258 01:06:25,920 --> 01:06:29,320 Speaker 1: your expertise. And is there any work of social media 1259 01:06:29,400 --> 01:06:31,040 Speaker 1: or anything that you would like to point to that 1260 01:06:31,080 --> 01:06:33,040 Speaker 1: you were enjoying? If not, it's all... I get it, 1261 01:06:33,240 --> 01:06:35,560 Speaker 1: "enjoying" is maybe not the right word. 1262 01:06:35,640 --> 01:06:39,520 Speaker 1: But I've been looking at Israel, and there's been a lot 1263 01:06:39,560 --> 01:06:42,600 Speaker 1: going on, the protests, and my best friend sent me 1264 01:06:42,880 --> 01:06:47,280 Speaker 1: a video of her and thousands of women dressed in 1265 01:06:47,320 --> 01:06:55,120 Speaker 1: the Handmaid's Tale costume, because there's a big constitutional reform 1266 01:06:55,160 --> 01:07:00,600 Speaker 1: attempt which would also undermine the rights of women and LGBTQ people.
1267 01:07:01,240 --> 01:07:05,000 Speaker 1: So I look at this video a lot, and I'm 1268 01:07:05,000 --> 01:07:07,040 Speaker 1: happy to be able to see this video even though 1269 01:07:07,040 --> 01:07:09,680 Speaker 1: I was not there, and I still wish I was 1270 01:07:09,760 --> 01:07:12,800 Speaker 1: there with everybody. Yeah, it shows you what a 1271 01:07:12,840 --> 01:07:16,240 Speaker 1: general strike can do too, because I believe Netanyahu said 1272 01:07:16,240 --> 01:07:19,360 Speaker 1: he's going to delay that plan as a result of that. Hey, 1273 01:07:19,440 --> 01:07:25,760 Speaker 1: collectivism works, you know, general strikes. Try them out, America. And Pallavi, 1274 01:07:25,840 --> 01:07:28,640 Speaker 1: thank you so much for joining me today. Where can 1275 01:07:28,680 --> 01:07:31,280 Speaker 1: people find you and follow you, see all that? 1276 01:07:31,360 --> 01:07:33,840 Speaker 1: And what's a tweet that you like? I'm at Pallavi 1277 01:07:33,920 --> 01:07:37,440 Speaker 1: Gunalan everywhere. Good luck spelling that. I'm tired of spelling 1278 01:07:37,440 --> 01:07:39,520 Speaker 1: it out. I've been doing a lot of stand-up lately, 1279 01:07:39,760 --> 01:07:42,360 Speaker 1: so come see me perform, because I don't want to, 1280 01:07:42,400 --> 01:07:44,840 Speaker 1: as a content creator, have to post to Instagram every 1281 01:07:44,880 --> 01:07:46,960 Speaker 1: day so that a reel will pop off so I can 1282 01:07:47,040 --> 01:07:49,680 Speaker 1: just be on stage surrounded by people. I don't like 1283 01:07:49,760 --> 01:07:52,640 Speaker 1: that either, that I get punished for not posting everything 1284 01:07:52,720 --> 01:07:56,120 Speaker 1: all the time. I am, I'm so tired of it. 1285 01:07:57,320 --> 01:08:00,920 Speaker 1: My favorite tweet... have you seen Banshees of Inisherin? No, 1286 01:08:01,120 --> 01:08:03,360 Speaker 1: it's on my list, though. Oh my god, it is 1287 01:08:03,440 --> 01:08:06,959 Speaker 1: so good. Yeah. But it's about a friendship falling out. 1288 01:08:07,360 --> 01:08:10,080 Speaker 1: And there was an image of the two main characters, 1289 01:08:10,320 --> 01:08:13,240 Speaker 1: you know, from that movie, and then someone tweeted, 1290 01:08:13,680 --> 01:08:15,640 Speaker 1: I just don't want to do laundry and taxes with 1291 01:08:15,720 --> 01:08:20,080 Speaker 1: you no more. It's, like, Waymond breaking up his friendship. 1292 01:08:20,320 --> 01:08:23,280 Speaker 1: And then someone else quote-tweeted that with nothing nowhere, 1293 01:08:23,320 --> 01:08:26,960 Speaker 1: never happening again, and I thought that was amazing. And 1294 01:08:27,040 --> 01:08:31,000 Speaker 1: that was at Senna Cianna Dublin and at Runs with Scissors. 1295 01:08:31,760 --> 01:08:36,120 Speaker 1: All right. You can find me at Miles of Gray 1296 01:08:36,200 --> 01:08:40,200 Speaker 1: on Twitter and Instagram. I've been... I don't know anything. 1297 01:08:40,200 --> 01:08:42,280 Speaker 1: What have I liked? I haven't, I'm not gonna, I 1298 01:08:42,320 --> 01:08:44,599 Speaker 1: haven't looked. Oh, I like all the messages people 1299 01:08:44,640 --> 01:08:47,760 Speaker 1: have sent me on Instagram welcoming me back. That has been fantastic.
1300 01:08:48,160 --> 01:08:50,479 Speaker 1: I really appreciate all the listeners reaching out with your 1301 01:08:50,520 --> 01:08:53,880 Speaker 1: kind words and reminding me that I will know how 1302 01:08:53,920 --> 01:08:57,760 Speaker 1: to say footnotes when the time does come. You 1303 01:08:57,760 --> 01:09:00,280 Speaker 1: can also find Jack and I on our basketball podcast, 1304 01:09:00,320 --> 01:09:02,760 Speaker 1: Miles and Jack Got Mad Boosties. Also, I'm on Four 1305 01:09:02,840 --> 01:09:05,920 Speaker 1: Twenty Day Fiance, we'll be coming back soon, where Sofiya 1306 01:09:05,960 --> 01:09:08,400 Speaker 1: Alexandra and I talk about our favorite reality show, Ninety 1307 01:09:08,479 --> 01:09:12,000 Speaker 1: Day Fiance. You can find us at Daily 1308 01:09:12,080 --> 01:09:14,000 Speaker 1: Zeitgeist on Twitter, at The Daily Zeitgeist on Instagram, 1309 01:09:14,160 --> 01:09:16,360 Speaker 1: we've got a Facebook fan page and a website, Daily Zeitgeist 1310 01:09:16,400 --> 01:09:19,439 Speaker 1: dot com, where we post episodes and our footnotes... probably one 1311 01:09:19,439 --> 01:09:24,160 Speaker 1: more time... footnotes, thank you, where you can find links to 1312 01:09:24,160 --> 01:09:26,519 Speaker 1: all the articles that we talked about, as well as 1313 01:09:26,560 --> 01:09:29,160 Speaker 1: the song we are going to ride out on. What 1314 01:09:29,360 --> 01:09:32,479 Speaker 1: song is that? Oh well, thank you for asking. Today 1315 01:09:32,520 --> 01:09:35,519 Speaker 1: we are going to go out on this track called 1316 01:09:35,520 --> 01:09:39,560 Speaker 1: Grateful by El Michels Affair with Black Thought rapping over it. 1317 01:09:39,640 --> 01:09:41,680 Speaker 1: El Michels Affair is a great band. I love all 1318 01:09:41,680 --> 01:09:44,200 Speaker 1: their instrumental music. Black Thought is the GOAT, you 1319 01:09:44,240 --> 01:09:47,519 Speaker 1: know, when he starts rhyming. This track is really dope, 1320 01:09:48,000 --> 01:09:50,160 Speaker 1: it's heavy. If you like Black Thought, you're gonna like this. 1321 01:09:50,200 --> 01:09:52,160 Speaker 1: If you like hip hop, you've got to check this out. 1322 01:09:52,200 --> 01:09:53,559 Speaker 1: And even if you don't, it's a good track. It's 1323 01:09:53,600 --> 01:09:57,200 Speaker 1: called Grateful, which we are, for sure. El Michels Affair 1324 01:09:57,280 --> 01:09:59,839 Speaker 1: and Black Thought. That's gonna do it for us today. 1325 01:10:00,000 --> 01:10:02,240 Speaker 1: We're gonna be back later to tell you what's trending, 1326 01:10:02,280 --> 01:10:05,400 Speaker 1: and then tomorrow with a whole new episode. So until then, 1327 01:10:05,600 --> 01:10:07,640 Speaker 1: just, you know. This is a production of iHeartRadio. So 1328 01:10:07,720 --> 01:10:09,840 Speaker 1: for more podcasts, check out the iHeartRadio app, Apple 1329 01:10:09,880 --> 01:10:11,600 Speaker 1: Podcasts, or wherever you get them. All right, talk to you 1330 01:10:11,640 --> 01:10:14,360 Speaker 1: then, bye.