Speaker 1: Welcome to the Therapy for Black Girls Podcast, a weekly conversation about mental health, personal development, and all the small decisions we can make to become the best possible versions of ourselves. I'm your host, Dr. Joy Harden Bradford, a licensed psychologist in Atlanta, Georgia. For more information or to find a therapist in your area, visit our website at therapyforblackgirls.com. While I hope you love listening to and learning from the podcast, it is not meant to be a substitute for a relationship with a licensed mental health professional.

Hey y'all, happy New Year, and thank you so much for joining me for Session 445 of the Therapy for Black Girls Podcast, the first installment of our 2026 January Jumpstart series. We'll get right into our conversation after a word from our sponsors.

Whether you're chronically online or only log in to post a quarterly dump, it's important that you know where and how the data you're sharing on the internet is being used. Oftentimes we accept, we say yes, and we check a box on the screen just to get to what we're looking for. But exercising choice in those moments is a crucial way to protect your identity and take control of your digital footprint. Here to talk with us today is Camille Stewart Gloster, an attorney and strategist working at the intersection of technology, cybersecurity, national security, and foreign policy. Camille has advised top leaders in both government and policy, and major companies like Google, on cybersecurity practices, and I'm excited to have her with us today to talk about how we can begin to protect ourselves from the risks that come with existing in digital spaces. If something resonates with you while enjoying our conversation, please share it with us on social media using the hashtag #TBGInSession, or join us over in our Patreon to talk more about the episode. You can join us at community.therapyforblackgirls.com. Here's our conversation.
Speaker 1: Thank you so much for joining us today, Camille.

Speaker 2: I am so excited to be here. Thanks for having me.

Speaker 1: So we're very excited to chat with you about a topic that I think is on a lot of people's minds in terms of our digital footprint. The January Jumpstart series is really all about, like, a metamorphosis and becoming the next version of yourself. And as someone who has been online for some time, how do you think about your own digital presence? And do you feel like the story that it's telling is the story that you still want to tell?

Speaker 2: Yes, I'm very thoughtful about how I show up online. Particularly once I became a parent, it changed my entire outlook. So to begin with, I've always been very intentional. I've worked in cybersecurity my entire career, and often cybersecurity professionals are not very public facing. But I want to change the face of cybersecurity. I want to make these issues dinner table conversation. Quite frankly, I feel like we alienate people by having cybersecurity conversations at this super technical or theoretical level that people feel alienated from. And so I've made it kind of my mission to empower people in and through technology, whether that's getting folks into cybersecurity or tech in general, or just having them be a bit more intentional about how they navigate the space. And so I try to lead by example. I'm thoughtful about whether something is public or private and what I say, and particularly as things change with AI, folks need to be really thoughtful about how they show up.

Speaker 1: So what was it about becoming a parent that you felt made you rethink how you showed up online?

Speaker 2: We're lucky, because when we came online, there was kind of a moment of realization. I was in college, late college, when Facebook came out, and you knew that people might see what you posted, and you had some air of privacy about you.
And I've watched subsequent generations, particularly ones that are on technology from the day they're born, really be exposed, whether by choice or by force by their parents, and the consequences of that. Having been at a social media company, I've looked at it through this big tech lens, whether I was at Google or even thinking about it from a public policy perspective, from a national security perspective, and then just that individual security perspective. And one of the things I appreciated about my journey navigating technology is I got to make choices. Even if there were mistakes, they were my mistakes to have made, and I was able to make them at a point where I was informed enough and able to do the learning to decide how I wanted to show up. And we've taken that decision away from our children. In large part, we are putting them online before they even understand what that means, and many of them wake up fifteen years later, twenty years later, and don't love the kind of exposure that they've had. So I want to gift that to my child. But also, there are some real security and privacy concerns inherent in making your child very public in the ways that are happening right now. It's no longer, you know, just sharing on Facebook with just you and your family friends, or you and your cousins. It is wide reaching, and with AI, putting an emoji over their face doesn't even help. So I want to be intentional about giving her the safety and security she deserves, but also gifting her the ability to choose how she shows up.

Speaker 1: Yeah. So you mentioned that you've been in lots of very cool places, some that require very tight levels of security. You've been in the highest ones of government; you mentioned that you worked with Google. How have you been intentional about cultivating your digital presence, and what have you learned from working in those kinds of spaces that you think is also important for, like, the general community to know?
Speaker 2: Be intentional, be thoughtful, take the time to think about what you want to say about yourself and who you are. I often hear people talk about it as, like, your personal brand. That's fine: your personal brand, your professional brand, however you want to think about it. What are you saying with every account that you set up? Does it tell a cohesive story about who you are in the appropriate context? We're moving to a world where we're all on AI browsers really soon, and I think a lot of people are really excited by that, and it's a really cool invention. But think about what that means for how people search the internet. You're no longer putting in a search and getting a list of results. You will get a curated narrative about yourself. So as you think about how you show up online, that should be the frame through which you think about it. If someone else were curating who you are, if they were to pull from all the available sources and tell a story about you, what story is that? And so as I engage with every platform and choose whether it is a personal engagement, so just for a closed network of people, or a professional one, I make sure that it aligns with me, my values, the things that I want to say publicly, and ensure that I'm willing to stand behind anything that I post.

Speaker 1: Yeah, so a very good reminder for folks to kind of think twice about what you're sharing.

Speaker 2: Right, it never goes away.

Speaker 1: Yeah, yeah. And I definitely want to hear more about the AI browsers. So we've started some conversations here on the podcast around AI and what to be on the lookout for, how it impacts our communities. But it feels like every time we have a conversation, like, there's thousands of new things that we didn't even know since the last conversation. So this is the first time I'm hearing AI browsers, so I definitely want to get into that with you.
But I also want to talk about, you know, like, I think that there is far more online about each of us than many of us even recognize. So can you talk about, like, what the general average person's digital footprint actually looks like?

Speaker 2: That's a great question. So your digital footprint is likely all of your social media, and it is a culmination of data you've inputted into a bunch of random sites. I don't know if you've ever Googled your name, but you should if you have not. And what you'll likely see is maybe a website if you have one, some posts from college or from high school where they're talking about an accomplishment or achievement, anywhere you've ever been featured on the internet or someone else mentioned you. Some of that might be curated, and some of that might be an ancillary mention, but it creates kind of a story about you. But there's also a bunch of data, whether from data leaks or from sites with for-profit or malicious intent, that is kind of aggregating your data. So you've probably seen those, like, online directories that pull together your address and your email and your phone number and then try to connect you to your cousin and your mom. And those are directories that we all probably hate. You're like, oh man, how did they get my address? And that is probably from some random something or from some data leak. There are tools that can help you clean that up. Those things are unfortunately unavoidable; you'll find them about every person. But there are tools like DeleteMe and Kanary, with a K, that can help you scrape the internet for email, social security number, phone number, any sensitive data about you that you do not want to appear on these websites, and do the work to help you get them pulled down.
And that's something I recommend for everyone in a world where people get canceled. When people's attention is turned on you, it could mean getting doxxed, which means people flooding you, or swatted, having a SWAT team show up at your house. What you want to be able to do is protect your physical space as much as you're protecting your digital space. And so use sites like that to kind of clean up your footprint, the things that go beyond your ability to control, like, you didn't put them up there. You should retake that control and get that content down. So that's what your digital footprint looks like: the stuff that you intentionally put up there, like social media, your wedding website, all the random things, and then a bevy of aggregated data from random sites and random breaches that tell a story about you, sometimes correct and sometimes incorrect.

Speaker 1: More from our conversation after the break.

Speaker 1: So, Camille, you know, I know occasionally I will get an email or, like, an actual physical letter from some company, maybe like an insurance company or something, and, like, they actually inform you, like, hey, there was this breach, and you get maybe three months of, like, data protection or something. But it doesn't feel like every company does that, right? Like, are there instances where your data may have been leaked and you didn't even know about it?

Speaker 2: Oh, yes, because how they choose to notify you depends on some legal requirements that are not consistently applied. They're very different by state, by country, all of those things. But also, a lot of it is kind of voluntary. Responsible actors, companies that really want to do right by their users, will make sure that they understand and will provide some kind of resources to help scrape that data, but many don't. So you'll hear about it on a news report, or you won't hear about it at all, depending on scale and scope.
So that's why you have to be on the lookout for irregular activity on your bank accounts, your name popping up. I actually recommend that everyone set up a Google Alert, or there's a site called Talkwalker, where you can set up an alert about yourself, just to kind of see what's flooding the internet with your name, maybe your business name, your address, anything like that, so you can see as those things start to pop up online and do something to take it down.
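A concrete way to act on that advice from a script: Google Alerts can deliver new matches to an RSS feed instead of your inbox, which also fits the "pull, not push" habit Camille describes later in the conversation. Below is a minimal Python sketch of polling such a feed, assuming you have already created the alert at google.com/alerts and chosen RSS delivery; the feed URL is a placeholder, and the feedparser library is this example's assumption, not something named in the episode.

```python
# Minimal sketch: poll a Google Alerts RSS feed for new mentions of your name.
# FEED_URL is a placeholder; Google generates the real one when you create an
# alert and pick "RSS feed" as the delivery method. Requires: pip install feedparser
import feedparser

FEED_URL = "https://www.google.com/alerts/feeds/EXAMPLE/EXAMPLE"  # placeholder

def check_alerts(feed_url: str) -> None:
    feed = feedparser.parse(feed_url)
    if not feed.entries:
        print("No new mentions.")
        return
    for entry in feed.entries:
        # Each entry is one newly indexed page that matched your alert term.
        print(f"- {entry.title}\n  {entry.link}")

if __name__ == "__main__":
    check_alerts(FEED_URL)
```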
Speaker 1: Unfortunately, I feel like there have been several cases of, like, high profile, and maybe some not even so high profile, Black women specifically who have gotten doxxed, right, because of maybe something they said online or maybe activist activity. Can you talk about why it's even more important for Black women, beyond privacy concerns, to be mindful of their digital footprint?

Speaker 2: Yeah. We are in a divisive time where our comments are weaponized against us, where DEI, diversity, equity, inclusion (I actually like to say the words, to make people actually claim what they are), is being vilified and denied, and people are using that to push hate. And what often happens is the loudest voices in the room pushing conversations of equity and equality, or the most marginalized groups, often feel the brunt of that. And so what we find is a lot of Black women are standing up and speaking out because they feel the brunt of the pain. We saw six hundred thousand Black women laid off in recent months. We've seen, to your point, a number of attacks on political figures. That kind of targeting is endemic to the kind of natural leadership roles Black women tend to take on, particularly in the pursuit of equity and human rights. It's far reaching: that might be equity in the tech sector, that might be equity in healthcare, it could be anything. But with the amount of visibility each person gets with social media and with the internet, and with all of the new tools at our disposal, a lot of us become targets when we didn't anticipate it. You know, you might think political figures and journalists take that on as part of the job because they're under public scrutiny. But we all have a bit of public scrutiny now, and so we all have the potential to be subject to that. That shouldn't make us silence ourselves, but it should make us be intentional about using our levers of power to protect our presence online and the way people get access to us, whether that's digital or physical.

Speaker 1: And you mentioned services that you could sign up for, like DeleteMe, Kanary. Are there other additional layers that you would use? Because those are, like, after something has happened, right? Are there things that you can do to be preventative in terms of protecting your data online?

Speaker 2: Yes. So those are both proactive and reactive, in the sense that you'll find data that's online before somebody acts on it, hopefully. But some really important things are really small behavior changes that might seem like a little bit of an annoyance to add to your routine, but they're going to make a huge difference. First, turn on multi-factor authentication on everything. I'm sure you've seen notifications about it, or you've been forced by some apps to do it. That is really important, because when someone gets a hold of your password because it was leaked in a data breach, they still can't get into your account, because they need to have the other credential to get in. Skipping it is like saying, I'm not going to put a deadbolt on my house because I have a standard-issue, like, door handle lock that everybody has, like, a universal key to. I mean, technically, I guess it could keep somebody out, but it's not keeping everybody out. So let's arm ourselves. Update your software.
Most of those pushes include some kind of security patch. Use a password manager, and stop reusing old passwords from college. Like, that one password that's the password to everything has got to go, because what that means is, when they get access to your whatever, they have access to so many things, and they're going to test every site that you use to see if that password can be used there as well.
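That replay trick, often called credential stuffing, is something you can check yourself against. Here is a minimal sketch using the public Have I Been Pwned "Pwned Passwords" range API, which works on k-anonymity: only the first five characters of the password's SHA-1 hash ever leave your machine, never the password itself. The example password is made up; if you try this, run it locally, and note that most password managers offer the same breach check built in.

```python
# Minimal sketch: check whether a password appears in known breach corpora via
# the Have I Been Pwned "Pwned Passwords" range API (k-anonymity: only the
# first 5 hex chars of the SHA-1 hash are sent). Requires: pip install requests
import hashlib

import requests

def times_breached(password: str) -> int:
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = digest[:5], digest[5:]
    resp = requests.get(f"https://api.pwnedpasswords.com/range/{prefix}", timeout=10)
    resp.raise_for_status()
    # The response lists every breached hash sharing our prefix as "SUFFIX:COUNT".
    for line in resp.text.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)
    return 0

if __name__ == "__main__":
    n = times_breached("password123")  # example only; never paste real passwords around
    print(f"Seen in {n} breaches" if n else "Not found in known breaches")
```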
The other thing you shouldn't do is authenticate into something with something else. So you'll often see "use your Google," "use your Facebook" to get into this other site. I don't recommend that. It's so easy, but when you do that, you make this kind of connection that we're talking about, particularly if it's done without connecting it to multi-factor authentication. So they get into your Google account, and they've gotten into all of these other sites, and potentially can even lock you out of your Gmail account. Keep your work and personal accounts separate, and consider using one of the privacy-focused browsers like DuckDuckGo, and block the trackers. I know the cookie requests are a little annoying when you get them, but deny them. The less access you give for them to be tracking you when you're off their site, the better. You are going to expose information by virtue of using a site; you know you're exchanging access for some kind of service. But you don't have to give them more than what they need.

And the last major one I'll talk about, because we all use our phones so much, is to be thoughtful about the permissions you give each app, and review your apps routinely. So, for example, in the middle of the pandemic, there was this app everybody was using that kind of let you play games together and almost have, like, a video chat, and it asked for your contacts. It asked for access to, like, your flashlight, all kinds of random things. Why do you need my contacts for me to initiate a call with a group of friends? You don't, really, in most cases. So I denied that, and then I denied access to my flashlight, because I don't understand why you need that. And I was thoughtful about each thing that I declined, because the app is going to work if you only accept the things you need, which was camera and audio, right? That's all I needed to do a video chat. But what we started to see is your contacts are like a roadmap through your history. Your old boyfriend or girlfriend is in there, your cousins are in there, your parents are in there. You think that's not a lot of data about you, but that is a huge amount of data, particularly when you think about that next to all those sites that are trying to make connections between you and all the people that you know and put all of your data out there. You're giving them so much access to yourself. So just be thoughtful about what you say yes and no to, and try to, like, allow access only while using the app, or allow once, for some of these apps. Small steps can really make a big difference in how much access an app or a potential malicious actor has to you in the future.

Speaker 1: You know, Camille, even as I'm listening to you talk about all these things, I'm starting to feel a little overwhelmed, not in a bad way, but, like, all of these things are so connected, right? And I'm thinking, you know, like, even your newspapers will ask you to connect through, like, your Google account or through your Facebook account, and it's like, surely whoever is in charge of it at these companies knows that that's not the safest way, but that's, like, the path of least resistance, right, to get you to sign up. But there are all these things that you have to do and really be on top of to try to make sure that you're protecting your data, which I think can feel overwhelming, like, how do I stay on top of this?
And so my next question 344 00:19:17,000 --> 00:19:20,359 Speaker 1: is around like the psychological impact of trying to stay 345 00:19:20,400 --> 00:19:23,000 Speaker 1: on top of data and privacy, and even just like 346 00:19:23,040 --> 00:19:25,280 Speaker 1: the digital clutter, like all the pictures we have and 347 00:19:25,320 --> 00:19:27,600 Speaker 1: all the apps we have, can you talk about just 348 00:19:27,600 --> 00:19:29,880 Speaker 1: the impact that has on our mental health and how 349 00:19:29,880 --> 00:19:31,320 Speaker 1: it impacts us psychologically. 350 00:19:31,560 --> 00:19:33,760 Speaker 2: I'm so glad you brought that up because I know 351 00:19:33,840 --> 00:19:37,800 Speaker 2: that people feel overwhelmed by all this conversation about technology, 352 00:19:37,840 --> 00:19:40,360 Speaker 2: and I want people to see them as small steps 353 00:19:40,359 --> 00:19:44,359 Speaker 2: that have big impact. Right. Turning on multi factor authentication 354 00:19:45,520 --> 00:19:47,800 Speaker 2: is going to help even if you have used one 355 00:19:47,960 --> 00:19:50,240 Speaker 2: app to authenticate into a bunch of different things, Like 356 00:19:50,240 --> 00:19:51,600 Speaker 2: even if you have to use your Google or your 357 00:19:51,600 --> 00:19:53,359 Speaker 2: Facebook to get into a bunch of other apps, if 358 00:19:53,359 --> 00:19:56,800 Speaker 2: your Google has multipactor authentication, and then those apps to too, 359 00:19:57,280 --> 00:19:59,960 Speaker 2: you've created a bunch of resilience that you didn't have before. 360 00:20:00,480 --> 00:20:03,439 Speaker 2: So I want the people to think that. But you know, 361 00:20:03,600 --> 00:20:05,240 Speaker 2: in my line of work, I do get to see 362 00:20:05,280 --> 00:20:07,240 Speaker 2: all of the things that happen, and there is a 363 00:20:07,359 --> 00:20:10,800 Speaker 2: huge mental toll not only on practitioners who work in 364 00:20:10,800 --> 00:20:13,919 Speaker 2: the space, but everyone. What I challenge people to do 365 00:20:14,040 --> 00:20:17,560 Speaker 2: is think about this as a version of your physical safety. 366 00:20:18,040 --> 00:20:20,960 Speaker 2: In your physical safety, you trust the police, the fire 367 00:20:21,000 --> 00:20:24,679 Speaker 2: department to secure your neighborhood, to put out fires, to 368 00:20:24,800 --> 00:20:30,040 Speaker 2: do the big heavy lifting, and then you take your 369 00:20:30,280 --> 00:20:32,919 Speaker 2: sphere of influence your home and you put on a 370 00:20:33,000 --> 00:20:35,400 Speaker 2: dead bolt, you turn on an alarm system, you close 371 00:20:35,440 --> 00:20:37,919 Speaker 2: the windows, you close your garage, you take the precautions 372 00:20:37,960 --> 00:20:43,280 Speaker 2: you think are necessary to protect yourself, and those two 373 00:20:43,520 --> 00:20:48,199 Speaker 2: things in concert helped create a more safe ecosystem around you. 374 00:20:48,240 --> 00:20:50,280 Speaker 2: I want people to be intentional about that, and I 375 00:20:50,280 --> 00:20:52,520 Speaker 2: think that helps relead the mental load quite a bit. 
So I want people to think of it that way. But, you know, in my line of work, I do get to see all of the things that happen, and there is a huge mental toll, not only on practitioners who work in the space, but everyone. What I challenge people to do is think about this as a version of your physical safety. In your physical safety, you trust the police, the fire department to secure your neighborhood, to put out fires, to do the big heavy lifting. And then you take your sphere of influence, your home, and you put on a deadbolt, you turn on an alarm system, you close the windows, you close your garage. You take the precautions you think are necessary to protect yourself, and those two things in concert help create a safer ecosystem around you. I want people to be intentional about that, and I think that helps relieve the mental load quite a bit.

But if you are a person who's very vocal online, and you are speaking out against all of the injustices in the world for marginalized communities, and speaking your truth in a world where your truth is often a discrete perspective, you are likely to be thinking about the potential harm that could come from that, whether it is doxxing like we talked about, or getting swatted, or just hateful messages online. There is a lot of harm in even just reading the comments. So I recognize that the space can be a harmful manifestation of the best and worst of what we are as people, and flood you with that best and worst. Right? It used to be that you'd have to be able to come find me physically to say your negative or positive comments, and so the scale of that was negligible, even though it could still be harmful. Now every troll, every random can say something to you about the work that you're doing or about your personhood, and they often do make it very personal. And so one of my pieces of advice for people is to really find opportunities to separate yourself, to be in community in a physical sense, disconnect from your device. Also, with a lot of the information that's flooding us these days, I try to do more of a pull than a push, meaning I don't want notifications to flood me all the time about what's happening in the world or what's happening on my post. I'm going to go with intention to kind of pull that information out of the app: let me find out who's been commenting on my XYZ post, let me see what's going on in the news. And that kind of behavior really does help to preserve your mental health. One of the things we're seeing is this desire for community. You're seeing it with AI chatbots that have become boyfriends, girlfriends, companions, best friends, and that's really worrisome, because their inclination is to reinforce and to promote, like, a positive perspective on anything you say, even your darkest intentions.
And so I also just want to encourage people to remember that technology is just technology. It is not human connection, it is not the source of counseling, it is not your best friend. It's not going to give you real advice. It can be a thought partner, but it is still just a piece of technology that is only as good as the data that's put into it, and only as good as the coding that built it. And so those reminders usually are helpful for folks as they put this in context and hopefully stay grounded in their real lives, so that the mental health aspects can be mitigated.

Speaker 1: So you mentioned AI several times. I echo your concerns, especially as, you know, as a psychologist, around, you know, some of the suicides we've seen linked to AI, unfortunately, and, you know, the ways that people are really using AI to mimic human connection. Can you talk about some of the ways that AI makes this whole conversation around, like, data privacy even more nuanced, and what we should be paying attention to?

Speaker 2: Oh, yes. I mean, as the newest technological underpinning, it is fundamentally changing our society: how people connect, where people connect, how people gather information. So I mentioned earlier AI browsers, and what that is, is your ChatGPT, your Claude, your favorite Gemini being embedded into a browser. And kind of like those AI Overviews you're starting to get now when you do a Google search, that becomes the breadth of what you see when you are searching for something. Think of Perplexity, but times ten, because it's your whole browser. And you'll eventually link back to a website, but what you see first is some curated answer based on how that model reads all of the information about you or about that topic online. That is a real change in how we consume information. It becomes even more important for all of us to be discerners of truth and to understand how to research and to think about information integrity.
Who made this? When did they make this? Why did they make this? Can I validate it anywhere else? Is it just one source saying this, one outlet? And so it'll also mean more work from us. We want to be rooted in truth, rooted in information that is fact based, and that's a hard, dynamic shift. And it means, also on the privacy and security side, some new avenues for people to get access to you, for people to understand who you are, for people to form an opinion about you that might catalyze them to action. So, for example, you're doing all of this work, and if an AI browser, or just an AI model in general, provides a synopsis of your work that's not favorable to someone, and they're unwilling to do the work to figure out exactly your perspective and your point of view, that might make you the subject of their ire, the subject of their bad intentions, and then have them act out in accordance. So there's definitely a heightened need to practice these privacy and security behaviors, and I hope folks do see them as small, because they will provide a lot of far-reaching protection for you. And be thoughtful about how you use AI. AI doesn't have to do everything for you. Don't forget how to read, write, think for yourself. The thing that your job loves the most is your ability to systems think, your deep understanding of your area of expertise, and the societal context, the cultural context. All of those things are things AI can't mimic, so don't lose that in the pursuit of leveraging AI to be more efficient or more effective.

Speaker 1: So earlier, Camille, you talked about, even with your little one, like, even putting an emoji, which I think is a popular way that parents will share cute pictures or cute things, and they add an emoji, thinking, like, that that's actually protecting. But it sounds like that is not even enough, especially with the ebb and flow of AI. Can you say more about that?

Speaker 2: Yeah. And I'm guilty of this too.
At first I was like, oh my god, I could just put emojis on my baby's face? No, no, no. AI has figured out how to scrub that clean, and now people still have the face of your child. And what you've seen in a lot of unfortunate child sexual abuse material is predators leveraging your photos to create really disgusting imagery, or they get off on things that you just wouldn't expect. I really hope we're past the place of people posting their kids in the bathtub, but that is a highly searched category on some of these disgusting child pornography sites. But even without that, even if it's just your child on a playground, there's the potential. And don't get me wrong, I recognize the value of community and sharing your family and all of those things. So I'm not making a judgment call on people who have decided that they do want to put their children on social media. I just actually want people to be intentional about it. What are you posting? What context around your child are you posting? Is that account private? Is it public? How old is that child? There are ways to mitigate the risk even if you still want to engage in sharing your child's wins, successes, evolution with the people you trust and hold dear. I completely support that, but there's a real risk out there, and so I hope parents just think intentionally. Actually, we put out a toolkit, at digitalfluency.tech, for parents and educators, to help them have conversations about technology and AI and to just align their technology use to their values. What you often find in toolkits is, like, a turn this on, turn this off, and this is more of a how do you make a decision that aligns with your family values.
With sleepovers, there are some families that are like, okay, you go to whoever's house, and there are some that are like, you are never going to go on a sleepover, and most people fall somewhere in the middle, whether that falls closer to one side or the other. And they make that choice based on trust, based on their values, based on a whole host of factors. You can do that with your tech use too. Does my child get to dive right in and use everything, or do they use nothing? It's probably not either of the two. It's probably somewhere in the middle, and it'll probably move based on their age. And I think that's the way that we should all be thinking about it. Just a little intentionality would provide a lot of protection for us and our families.

Speaker 1: More from our conversation after the break.

Speaker 1: So, I think when a lot of us think about, like, okay, I want to take some steps to clean up my digital footprint, we often start with social media. And I'd love for you to talk through, like, the distinction between deleting and, like, deactivating yourself on social media sites. Are those the same thing? And what happens to your data, let's say, if you do delete your account?

Speaker 2: So it depends on the site. They usually tell you in their privacy policy if deleting means that they will actually delete the data that they are holding about your profile. Sometimes they do, sometimes they don't. You disabling your account does not mean deleting the data; they're two separate actions, so you've got to do both if that's what your goal is. You know, there's a third category, I would say, and that's just kind of making it really private and being thoughtful about who you let view your account and, like, who you let into your circle. I think that's probably where most people can and should fall. I think deleting your social media account is a great opportunity to leverage your buying power to speak to your values.
So, for example, you'll see a lot of people jump off of Instagram and Facebook because they feel like the way Meta has evolved doesn't align with their values, and I'm sure you'll start to see that in the AI space as well. Now, that can be tough, because if everybody's on Instagram or everybody's on TikTok, it's kind of tough to not be in the mix, and so then how do you moderate your behavior accordingly? But I do think that contemplating deleting or disconnecting, even if it's just for a temporary detox, is a tool at your disposal to think about how you are speaking to these companies about what it is that you value and how you'd like them to show up in your life.

Speaker 1: You know, Camille, I think that there's often this pressure and tension that exists, especially for Black women who are entrepreneurs and creatives. So much of visibility is tied to, like, your next gig or partnerships or speaking engagements, right? How do you balance the need for privacy and protecting your mental health with the very real benefits that often come with being visible online?

Speaker 2: Yeah, that is about being intentional with what you are doing. Your professional persona or your professional pursuits require you to be visible; do that well. But that doesn't mean that you have to film in your home, or talk about your home life, or talk about where you went on vacation. There can be boundaries. And so my recommendation to people is not to stay offline. As you can see, I'm online. I think it is a great opportunity for connection, to get business, to meet new community, to understand how the things that you're passionate about align to other issues that are important in the moment, or just for entertainment and recreation. And so there are a number of reasons to engage online; do that thoughtfully. If it is about business only, or business accounts, keep it business.
581 00:32:17,680 --> 00:32:21,040 Speaker 2: If you are creating a professional-personal persona, kind of 582 00:32:21,040 --> 00:32:24,080 Speaker 2: this like hybrid, what are the boundaries on that? Is 583 00:32:24,120 --> 00:32:26,560 Speaker 2: it that your family isn't a part of the content? 584 00:32:26,800 --> 00:32:29,000 Speaker 2: Is it that your child isn't a part, or children 585 00:32:29,000 --> 00:32:30,840 Speaker 2: aren't a part of the content? Is it that you 586 00:32:30,880 --> 00:32:34,280 Speaker 2: want to protect your family, your parents? Really be thoughtful 587 00:32:34,280 --> 00:32:36,400 Speaker 2: about what those boundaries are and then go for it. 588 00:32:37,080 --> 00:32:39,520 Speaker 2: But just know also what you've put out there and 589 00:32:39,560 --> 00:32:42,760 Speaker 2: then react accordingly. So, for example, if you decide that 590 00:32:42,800 --> 00:32:46,880 Speaker 2: you've got this business-personal hybrid persona that you want 591 00:32:46,880 --> 00:32:49,160 Speaker 2: out in the world, because a big part of your 592 00:32:49,200 --> 00:32:53,479 Speaker 2: professional pursuits is your personal professional brand, the information you 593 00:32:53,520 --> 00:32:57,520 Speaker 2: share as part of that should not become your password. 594 00:32:58,040 --> 00:33:00,280 Speaker 2: That shouldn't connect back to the ways that you seek 595 00:33:00,320 --> 00:33:03,440 Speaker 2: to protect yourself and your family. So just be thoughtful 596 00:33:03,480 --> 00:33:05,719 Speaker 2: about those things and you should be fine. 597 00:33:05,880 --> 00:33:08,560 Speaker 1: You know, Camille, I feel like being online, there are 598 00:33:08,560 --> 00:33:11,040 Speaker 1: so many things I learned that I just never would 599 00:33:11,040 --> 00:33:14,920 Speaker 1: have thought about, you know, like seeing people on TikTok, unfortunately, 600 00:33:15,200 --> 00:33:18,960 Speaker 1: be able to guess somebody's location based on like the 601 00:33:19,000 --> 00:33:22,120 Speaker 1: angle the sun comes into their living room, or oh, 602 00:33:22,160 --> 00:33:24,120 Speaker 1: I stayed at this hotel and now I see someone 603 00:33:24,200 --> 00:33:27,560 Speaker 1: else, and then like sharing that information in the comments section, right? Like, 604 00:33:27,800 --> 00:33:29,640 Speaker 1: I think that there's just so much of that happening 605 00:33:29,640 --> 00:33:32,400 Speaker 1: that, you know, sometimes I think happens mindlessly, but it 606 00:33:32,440 --> 00:33:35,480 Speaker 1: does impact our safety. Are there other things that you'd 607 00:33:35,480 --> 00:33:38,000 Speaker 1: like to call attention to in terms of digital privacy 608 00:33:38,080 --> 00:33:40,880 Speaker 1: or cybersecurity that we haven't touched on that you think 609 00:33:40,880 --> 00:33:42,120 Speaker 1: are important for people to know? 610 00:33:43,720 --> 00:33:47,320 Speaker 2: I would just say, complementing that, nothing is foolproof, 611 00:33:47,320 --> 00:33:50,560 Speaker 2: and the technology and the circumstances change all the time. 612 00:33:50,600 --> 00:33:53,680 Speaker 2: Like I said, I thought putting an emoji over your 613 00:33:53,720 --> 00:33:55,760 Speaker 2: child's face might be okay for a time, and then 614 00:33:55,800 --> 00:33:58,680 Speaker 2: realized it wasn't, so you adapt.
But if you complement 615 00:33:58,720 --> 00:34:01,240 Speaker 2: that with some of those security behaviors that I've talked about, 616 00:34:01,280 --> 00:34:04,680 Speaker 2: like using 2FA and having a really good password manager, 617 00:34:04,760 --> 00:34:08,200 Speaker 2: updating your software, you'll build in a lot of protection 618 00:34:08,320 --> 00:34:11,680 Speaker 2: for you and your family. If you find yourself the 619 00:34:11,760 --> 00:34:15,080 Speaker 2: subject of some kind of online abuse or harassment, there's 620 00:34:15,080 --> 00:34:17,200 Speaker 2: a good book called How to Be a Woman Online: 621 00:34:17,239 --> 00:34:20,279 Speaker 2: Surviving Abuse and Harassment, and How to Fight Back by 622 00:34:20,400 --> 00:34:25,000 Speaker 2: Nina Jankowicz. There are also a number of folks online 623 00:34:25,000 --> 00:34:26,920 Speaker 2: that talk about these issues all day. You don't have 624 00:34:26,960 --> 00:34:29,440 Speaker 2: to do the heavy lifting. Let me and other people 625 00:34:30,480 --> 00:34:34,920 Speaker 2: look at the privacy policies for websites, look at the trends, 626 00:34:35,080 --> 00:34:38,800 Speaker 2: and highlight for you areas where you should or shouldn't 627 00:34:38,880 --> 00:34:44,200 Speaker 2: start to engage. Also, reject the instinct to be 628 00:34:44,280 --> 00:34:46,600 Speaker 2: a first adopter. I don't know how many of you 629 00:34:46,640 --> 00:34:49,920 Speaker 2: have heard about Mattel embedding AI into some of their toys. 630 00:34:49,960 --> 00:34:52,520 Speaker 2: They've been thinking about that. Please do not give your 631 00:34:52,600 --> 00:34:57,120 Speaker 2: child an AI toy without having understood exactly what information it's 632 00:34:57,160 --> 00:34:59,799 Speaker 2: going to collect and what it's going to do for 633 00:34:59,800 --> 00:35:03,000 Speaker 2: the child, how it's going to affect their learning. You 634 00:35:03,160 --> 00:35:05,560 Speaker 2: don't have to be a first adopter on these things. 635 00:35:05,640 --> 00:35:10,040 Speaker 2: I would say your best bet actually is to let 636 00:35:10,080 --> 00:35:12,600 Speaker 2: it roll out, see how people use it, see where 637 00:35:12,640 --> 00:35:14,560 Speaker 2: some of the harms pop up, and then adjust your 638 00:35:14,640 --> 00:35:19,040 Speaker 2: use accordingly. That'll really help you as you navigate the 639 00:35:19,120 --> 00:35:22,840 Speaker 2: space and just stay connected. Just be thoughtful and adapt 640 00:35:22,840 --> 00:35:24,359 Speaker 2: your behavior as things come up. 641 00:35:24,680 --> 00:35:28,000 Speaker 1: I was not aware that Mattel was embedding AI in 642 00:35:28,120 --> 00:35:30,960 Speaker 1: toys. That's new news to me, but it does 643 00:35:31,120 --> 00:35:33,600 Speaker 1: bring up a great question. You've already mentioned your daughter. 644 00:35:33,719 --> 00:35:36,440 Speaker 1: I also have two sons, and thinking about this is 645 00:35:36,480 --> 00:35:38,680 Speaker 1: a whole new thing now I have to talk about 646 00:35:39,120 --> 00:35:41,920 Speaker 1: with the kids, right? These schools are already talking about AI; 647 00:35:42,120 --> 00:35:44,719 Speaker 1: they're using it. What kinds of things do you think 648 00:35:44,800 --> 00:35:47,600 Speaker 1: as parents and caregivers are important to talk to our 649 00:35:47,719 --> 00:35:49,880 Speaker 1: kids about in terms of AI and the way that 650 00:35:49,920 --> 00:35:52,360 Speaker 1: it's evolving in our society.
651 00:35:52,680 --> 00:35:55,799 Speaker 2: First and foremost is to have conversations about it. This 652 00:35:55,840 --> 00:35:59,640 Speaker 2: should be a dinner table conversation, talking about the latest tools, 653 00:35:59,600 --> 00:36:02,480 Speaker 2: talking about your fears and concerns, talking about how 654 00:36:02,520 --> 00:36:06,320 Speaker 2: you're navigating it. Talk to your child about, oh, I 655 00:36:06,520 --> 00:36:09,920 Speaker 2: use this chatbot for a title for my article, but 656 00:36:10,160 --> 00:36:12,680 Speaker 2: I won't use it to write the article, and here's why. 657 00:36:13,320 --> 00:36:16,040 Speaker 2: Talk to them about the kinds of information they get 658 00:36:16,080 --> 00:36:19,040 Speaker 2: and how reliable or not reliable it is, the limitations. 659 00:36:19,080 --> 00:36:23,000 Speaker 2: That conversation piece goes a long, long way. Your kids 660 00:36:23,000 --> 00:36:25,520 Speaker 2: are creating their values and their boundaries based on what 661 00:36:25,560 --> 00:36:28,720 Speaker 2: they hear from you, and your candor about your fears, 662 00:36:28,760 --> 00:36:31,239 Speaker 2: your concerns, and you figuring it out is going to 663 00:36:31,239 --> 00:36:34,440 Speaker 2: be a big part of them wrestling with and understanding this. 664 00:36:35,400 --> 00:36:41,480 Speaker 2: That inclination to be skeptical about a technology, skeptical 665 00:36:42,000 --> 00:36:44,920 Speaker 2: when you get a random email: continue to reinforce that 666 00:36:45,400 --> 00:36:47,680 Speaker 2: with your children. Tell them to lean into that; those 667 00:36:47,760 --> 00:36:50,759 Speaker 2: instincts about things that are genuine versus not will be 668 00:36:51,440 --> 00:36:54,799 Speaker 2: a skill that they need long term. But don't keep 669 00:36:54,800 --> 00:36:57,400 Speaker 2: them away from technology. Moderate it based on their age. 670 00:36:57,400 --> 00:36:59,920 Speaker 2: I actually wrote a series about this where we talk 671 00:37:00,080 --> 00:37:03,719 Speaker 2: about, by age group, what are some of the things 672 00:37:03,719 --> 00:37:05,400 Speaker 2: that you should be thinking about, what are some of 673 00:37:05,400 --> 00:37:07,840 Speaker 2: the tools at your disposal, some of the resources that 674 00:37:07,920 --> 00:37:12,400 Speaker 2: explain well what your child's capacity is. Because the goal 675 00:37:12,520 --> 00:37:16,640 Speaker 2: is to both make them digitally resilient and savvy and 676 00:37:16,719 --> 00:37:19,359 Speaker 2: able to navigate the latest technology, because that will be 677 00:37:19,440 --> 00:37:22,120 Speaker 2: an increasing part of their future, particularly as they transition 678 00:37:22,280 --> 00:37:25,800 Speaker 2: to being workers and employees and thinking about their future. 679 00:37:26,520 --> 00:37:28,759 Speaker 2: But they also need to recognize that there is a 680 00:37:28,760 --> 00:37:31,880 Speaker 2: world of human connection, and to anchor their use in 681 00:37:31,880 --> 00:37:37,000 Speaker 2: the values and the connections and dynamics that 682 00:37:37,040 --> 00:37:40,200 Speaker 2: they hold dear. And that's a tough balance if you're 683 00:37:40,200 --> 00:37:44,960 Speaker 2: not having constant conversation.
So as you think about your children, 684 00:37:46,239 --> 00:37:48,719 Speaker 2: think clearly about what the limits should be on how 685 00:37:48,760 --> 00:37:51,760 Speaker 2: much TV, on how much they use the computer, what apps. 686 00:37:52,239 --> 00:37:55,120 Speaker 2: Where are the places where you should be imposing a 687 00:37:55,120 --> 00:37:57,799 Speaker 2: limit, and how are you checking in to make sure 688 00:37:57,800 --> 00:38:00,960 Speaker 2: that you're understanding how these tools evolve? You're gonna have 689 00:38:01,000 --> 00:38:02,719 Speaker 2: to be a little bit active on that, just like 690 00:38:02,800 --> 00:38:04,919 Speaker 2: you would be on where they're going to go after 691 00:38:04,960 --> 00:38:08,600 Speaker 2: school or who they're interacting with in real life. This 692 00:38:08,719 --> 00:38:11,160 Speaker 2: is going to take our active engagement as parents, because 693 00:38:11,200 --> 00:38:14,319 Speaker 2: we'll be learning as they're learning, but our lived experience 694 00:38:14,400 --> 00:38:17,880 Speaker 2: around how harm can manifest itself and how much access 695 00:38:17,880 --> 00:38:20,520 Speaker 2: people should and shouldn't have to your kids based on 696 00:38:20,560 --> 00:38:23,160 Speaker 2: their age is going to be integral to them figuring 697 00:38:23,160 --> 00:38:24,799 Speaker 2: this out, to you both figuring it out. 698 00:38:26,280 --> 00:38:28,640 Speaker 1: This has been so helpful, Camille. Thank you so much 699 00:38:28,680 --> 00:38:31,400 Speaker 1: for sharing all this information with us. Where can we 700 00:38:31,440 --> 00:38:33,640 Speaker 1: stay connected with you? What is your website, as well 701 00:38:33,640 --> 00:38:35,360 Speaker 1: as any social media channels you'd like to share? 702 00:38:35,239 --> 00:38:37,440 Speaker 2: Yes, so you can find me at 703 00:38:37,480 --> 00:38:41,200 Speaker 2: Camille Stewart Gloucester dot com. You can also find me 704 00:38:41,200 --> 00:38:46,640 Speaker 2: at Camille Esq on Instagram, and I also have 705 00:38:46,719 --> 00:38:50,840 Speaker 2: a Substack called Command Line with Camille. That's where you'll 706 00:38:50,880 --> 00:38:54,560 Speaker 2: find the articles I talked about, with age-appropriate, like, 707 00:38:54,640 --> 00:38:58,760 Speaker 2: alignment on technology use. You'll find articles about the changing 708 00:38:58,840 --> 00:39:03,440 Speaker 2: AI ecosystem, cybersecurity, and that is Substack dot com slash 709 00:39:03,520 --> 00:39:05,800 Speaker 2: Camille Esq. That's usually where you can find me in 710 00:39:05,840 --> 00:39:09,040 Speaker 2: most places, and I look forward to connecting with you all. 711 00:39:09,120 --> 00:39:12,759 Speaker 2: Let me do the heavy lifting on what's changing, when 712 00:39:12,880 --> 00:39:15,080 Speaker 2: and how, and how you secure yourself, and then you 713 00:39:15,160 --> 00:39:18,960 Speaker 2: do the light work of implementing it. 714 00:39:19,000 --> 00:39:20,839 Speaker 1: Perfect. We'll be sure to include all of that in the show notes. 715 00:39:20,840 --> 00:39:22,759 Speaker 1: Thank you so much for spending some time with us today. 716 00:39:22,800 --> 00:39:23,800 Speaker 1: I appreciate it. 717 00:39:23,960 --> 00:39:25,440 Speaker 2: My pleasure. Thank you for having me.
718 00:39:29,719 --> 00:39:31,799 Speaker 1: I'm so happy Camille was able to join us for 719 00:39:31,840 --> 00:39:34,719 Speaker 1: today's episode. To learn more about her and the work 720 00:39:34,760 --> 00:39:37,200 Speaker 1: that she's doing, be sure to visit the show notes 721 00:39:37,200 --> 00:39:40,479 Speaker 1: at Therapy for Blackgirls dot com slash Session four forty five, 722 00:39:41,120 --> 00:39:43,200 Speaker 1: and don't forget to text two of your girls right 723 00:39:43,200 --> 00:39:46,160 Speaker 1: now and tell them to check out the episode. Did 724 00:39:46,160 --> 00:39:47,880 Speaker 1: you know that you could leave us a voicemail with 725 00:39:47,920 --> 00:39:51,120 Speaker 1: your questions and suggestions for the podcast? If you have 726 00:39:51,239 --> 00:39:53,920 Speaker 1: books you think we should read, or movies we should watch, 727 00:39:54,239 --> 00:39:56,759 Speaker 1: or topics you think we should discuss, drop us a 728 00:39:56,800 --> 00:39:59,520 Speaker 1: message at Memo dot fm slash Therapy for Black Girls 729 00:39:59,600 --> 00:40:01,680 Speaker 1: and let us know what's on your mind. We just 730 00:40:01,760 --> 00:40:04,840 Speaker 1: might feature it on the podcast. If you're looking for 731 00:40:04,880 --> 00:40:07,880 Speaker 1: a therapist in your area, visit our therapist directory at 732 00:40:07,920 --> 00:40:11,960 Speaker 1: Therapy for Blackgirls dot com slash directory. Don't forget to 733 00:40:11,960 --> 00:40:15,000 Speaker 1: follow us on Instagram at Therapy for Black Girls and 734 00:40:15,080 --> 00:40:17,280 Speaker 1: come on over and join us in our Patreon community 735 00:40:17,280 --> 00:40:20,800 Speaker 1: for exclusive updates, behind-the-scenes content, and much more. 736 00:40:21,280 --> 00:40:24,040 Speaker 1: You can join us at community dot Therapy for Blackgirls 737 00:40:24,040 --> 00:40:28,560 Speaker 1: dot com. This episode was produced by Elise Ellis, Indechubu, 738 00:40:28,760 --> 00:40:32,879 Speaker 1: and Tyrie Rush. Editing was done by Dennison Bradford. Thank 739 00:40:32,960 --> 00:40:35,279 Speaker 1: y'all so much for joining me again this week. I 740 00:40:35,360 --> 00:40:38,720 Speaker 1: look forward to continuing this conversation with you all real soon. 741 00:40:39,360 --> 00:40:40,000 Speaker 1: Take good care.