Welcome to the Therapy for Black Girls Podcast, a weekly conversation about mental health, personal development, and all the small decisions we can make to become the best possible versions of ourselves. I'm your host, Dr. Joy Harden Bradford, a licensed psychologist in Atlanta, Georgia. For more information or to find a therapist in your area, visit our website at therapyforblackgirls.com. While I hope you love listening to and learning from the podcast, it is not meant to be a substitute for a relationship with a licensed mental health professional.

Hey, y'all, thanks so much for joining me for Session 227 of the Therapy for Black Girls Podcast. We'll get right into the episode after a word from our sponsors.

Our experiences in spaces like social media are heavily impacted by our identities and how we show up in the world. While this can be the grounds for increased connections and networks, it can also be the basis for things like racialized cyberbullying and harassment.
Joining me today to dig deeper into what these experiences look like online and how we might be impacted is Dr. Brendesha Tynes. Dr. Tynes is an Associate Professor of Education and Psychology and founding director of the Center for Empowered Learning and Development with Technology at the University of Southern California. She's a developmental psychologist whose research focuses on the racial landscape adolescents navigate in online settings, online racial discrimination, digital equity in schools, digital literacy, and the design of digital tools that empower students. A little later, you'll also hear from Dr. Danielle Hairston, who's the Director of Residency Training in the Department of Psychiatry at Howard University College of Medicine and a practicing psychiatrist in the Division of Consultation-Liaison Psychiatry at the University of Maryland Medical Center in Baltimore, Maryland. During our conversations, we chatted about what racialized cyberbullying and harassment might look like online, how it impacts young people, how it might impact our mental health, and how we can support ourselves and loved ones who might have had these experiences.
If there's something that resonates with you while enjoying our conversations, please share with us on social media using the hashtag #TBGInSession. Here's our conversation.

Dr. Joy: Thank you so much for joining us, Dr. Tynes.

Dr. Tynes: Oh, absolutely. Thank you so much for having me. I really appreciate it.

Dr. Joy: Well, I'm very excited to chat with you, because I know there has been so much going on related to, you know, just everything with the social media platforms and everything in the digital space, so I know that you would have a wealth of information to share with us.

Dr. Tynes: Yes, I'm excited to talk about my work, indeed.

Dr. Joy: So can you start by maybe just telling us a little bit about what you do as a developmental and an educational psychologist, and how your work touches the digital experience?
Dr. Tynes: Yes. So, developmental psychologists are more interested in changes in development over time and across different life stages, and these changes might be social, cognitive, emotional, or physical. Whereas the educational psychologist in me would apply theories of human development to teaching and learning, and we're more interested in the range of factors that might impact learning processes, be they social, cognitive, or emotional. And so my questions as a developmental psychologist might be more geared toward understanding how online experiences are associated with mental health outcomes over time, academic performance over time. And then as an educational psychologist, I'm more interested in blended learning types of experiences, or remote and online learning, and how those experiences might be associated with mental health outcomes.

Dr. Joy: Got it, okay. And so, you know, I think many of us have an understanding of what bullying looks like in person and in real life. But now that a lot of people have connections to mobile devices and computers, we have seen a rise in cyberbullying.
So can you just say a little bit about what cyberbullying is and some of the race-based cyberbullying that you've been studying?

Dr. Tynes: Yes. I actually cover a range of online victimization experiences. So there are general types of cyberbullying, which often include repeated experiences and, in some cases, a power imbalance. And then there's online racial discrimination, where you might be demeaned or denigrated or excluded because of your race in some sort of online environment or via text. And then there are also traumatic events online, which we think are different than online racial discrimination, and that's witnessing viral videos of state or police violence: seeing the police killing of an unarmed Black person, or witnessing an immigrant being placed in a cage. So when we talk about cyberbullying, sometimes we tend to lump all of those experiences together, but we try to make a distinction in the work that I do.

Dr. Joy: Got it, okay. And when you're talking about cyberbullying, let's start with the first category that you mentioned, right? So, people kind of making racial comments or those kinds of things.
Are these typically people that know one another, or is this somebody that is unknown to the victim?

Dr. Tynes: We've seen that it's both. So it could be people that they just might encounter, say, for example, when they're watching a YouTube video and reading the comments. And then it also could be people from school that they may know, and so it varies.

Dr. Joy: Mm. In your work, you've identified that adolescents of color are most likely to be the ones that suffer from online victimization. Can you say more about that work?

Dr. Tynes: Yes. So we actually stopped administering our measures that assess online racial discrimination and traumatic events online to white populations, just because we found early on that white kids didn't even know their race, right? And when we would ask them about racial discrimination, they would say, well, one time this British person was making fun of Americans, and so we were like, okay, we're talking about apples and oranges here. But early on, in the first study of online racial discrimination and psychological adjustment, for example, we found that a much larger percentage of the Black high school students in the sample experienced online racial discrimination, whereas far fewer of the white students said that they had those experiences. And now, again, they aren't able to recognize what's racial discrimination, and so that number could be inflated a bit.

Dr. Joy: And so did the research really talk about how this was impacting the young people?

Dr. Tynes: Yes. And so early on (we never published this study) we showed that online racial discrimination was not associated with any of our mental health outcomes for white children. I've been studying this topic for twenty years, and we've consistently found that online racial discrimination is associated with depressive symptoms. We've seen it associated with anxiety, and more recently we've done studies where it's actually associated with suicidal ideation. And so we're hoping to get that manuscript submitted very soon, because of this rise that we're finding with Black youth and suicide.
Dr. Joy: Yeah, you know, Dr. Tynes, I read something that you wrote, I think a couple of years ago, about how parents and adults have concerns around screen time related to devices, but really the greater concern is the cyberbullying that children and adolescents might be experiencing online.

Dr. Tynes: Absolutely. So I argued in that New York Times piece that just being on a screen does not tell us a whole lot about people's mental health outcomes. I think because a lot of times Black kids in particular may have positive experiences with race, where they're interacting with other Black people, they are supporting one another, they're learning how to critique and resist racism, right? And so, in some instances, their screen time is really beneficial. And we've actually found that connecting with other Black people online, for adolescents, is associated with increased empathy over time. We have both these positive and negative outcomes that can result from the time that people spend in front of screens.
Dr. Joy: So do you have suggestions for maybe how parents or adults can talk with their young people about whether these kinds of instances are happening, or whether they might even be perpetrators of something like this?

Dr. Tynes: Yes. So my main recommendation is for parents to make sure that they have open lines of communication with their children, and to also make sure that their children trust them with their secrets. And I want to make sure that parents aren't going to take away their children's devices if they find out that they're online and having these experiences, because the answer isn't to take away the devices. It's: how do we give the young person strategies to cope with the experiences once they happen, or help them adjust their settings so that they're not automatically seeing these traumatic events, for example, a police killing that comes up on their timelines? And so it's also helping them to put the messages in historical context, and showing them how we as Black people have been resilient and have been able to overcome having these types of experiences for centuries, and so this is another thing that we're going to get through. Those are my main recommendations.

Dr. Joy: I appreciate you sharing that historical context, right, because that is where the world is now. You know, we know historically racial discrimination and stuff was much more in real life, right, and of course that still happens, but there is this whole new world that technology has opened up that adds to what that can look like now.

Dr. Tynes: Yeah, yeah.

Dr. Joy: So, Dr. Tynes, I'm wondering if you have seen, either through your work or just through research that others have done, whether there is a difference between the social media platforms? Because it seems like younger people are drawn to platforms like TikTok and Snapchat, and maybe not so much to Twitter, Facebook, and Instagram. Does the bullying look different from platform to platform?

Dr. Tynes: So we haven't done any studies looking at differences across the platforms, but there have been several studies that suggest that girls might be more interested in more of the visual types of platforms, like Instagram.
And then we've consistently found that larger percentages of Black people are interested in social media platforms like Twitter. And so, for the experiences that we've asked our participants about, we haven't noticed differences across platforms. But in the early twenty-tens, our participants were more likely to experience online racial discrimination via Facebook and also through texts. We just have data on the spaces where people are more likely to witness online racial discrimination, but we don't have data on the nature of the experiences and how they might differ across platforms.

Dr. Joy: Yeah. I mean, recently in the news, you know, we are hearing more about how some of these platforms impact self-esteem, right? All the information that came out around the Instagram for teens that they were trying to create, and how teens actually don't necessarily have great experiences there; it actually makes them feel worse about themselves after spending time on the platform.
It feels like there's just so much more to learn there.

Dr. Tynes: We are starting to find out about algorithmic bias, and I created a scale of algorithmic bias that includes using these filters to change the way you look. And so we're trying to see, for those taking a pic with these filters, if it's associated with, you know, psychological functioning and racial identity over time. And we did a daily diary study. I'm so excited about this data. It's a daily diary study of algorithmic bias, traumatic events online, online racial discrimination, and then positive online messages about their race. And we're trying to see how those different types of experiences might be associated with depressive symptoms and anxiety across the seven-day period, and then also how their racial identity might fluctuate.

Dr. Joy: Oh, that sounds like it will be very interesting.
So we spent a lot of time talking about adolescents, but, you know, we know that as adults we also have experiences online. So, something that often comes up: we've had a lot of conversations with Black women on the podcast who have really had to limit their use of some of the platforms and being online because of the bullying and the name-calling and all of those things. So can you talk a little bit about what this might look like for adults?

Dr. Tynes: Yes. So I have been doing this research, like I said, for twenty years, and I have yet to forget an image or video. And I've had trouble myself looking at videos, especially of police killings, and so I put my settings so that these videos aren't automatically playing, because it could steal a whole day or week of productivity from me. We're finding that for emerging adults, eighteen to twenty-four, when they are witnessing these viral videos of police killings, it is associated with PTSD symptoms. I should say that when they have increased liberatory media literacy, which is a type of critical media literacy where people can critique racism, there's no association between the traumatic event online and PTSD symptoms.

Dr. Joy: Really?

Dr. Tynes: Yes.

Dr. Joy: So how would that work? Like, what kind of skill would you need to not have that impact you?

Dr. Tynes: You would need to know about systemic racism, you would need to place messages in historical context, you would need to know something about, possibly, white supremacy. You would also need to have a range of ways to cope. And so we didn't assess coping, but we think that, because people who are emerging adults may have more of a toolkit for managing racist experiences, that may be another reason why their critical media literacy is a buffer. But definitely, for the experiences that we did assess explicitly, like being able to recognize racism and critique it, we're showing that it's a definite buffer.

Dr. Joy: Mm-hmm, got it. You know, and I'm wondering if there's something that you can speak to about why Black women, Black trans people, some of the people who are most marginalized in culture, are often the targets for some of this behavior online.
Dr. Tynes: Yes. I think because so often Black women are the moral compass of platforms, and they're the ones pointing out injustice. And so when you're vocal about injustice, when you're able to resist, people find that threatening, and so, unfortunately, they get targeted more often. They are resilient and have strategies for managing these experiences, and we see them sort of modeling for other people how to manage the trolls that they might encounter, how to manage the racism.

Dr. Joy: Right. You know, they're always the people to watch, right? And, I mean, often that comes at our detriment, right? You know, that's why I said so many of us have had to limit either our time or put all these kinds of things in place to try to stop people from being able to respond to us. I mean, it really has become another job, figuring out just how to protect your peace when you're engaging online, I think for a lot of people.

Dr. Tynes: Oh yeah, absolutely.

Dr. Joy: And are there any resources that you would suggest for people who want to read more about the kinds of things that you research?
Dr. Tynes: Yes. So some of my work is open access, like the article in the Journal of Adolescent Health on traumatic events online and mental health outcomes. Child Development also recently published an article on online racial discrimination over time, and if it's not available now, it will be. If they do a Google Scholar search, most of my manuscripts should come up. And if they can't access the full article, then they can email me and I can send it to them.

Dr. Joy: Perfect. And what is your website, Dr. Tynes, and any social media handles that you want to share?

Dr. Tynes: On Twitter, I'm at Brendesha, B-R-E-N-D-E-S-H-A. And my website is the research center's site; I don't have a personal website.

Dr. Joy: Okay, we'll include it. Perfect. Well, thank you so much, Dr. Tynes. I really appreciate you sharing this with us today.

Dr. Tynes: Thank you. This was really a wonderful conversation.

Recently, HBO Max released a documentary called Fifteen Minutes of Shame that digs into the culture around online harassment and bullying and the impact it can have on those who have experienced it.
I was particularly struck by Taylor Dumpson's story, a young sister who at the time was a student at American University, and who experienced despicable threats and harassment after being elected as the first Black woman president of the student government. She shared that following these experiences, it was very triggering to even be on campus, and that she was diagnosed with PTSD, depression, and anxiety. It seems that when harassment occurs online, people tend to minimize it and encourage people to just walk away from the device, but it's not always that easy. Joining me now to talk about how cyberbullying and harassment can impact our mental health is Dr. Danny.

Dr. Joy: Well, thank you so much for joining me today, Dr. Danny.

Dr. Danny: Thank you, Dr. Joy, for having me. I'm, like, fangirling over here that I'm actually talking to you. You are an inspiration and influencer for all things Black girls and mental health. So I'm excited to be here.

Dr. Joy: I love it. Thank you so much for those kind words. I appreciate you joining me today to talk about this really important topic.
And it feels really important to also be talking with you, as you're the director of the residency program at Howard, right? And so, a little later, I want to get into how you're training your students to do some of this new work. Because it feels like, with the advent of and the increases in technology, and especially social media, it has really opened up the world in terms of connections, but it has also opened up the world in terms of harassment and bullying. And so we know that Black women in particular, it seems, experience this unique kind of flavor of disrespect and harassment. So can you say a little bit about the ways that you have seen Black women experience things like cyberbullying and racial harassment?

Dr. Danny: It's a dichotomy, right? Because you and I use social media, diverse media, for benefits: to educate people, to connect people, to network. So there's the positive side, but then there's a negative side as well, and it seems like Black women seem to get the worst of things, especially when there's a lot going on with body shaming.
I think that's huge, a lot of comparing to others, like, why don't you look like this? Why isn't her hair like this? Why are you doing that to your hair? Why do you look like this? And just coming for them, if you will, "coming for them" in quotes. I think that there's something that I notice a lot on the Twitter, a lot of people just feeling like they have the ability to just comment on every aspect of every single thing that you say. And I think that it's something that Black women go through the most. And I think that it's something that Black women experience. So it can be a lot, and it can weigh on people.

Dr Joy: It sometimes feels like people don't pay attention to or really recognize that these are actual people on the other side of the computer, right? And so there are real ramifications and consequences for people who have experienced this type of harassment and bullying. Can you say a little bit about how that might impact their mental health?

Dr Danny: Yeah, so it can be as extreme as people being driven to having suicidal thoughts, suicidal ideations. That's something that we've seen.
I mean, that's something I've seen, that your listeners and you have also seen posts about, like people who have been bullied, children even, who have been bullied to the extent that they either have attempted to take their lives, have taken their lives, or have at least thought about that. So that's the extreme level. Then there's also those who, you know, it drives you to lose sleep, people lose their appetite, people start, you know, just like with any other stressor. Social media, although, like I said, I can find it to be very helpful and beneficial and educational, it can also be a stressor, it can also be a trigger. It also can be a source of anxiety and depression, so we really have to be careful with it. So the things that you see with depression, the things that you see with traumatic disorders like PTSD or acute stress disorder: loss of sleep, an inability to concentrate.
When you see people whose lives are being distressed by these issues, like they're not able to go to work, they're not able to engage with people on a normal level, they're not interested in the same things, they're just being encompassed by social media and the attacks or the comments.

Dr Joy: I want to spend a little more time here, because you've been doing a lot of work advocating for the inclusion of media-based distress and media-based trauma in the DSM. Can you talk a little bit more about this term? Because I think a lot of people think about social media as frivolous and not necessarily something that could lead to somebody experiencing symptoms like PTSD, right? I want to hear more about the term that you're talking about, in terms of media-based distress, and your thinking for why it should be included in something like the DSM.

Dr Danny: The DSM was created years and years and years ago, and we have multiple versions, and they're changing with the times, and they change the diagnoses based on the culture and things.
And I think in this time where we have social media and diverse media, we really have to think about how things are felt vicariously, whether that is racism, whether that is bullying. It does not always have to be this direct thing that we've been discussing in DSM-5 when we talk about the diagnostic criteria, like what exactly is a trigger for PTSD or another acute stress disorder. We have to change what we're thinking about. Times are changing, so we have to change the way that we diagnose and the way that we assess. Now, if someone is on their phone constantly, especially for those for whom social media is a means of employment or getting their coins, who have to check social media frequently, to be constantly bombarded with negative comments, with hurtful statements, it's hurtful. And I've actually talked to creators, like creators on YouTube, who have had to deal with negative comments constantly.
And I think that a lot of times people see someone on screen but don't think about them as an actual person, and they are an actual person who can go through depressive episodes, who can be traumatized, who can be triggered, who are not able to just let things go, especially when it's constantly happening. I think that being able to create content, you are bringing your talent, right? You're bringing your talents to an audience. You're trying to inform people. You're trying to empower people. Also, you're trying to show off your talents or your art or your skills, and that's a privilege. And for you to be attacked constantly, or to be scrutinized constantly or criticized constantly, that can weigh on people. And when media is the source of this, that's where we get to the point where media can be traumatic, where social media can be traumatic, where this constant barrage of comments, of attacks, of questioning you, questioning your worth, questioning your skills, questioning your art or your talent, it can really weigh on someone.
So we can't say that, oh, it's only if you experience this, that's the only definition of trauma, because as we see, a lot of things can result in trauma, and we really have to come to a change and we really have to level up. There was a time when homosexuality was a mental health diagnosis in the DSM, which of course we know is completely unacceptable. So, you know, when we've looked at things changing then, we have to look at things changing now as well, and look at how media affects people, especially the younger generations.

Dr Joy: Mm hmm. So I would like to move into some of the interventions. So what would you tell somebody who is struggling with, like, maybe post-traumatic stress disorder symptoms, or things like depression and anxiety related to harassment or some experiences they've had online?

Dr Danny: Well, one thing about me, Dr Joy, my block game is strong. Let me tell you, I'll block you, I'll block your mother, your cousins, whoever it is, because I don't need that. Again, it's a privilege for you to see the content that I'm producing for you.
It's a privilege for someone to bring you into their life and try to relate to you and create relatable content and educate or entertain, whether you're singing or teaching. If someone is attacking you, you can block them. You can turn off commenting if you need to, and also let other people do some heavy lifting for you. Let other people, if they want to go back and forth, your friends, they want to comment, okay. If they want to stand up for you, they want to advocate for you, that's fine. But know that you do not have to respond to everything. And I think that is probably the most powerful thing you can do, is give them a response that stops people even more in their tracks by not giving a response. So everything does not deserve a response, and everything does not deserve your energy. If you do respond, respond with awareness and advocacy, respond with purpose and intention. Know what you're gonna say.
Don't just clap back and start cussing and saying it, because, you know, everything on these internets, on the interwebs, stays, especially if you're trying to build your brand or build whoever you are or what you represent. You don't want something that's going to be sticking around and floating around or becoming a meme forever. So if you do respond, respond with purpose, with awareness, intention. I think it's also important for people to decompress, whether it's a racist or a hateful interaction. Decompress, and decompressing will be different for everyone, but some things are a must, like closing the app. Close the app and walk away. Some people think that this is trivial, but counting to ten is also helpful, even if it sounds childish. Counting to ten, just taking some deep breaths and walking away from what it is that is triggering you or upsetting you. Remind yourself of why you're here, of how you got here, of what you did, your accomplishments big and small.
So when someone is attacking you and coming for your character, you can really say, I know who I am, I know that this person doesn't know me, I know what I'm about, I know my purpose. Deep breathing and music can also be helpful approaches. I think that how you decompress changes based on who you are. But maybe it's discussing things with your family, discussing things with your friends. Or if it's coming to the point where it's really damaging you, it's causing distress in your life, you're not able to go on with your regular activities of living, and it's hampering your life, talk to a professional.

Dr Joy: More from my conversation after the break. Dr Danny,
I have had conversations with people where it has gotten to the point, right? So people who maybe make their living doing, maybe, influencer work, or doing something that involves them, like, being on screen or being, like, active on social media channels, and they will have anxiety about, like, even opening the app, right? Or be really, really worried about what's gonna be in their DMs when they get there, and so it interferes with their ability to even create a life for themselves, right, and take care of themselves. Can you talk a little bit about, professionally, how you might work with a client who maybe came in with some of those struggles?

Dr Danny: Sure. So it's distressing. It's not for me or any other mental health professional to say, like, oh, it's just social media, or what you're going through isn't a big deal, because, as I said, and as you said, this is how people make ends meet, this is how people are funded, this is how they make their coins, this is how they literally get paid and put food on the table for themselves and their family. So it is a huge issue.
So first it's to acknowledge that this is a big issue, not to dismiss it, not to say, oh, this is just Instagram, or this is just Twitter, or you're a TikTok star or whatever. This is what is important to someone's life. So first you have to acknowledge what is going on and that this is having an impact on them. I think that's the most important, and then I would have to address it the same way I address all issues, whether it's postpartum depression or infidelity or cheating, or a medical issue that's causing you stress. We have to really open up our minds and open up our practices and our treatment plans to understanding that these things are issues. It's complex. So I think it's important that we acknowledge, and also we treat and we attack the symptoms that they're presenting with. So in my treatment, in my practice, yes, diagnoses are important, but I really try to understand what is the biggest thing distressing my patients. So what is the issue? Is it causing you a lack of sleep? Then I'm going to talk about your sleep. Is it causing you to have palpitations?
Is it causing you to have headaches? Like, what is going on? So that we can address it head on, so you can move on with your life, because these things can be very distressing.

Dr Joy: So I would like to hear, Dr Danny, a little bit about how you are training your residents to do some of this new work. I think a lot of us are kind of building the foundation as we teach, in a lot of ways, right? There is no manual for, like, some of these things that the field hasn't necessarily kept up with. So can you talk a little bit about how you're infusing maybe some of this into your training with your residents?

Dr Danny: So first, I tell them to take a look at themselves. I happen to be a program director, a teacher, and educator who uses social media. So I'm very into using diverse media, whether it's film and documentaries, whether it's social media, to inform people about what we're doing at Howard and in our program. But also take a look at yourself, take a look at how you're taking care of yourself.
What does wellness look like for you, so that you can understand what wellness needs to look like for others? So I really ask them to start with the basics and build a foundation for what people should know and what people need to know, and understanding why they're seeking help. And I try to make sure that my residents and students understand that there's not just one approach. There's not just therapy. There's also medication, there's also social support, there's self-assessment. You might need to bring in music, might need to bring in yoga, exercise, food, relaxation, whatever it is.
But I try to make sure that they understand patients and people not as just that picture that you see right there in your office or in the emergency room. Like, really try to understand people holistically and what they're dealing with in total, and understanding all the factors that are acting upon them, and understanding, especially for those who are living in this age where social media is important, where it drives you, where it's how you promote yourself, where it's how you market yourself, understand the large impact that it has on them. And I think it's easier, and luckily I have that advantage, that I am younger than some of my more senior or seasoned colleagues, who don't have that same connection to social media or really understand the intricacies of what it's like to receive these comments all the time, what it's like when people perceive or talk about your weaknesses or character flaws that they perceive; that's their perception. So I really get my residents to think about the people, their patients, as a whole, not just as a diagnosis.

Dr Joy: Mm hmm, yeah. You know, Dr Danny,
as we're talking, I'm thinking it could be helpful to even have questions on, like, our intake forms about online behavior, right? Because I could imagine people maybe feeling a little hesitant. I mean, we know it's already difficult to just come to a therapist's or psychiatrist's office, right? But if you feel like, I'm coming in talking about, like, oh, I got harassed on YouTube, I think for a lot of people they may worry that that isn't a valid concern, right? But it is a valid concern.

Dr Danny: I agree. I think that I work with a lot of college students and a lot of young adults, and they come in talking about it. They're like, yes. I'm like, well, what's going on? Well, this person said this on Facebook, and then I said this and this and this. And I'm like, oh, okay. And they're very open about it. And it's because it plays such a huge part in their lives that they don't hold back. But for those who are being harassed or feel that they are being traumatized or being triggered, they're not as open to it.
Because I have my residents ask about racism, and we talk about social justice and the media and how that impacts yourself, your parents, the older generations, and how generational trauma is really perpetuated. But we need to add social media in there too, because while we do have a lot of patients who are happy to talk about it, everyone is not as open. So that is something valuable and beneficial.

Dr Joy: Do you have other tips, Dr Danny, just for us to manage our peace and promote peace for ourselves and establish healthy boundaries?

Dr Danny: Something that has been helpful for me, and I also encourage my patients to do it, is turn off notifications; otherwise you no longer own your own time. And I learned this actually from one of my assistants: just selecting a time that you look at social media. Okay, when I wake up, I'm going to check the news and I'm gonna check Instagram from seven thirty to eight, or something like that.
Own that time, schedule that time, so it's not taking up all of your time throughout the day. Or maybe you only do it twice a day, or you only do it in the evening, but somehow compartmentalize that aspect so it's not taking over your entire day, which I know can be a struggle, especially for those who are creators, who create content, who have small businesses. Like, people are asking you questions in your DMs, and if you don't answer them quick enough, they're writing a negative comment about you. It's challenging, but you really have to find a balance that puts you first. And I think my biggest tip or take-home is saying yes to yourself and protecting yourself first. Promoting peace for yourself is what's most important. Like, self-care is not just a spa day. It's self-preservation, it's self-protection. It's saying no, it's stepping back, it's turning off notifications. I have found great joy in taking my work email notifications off my phone. Like, just that relieved so much stress for me. So important.
Dr Joy: I wonder if you could also share how we might be able to support family and friends who maybe have had an experience with being bullied or harassed online, or the things that we can do to support the other people in our lives.

Dr Danny: Yes. I think the first thing that we need to do, the same thing I was saying about my residents and students, is listen, like actually listen to what they're saying. Don't minimize their experience. Don't say it's just social media, it's not a big deal. Acknowledge that this is something serious and acknowledge that it actually has an impact. Acknowledge and recognize that this is something that can be triggering, this is something that can be traumatizing. And there's often stigma about even just having a conversation, and understand it's a huge step to even talk about mental health, to even talk about bullying, to even talk about the trauma that you're going through. So first go in with your listening ears, actually listen, listen to what someone is saying. Listen and learn, and don't listen just to have a response. And really do those check-ins.
I think that a lot of times people are like, oh, that's my strong friend, or, yeah, she looks great on social media, I don't need to check in on her, or I don't need to check in on him or them. But check in on them. If they share something with you about being on social media, if they've told you about bullying, check in on them, check in on them, and tell them when it's time to seek professional help. Now, I'm saying this, and I know, and you know, Dr Joy, that access to mental health care, access to therapy, is a privilege, and I know that it can be quite elitist, and it's not always based on what resources you have. It's not always the most readily available thing. But knowing when you are out of your league, knowing when you're not able to help: just listen, listen with purpose, and listen to try to understand what's happening.

Dr Joy: What are your thoughts about, like, the different platforms and the way that harassment and bullying maybe looks? My friends and I call Twitter the angry app, because it seems like that's where people come to, like, vent, to attack, again.
There are great 650 00:41:24,280 --> 00:41:27,040 Speaker 1: things like Black Twitter and the community and all of 651 00:41:27,040 --> 00:41:31,280 Speaker 1: those things, but I think that Twitter has a space 652 00:41:32,080 --> 00:41:37,480 Speaker 1: for foolishness because you can't delete someone's comments, like I 653 00:41:37,520 --> 00:41:40,200 Speaker 1: can't delete someone's comment. You know, on Instagram, someone says something, 654 00:41:40,719 --> 00:41:43,520 Speaker 1: you can just go right ahead and delete it. I 655 00:41:43,560 --> 00:41:47,480 Speaker 1: think on Twitter it allows for the foolishness to continue. 656 00:41:47,600 --> 00:41:51,040 Speaker 1: It allows for hateful comments to continue even though you 657 00:41:51,080 --> 00:41:53,680 Speaker 1: can hide them from yourself, but they're still there and 658 00:41:53,680 --> 00:41:56,719 Speaker 1: other people can comment on them unless you delete your 659 00:41:56,920 --> 00:41:58,880 Speaker 1: entire post, which you might not want to do if 660 00:41:58,920 --> 00:42:01,359 Speaker 1: you were saying something that you want 661 00:42:01,360 --> 00:42:03,480 Speaker 1: others to stay engaged with and that you wanted to have 662 00:42:03,520 --> 00:42:06,600 Speaker 1: some influence on. And I think that Twitter is the 663 00:42:06,640 --> 00:42:10,279 Speaker 1: social media platform that allows for the most attacking and 664 00:42:10,680 --> 00:42:14,000 Speaker 1: quick back and forth subtweeting and tweeting. And you can't 665 00:42:14,200 --> 00:42:17,440 Speaker 1: prevent them from commenting, and if you want to prevent 666 00:42:17,480 --> 00:42:19,960 Speaker 1: someone from retweeting what you said, then you have to 667 00:42:20,000 --> 00:42:24,600 Speaker 1: turn off retweeting for everything. And I think that what 668 00:42:24,719 --> 00:42:28,600 Speaker 1: we see on Instagram is a lot of like exposing people.
669 00:42:28,760 --> 00:42:31,520 Speaker 1: I think people like to expose people and 670 00:42:31,640 --> 00:42:35,840 Speaker 1: screenshot people, do the doxxing thing, on both Twitter and Instagram. 671 00:42:35,840 --> 00:42:38,560 Speaker 1: But I think that Instagram is more for like the oh, 672 00:42:38,600 --> 00:42:41,600 Speaker 1: I got the screenshot, like I got the receipts, you're 673 00:42:41,640 --> 00:42:46,560 Speaker 1: gonna see this here. But it's easier to block negativity 674 00:42:46,600 --> 00:42:50,400 Speaker 1: on YouTube and on Instagram because you have the control 675 00:42:50,480 --> 00:42:54,200 Speaker 1: to delete things on your own page. I actually appreciate 676 00:42:54,200 --> 00:42:57,520 Speaker 1: when I see content creators or people posting and saying, 677 00:42:57,600 --> 00:43:02,360 Speaker 1: like, any negative comment, I will delete. Like, 678 00:43:02,520 --> 00:43:04,760 Speaker 1: putting the boundary right there, like letting you know, don't 679 00:43:04,760 --> 00:43:08,720 Speaker 1: try it, but if you do, you're going to be deleted. 680 00:43:09,400 --> 00:43:11,839 Speaker 1: So Dr Danny, let us know, where can people find 681 00:43:11,880 --> 00:43:14,080 Speaker 1: you and keep up with you on the social media 682 00:43:14,160 --> 00:43:18,520 Speaker 1: channels? Like, we have all the socials, Dr Joy. You can 683 00:43:18,560 --> 00:43:22,520 Speaker 1: find me on Instagram and Twitter at a doc named Danni, 684 00:43:22,640 --> 00:43:27,239 Speaker 1: so like a doc named D-A-N-N-I, 685 00:43:27,280 --> 00:43:31,360 Speaker 1: on Instagram and Twitter.
I also am the content creator 686 00:43:31,440 --> 00:43:35,839 Speaker 1: behind a YouTube and Instagram platform called Black Psychiatry, which 687 00:43:35,920 --> 00:43:39,600 Speaker 1: includes lots of Black psychiatrists and other mental health professionals 688 00:43:39,680 --> 00:43:43,560 Speaker 1: talking about all things Black mental health for the community, free, 689 00:43:44,000 --> 00:43:48,440 Speaker 1: easily accessible. We have lots of content there, and also 690 00:43:48,600 --> 00:43:51,520 Speaker 1: you can find me. I have my own podcast. It's 691 00:43:51,560 --> 00:43:54,080 Speaker 1: called The Next Seventy-Two Hours. It's wherever you find 692 00:43:54,080 --> 00:44:00,480 Speaker 1: your podcasts, Apple, Spotify, Google, and it's about racism, historical racism, 693 00:44:00,640 --> 00:44:04,880 Speaker 1: current racism, and mental health in medicine and psychiatry. So 694 00:44:04,960 --> 00:44:09,120 Speaker 1: that's definitely something that I would appreciate your listeners listening to. 695 00:44:09,520 --> 00:44:14,400 Speaker 1: And I have a website. It's Danielle Hairston MD dot com. Perfect. 696 00:44:14,400 --> 00:44:16,319 Speaker 1: Thank you so much for spending some time with us today, 697 00:44:16,440 --> 00:44:22,120 Speaker 1: Dr Danny. Thank you. I'm so thankful that Dr Tynes 698 00:44:22,160 --> 00:44:24,400 Speaker 1: and Dr Danny were able to share their expertise with 699 00:44:24,480 --> 00:44:27,120 Speaker 1: us today. To learn more about them and their work, 700 00:44:27,520 --> 00:44:29,600 Speaker 1: be sure to visit the show notes at Therapy for 701 00:44:29,640 --> 00:44:32,960 Speaker 1: Black Girls dot com slash session two thirty seven, and 702 00:44:33,000 --> 00:44:36,200 Speaker 1: don't forget to text two sisters right now and tell 703 00:44:36,280 --> 00:44:39,160 Speaker 1: them to check out the episode.
If you're looking for 704 00:44:39,200 --> 00:44:41,839 Speaker 1: a therapist in your area, be sure to check out 705 00:44:41,840 --> 00:44:44,839 Speaker 1: our therapist directory at Therapy for Black Girls dot com 706 00:44:44,920 --> 00:44:48,279 Speaker 1: slash directory, and if you want to continue digging into 707 00:44:48,280 --> 00:44:51,240 Speaker 1: this topic or just be in community with other sisters, 708 00:44:51,680 --> 00:44:53,760 Speaker 1: come on over and join us in the Sister Circle. 709 00:44:54,239 --> 00:44:56,840 Speaker 1: It's our cozy corner of the Internet designed just for 710 00:44:56,920 --> 00:45:00,320 Speaker 1: Black women. You can join us at community dot Therapy 711 00:45:00,320 --> 00:45:03,480 Speaker 1: for Black Girls dot com. Thank you all so much 712 00:45:03,520 --> 00:45:06,040 Speaker 1: for joining me again this week. I look forward to 713 00:45:06,080 --> 00:45:10,000 Speaker 1: continuing this conversation with you all real soon. Take good 714 00:45:10,080 --> 00:45:10,360 Speaker 1: care.