1 00:00:05,160 --> 00:00:06,760 Speaker 1: Hey, this is Annie and Samantha. 2 00:00:06,760 --> 00:00:08,319 Speaker 2: And welcome to Stuff Mom Never Told You, a production 3 00:00:08,360 --> 00:00:22,480 Speaker 2: of iHeartRadio, and welcome to another classic. Samantha 4 00:00:22,520 --> 00:00:25,200 Speaker 2: and I have been talking a lot about health lately 5 00:00:26,480 --> 00:00:29,319 Speaker 2: because we both had some issues that we've had to 6 00:00:30,440 --> 00:00:36,560 Speaker 2: seek medical attention for. And we're both having trouble, now 7 00:00:36,600 --> 00:00:42,800 Speaker 2: that we've finally taken that step, getting that medical care. Yeah, 8 00:00:42,840 --> 00:00:45,080 Speaker 2: which is no laughing matter, but it's just we put 9 00:00:45,080 --> 00:00:47,760 Speaker 2: it off for so long, and now it's like, you 10 00:00:47,920 --> 00:00:49,879 Speaker 2: really, really should have done it earlier, at 11 00:00:49,880 --> 00:00:52,720 Speaker 1: least in my case, but I did. 12 00:00:52,800 --> 00:00:57,880 Speaker 2: Recently, both of us have seen medical professionals, and 13 00:00:58,360 --> 00:01:01,040 Speaker 2: there were some things that really stuck out to me with my 14 00:01:01,200 --> 00:01:05,160 Speaker 2: visit to the hospital. And one thing was, I could 15 00:01:05,280 --> 00:01:11,880 Speaker 2: tell that some people thought I was trying to diagnose myself. 16 00:01:12,520 --> 00:01:16,120 Speaker 2: And this is something that you have talked about. You 17 00:01:16,240 --> 00:01:21,320 Speaker 2: did an episode about being on TikTok and diagnosing yourself, 18 00:01:21,360 --> 00:01:23,920 Speaker 2: more around mental health than anything else. But I could 19 00:01:24,000 --> 00:01:27,840 Speaker 2: tell that some people, because I said it 20 00:01:27,959 --> 00:01:29,480 Speaker 2: in the episode where I talked about it, 21 00:01:29,480 --> 00:01:31,440 Speaker 1: But I went to like fifteen.
22 00:01:31,040 --> 00:01:35,880 Speaker 2: doctors, not doctors, but like personnel, and some of them 23 00:01:36,000 --> 00:01:40,319 Speaker 2: I think were like, she saw this online. 24 00:01:40,480 --> 00:01:42,880 Speaker 1: Now she thinks she's got it, from social media. 25 00:01:43,680 --> 00:01:46,640 Speaker 2: I wasn't even saying that. I was just saying, like, I, 26 00:01:46,720 --> 00:01:48,720 Speaker 2: you know, it always happens in the morning, and I've 27 00:01:48,720 --> 00:01:52,000 Speaker 2: read that... I know how it sounds, I know 28 00:01:52,040 --> 00:01:54,600 Speaker 2: how it sounds. But I'm not on TikTok, and that's 29 00:01:54,680 --> 00:01:59,000 Speaker 2: not what was happening. But I do think a lot 30 00:01:59,040 --> 00:02:01,200 Speaker 2: of people do this, kind of a running joke about 31 00:02:01,200 --> 00:02:05,320 Speaker 2: like the WebMD "oh, I'm dying" thing. 32 00:02:06,160 --> 00:02:09,600 Speaker 2: But is that what you're saying to me? Oh no, never, never, Samantha, 33 00:02:10,800 --> 00:02:19,040 Speaker 2: but yes, please enjoy this classic episode. Hey, this is 34 00:02:19,080 --> 00:02:21,280 Speaker 2: Annie and Samantha, and welcome to Stuff Mom Never Told 35 00:02:21,280 --> 00:02:22,480 Speaker 2: You, a production of iHeartRadio. 36 00:02:31,960 --> 00:02:35,800 Speaker 3: Yeah, and welcome to another Monday Mini. And for today's 37 00:02:35,919 --> 00:02:40,200 Speaker 3: Monday Mini, we're gonna play What's Samantha's Diagnosis with y'all. 38 00:02:40,280 --> 00:02:43,160 Speaker 3: I'm having a moment right now because I just told 39 00:02:43,200 --> 00:02:46,560 Speaker 3: Annie I'm smelling something very floral. I'm trying to figure 40 00:02:46,560 --> 00:02:50,000 Speaker 3: out why, because I'm in my little cubby, and honestly, 41 00:02:50,919 --> 00:02:53,680 Speaker 3: this floral smell is almost like a perfume. So I'm like 42 00:02:53,760 --> 00:02:56,040 Speaker 3: freaking out.
I'm like, am I having a stroke? I'm 43 00:02:56,080 --> 00:02:58,320 Speaker 3: concerned. I don't think so. And if you see me 44 00:02:58,360 --> 00:03:01,560 Speaker 3: doing things, please let me know. Might be an odd episode. 45 00:03:02,639 --> 00:03:07,000 Speaker 3: Yeah, anyway. But I mean, I say all this kiddingly, 46 00:03:07,560 --> 00:03:10,400 Speaker 3: but I mean, I'm not gonna lie when I say 47 00:03:10,440 --> 00:03:12,399 Speaker 3: I might go look on TikTok. I might go look 48 00:03:12,400 --> 00:03:16,800 Speaker 3: on TikTok. Honestly, as much as I love TikTok, there are 49 00:03:16,880 --> 00:03:18,680 Speaker 3: so many things that TikTok has told me that I'm 50 00:03:18,680 --> 00:03:20,959 Speaker 3: starting to get really concerned about a lot of things, 51 00:03:21,080 --> 00:03:23,840 Speaker 3: and in fact, I've also learned about conditions that 52 00:03:24,120 --> 00:03:27,839 Speaker 3: I didn't know existed, and I'm feeling like maybe that's part 53 00:03:27,840 --> 00:03:29,840 Speaker 3: of my problem, because I really feel like there's something 54 00:03:29,880 --> 00:03:32,359 Speaker 3: off all the time. And when I say something off, 55 00:03:32,400 --> 00:03:35,440 Speaker 3: I mean like something's imbalanced in my body. And it 56 00:03:35,480 --> 00:03:38,800 Speaker 3: could be that I'm going through perimenopause, which we're going 57 00:03:38,840 --> 00:03:40,320 Speaker 3: to talk about at a later date, because I'm like, 58 00:03:40,360 --> 00:03:42,920 Speaker 3: what is that, and have I entered that? And what 59 00:03:42,960 --> 00:03:46,680 Speaker 3: does that look like? As well as maybe it's something 60 00:03:46,680 --> 00:03:49,000 Speaker 3: that I've always been going through that I did 61 00:03:49,040 --> 00:03:51,560 Speaker 3: not realize wasn't normal. When I talk to people, I'm like, oh, 62 00:03:51,560 --> 00:03:54,640 Speaker 3: that's not normal.
I found out that Asian people 63 00:03:54,680 --> 00:03:56,800 Speaker 3: have a gene that makes their earwax different from 64 00:03:56,840 --> 00:03:59,760 Speaker 3: other people. I was like, what? And then I started 65 00:03:59,800 --> 00:04:01,800 Speaker 3: asking around, like, is this the type of earwax 66 00:04:01,840 --> 00:04:03,960 Speaker 3: you have? And they're like, yes. And I was like, what. 67 00:04:04,320 --> 00:04:07,200 Speaker 1: I love that you asked around about it. 68 00:04:08,080 --> 00:04:09,760 Speaker 3: Like, white people, tell me what this is. 69 00:04:14,520 --> 00:04:16,880 Speaker 1: I know you asked multiple. 70 00:04:16,560 --> 00:04:19,960 Speaker 3: I asked multiple people. But like, this is all because 71 00:04:20,600 --> 00:04:23,560 Speaker 3: of TikTok. These are things that I'm like, huh, I 72 00:04:23,640 --> 00:04:26,960 Speaker 3: did not know this. Again, like I said, because I'm 73 00:04:26,960 --> 00:04:29,039 Speaker 3: watching these things and these things are popping up on 74 00:04:29,120 --> 00:04:31,839 Speaker 3: my For You page, I started to become a little 75 00:04:31,880 --> 00:04:35,479 Speaker 3: concerned with all the similarities I seem to be having 76 00:04:35,520 --> 00:04:41,560 Speaker 3: with all these mental and medical health things, and I'm like, what, 77 00:04:41,560 --> 00:04:45,520 Speaker 3: what's going on? And with that, I began to wonder 78 00:04:45,600 --> 00:04:50,360 Speaker 3: if others were feeling and noticing this too. And yes, 79 00:04:50,360 --> 00:04:53,720 Speaker 3: there are article after article with similar questions, and even 80 00:04:53,800 --> 00:04:57,239 Speaker 3: ones concerned about how it's affecting the youth today.
81 00:04:58,160 --> 00:04:59,760 Speaker 3: But I will say we're going to go back and 82 00:04:59,760 --> 00:05:02,520 Speaker 3: forth a bit on this for a minute, because it's 83 00:05:02,560 --> 00:05:04,560 Speaker 3: also one of those moments of like, we need things 84 00:05:04,600 --> 00:05:07,919 Speaker 3: like this to be taken seriously. There are two different 85 00:05:07,960 --> 00:05:10,760 Speaker 3: perspectives, obviously. With this one, it's a good thing because 86 00:05:10,760 --> 00:05:14,520 Speaker 3: it brings awareness to mental health and physical health for 87 00:05:14,600 --> 00:05:17,440 Speaker 3: those who've been gaslit by the medical field. 88 00:05:17,800 --> 00:05:19,960 Speaker 3: You can see our episodes 89 00:05:19,960 --> 00:05:23,240 Speaker 3: about why women are not believed by medical professionals, 90 00:05:23,279 --> 00:05:26,360 Speaker 3: and even our episode on women with autism and ADHD 91 00:05:26,920 --> 00:05:30,520 Speaker 3: and how misdiagnosed that is for women specifically, and for 92 00:05:30,560 --> 00:05:32,680 Speaker 3: those in the Black community specifically, like for those in 93 00:05:32,720 --> 00:05:35,320 Speaker 3: marginalized communities, how often this is misdiagnosed or not 94 00:05:35,360 --> 00:05:39,200 Speaker 3: diagnosed at all, or just disregarded altogether.
And the information 95 00:05:39,240 --> 00:05:41,600 Speaker 3: on how to be taken seriously by their doctors has 96 00:05:41,640 --> 00:05:45,080 Speaker 3: been validating and even life saving. To the point that, yes, 97 00:05:45,480 --> 00:05:48,400 Speaker 3: I know doctors are really, really, really, really, really upset 98 00:05:48,440 --> 00:05:52,440 Speaker 3: with WebMD, and getting more so with TikTok, because people 99 00:05:52,480 --> 00:05:54,920 Speaker 3: are like, I saw this on TikTok, I googled this 100 00:05:55,000 --> 00:05:58,599 Speaker 3: on WebMD. And yes, I think that TikTok has become 101 00:05:58,720 --> 00:06:02,200 Speaker 3: the new WebMD, which many have talked about recently. 102 00:06:02,200 --> 00:06:04,480 Speaker 3: I saw a Reddit post about it, like it's just 103 00:06:04,600 --> 00:06:06,839 Speaker 3: the new WebMD, and people complaining about it. I'm like, 104 00:06:06,960 --> 00:06:11,800 Speaker 3: it's kind of true. But we've talked about before, Annie, how 105 00:06:11,960 --> 00:06:15,279 Speaker 3: TikTok has become one of the best, or the bigger, 106 00:06:15,600 --> 00:06:19,520 Speaker 3: search engines for people today. Like, instead of going to Google, 107 00:06:19,600 --> 00:06:23,000 Speaker 3: they're going to TikTok. And I wonder if that's partially 108 00:06:23,040 --> 00:06:26,760 Speaker 3: because not only did Google get bought out, essentially, by 109 00:06:26,800 --> 00:06:28,800 Speaker 3: people who wanted their content to be at the top, 110 00:06:29,279 --> 00:06:30,960 Speaker 3: whoever can do this and whoever has the more 111 00:06:31,000 --> 00:06:34,560 Speaker 3: money, big corporate companies coming in. And for a while, 112 00:06:35,080 --> 00:06:38,720 Speaker 3: TikTok was not like that. It has drastically changed.
Please 113 00:06:38,760 --> 00:06:42,920 Speaker 3: see previous episodes with Bridget about consumer stuff as well 114 00:06:42,960 --> 00:06:45,760 Speaker 3: as influencer stuff and who's getting bought out. We could 115 00:06:45,760 --> 00:06:51,680 Speaker 3: talk about the genocide that is currently occurring and influencers 116 00:06:51,680 --> 00:06:55,839 Speaker 3: who are being bought out by pro specific sides, I 117 00:06:55,839 --> 00:06:58,640 Speaker 3: guess you can say. You know, like, they've been called out. 118 00:06:58,839 --> 00:07:03,000 Speaker 3: They've talked about big lobbyists going after politicians to try 119 00:07:03,000 --> 00:07:04,960 Speaker 3: to run against other politicians whose agenda they did not 120 00:07:05,000 --> 00:07:07,320 Speaker 3: like. We've known that's happened with the NRA, but they're 121 00:07:07,320 --> 00:07:11,240 Speaker 3: actually using TikTok and social media influencers like that and 122 00:07:11,280 --> 00:07:14,320 Speaker 3: buying people out like that. So we know that TikTok 123 00:07:14,320 --> 00:07:17,640 Speaker 3: has changed, unfortunately, from what it was originally, when you 124 00:07:17,720 --> 00:07:22,080 Speaker 3: could get more, probably, genuine, I guess, posts. Whether it's 125 00:07:22,440 --> 00:07:26,240 Speaker 3: people talking about their own experiences with whatever diagnosis they 126 00:07:26,280 --> 00:07:29,560 Speaker 3: may have or whatever physical ailment they may have, and 127 00:07:29,600 --> 00:07:32,040 Speaker 3: being able to understand what it is, and for a 128 00:07:32,040 --> 00:07:34,560 Speaker 3: lot of people figuring out, oh, that's what's been happening 129 00:07:34,600 --> 00:07:39,560 Speaker 3: with me. POTS, postural orthostatic tachycardia syndrome, and I 130 00:07:39,560 --> 00:07:42,520 Speaker 3: hope I said that right.
Which is POTS, which has 131 00:07:42,520 --> 00:07:47,280 Speaker 3: like dizziness or lightheadedness, fainting or almost fainting, noticeable heartbeats 132 00:07:47,440 --> 00:07:49,920 Speaker 3: or heart palpitations, chest pains, shortness of breath, shaking, 133 00:07:50,000 --> 00:07:53,040 Speaker 3: and sweating. Those are some of the symptoms. I'm wondering 134 00:07:53,040 --> 00:07:55,240 Speaker 3: if I need to question my doctor about that, because 135 00:07:55,360 --> 00:07:58,520 Speaker 3: I definitely have fainted. Not a lot, because I've seen 136 00:07:58,520 --> 00:08:02,240 Speaker 3: people with severe POTS who faint constantly, like they have 137 00:08:02,320 --> 00:08:04,760 Speaker 3: to have a service dog because the dog can feel it. 138 00:08:05,000 --> 00:08:06,640 Speaker 3: But a lot of those are like, yeah, I do that, 139 00:08:06,640 --> 00:08:08,760 Speaker 3: that's not normal. Not everybody does that. I thought we 140 00:08:08,800 --> 00:08:11,880 Speaker 3: all did that. So there are different questions. Things 141 00:08:11,920 --> 00:08:14,760 Speaker 3: like that can be really, really helpful and life saving. 142 00:08:14,840 --> 00:08:17,480 Speaker 3: In a twenty twenty two Time article titled For Some 143 00:08:17,600 --> 00:08:21,680 Speaker 3: Women With ADHD, TikTok Is the First Place They Felt Heard, 144 00:08:22,000 --> 00:08:24,560 Speaker 3: they talk about how TikTok has been amazing for 145 00:08:24,680 --> 00:08:27,200 Speaker 3: them, for the women who have figured this out. And 146 00:08:27,240 --> 00:08:30,400 Speaker 3: here's a quote from a woman who discovered her diagnosis 147 00:08:30,440 --> 00:08:35,000 Speaker 3: after researching and finding videos on TikTok.
For many women 148 00:08:35,080 --> 00:08:37,680 Speaker 3: who see these videos in their feed, it's the first 149 00:08:37,720 --> 00:08:40,319 Speaker 3: time they've learned about some of the symptoms of ADHD 150 00:08:40,760 --> 00:08:44,760 Speaker 3: beyond the most widely known: hyperactivity and trouble focusing. Quote: 151 00:08:44,840 --> 00:08:48,200 Speaker 3: As an overachieving child who got good grades, ADHD was 152 00:08:48,320 --> 00:08:51,040 Speaker 3: never on my radar, Lace told Time in an email. 153 00:08:51,120 --> 00:08:54,240 Speaker 3: I was shocked to discover through TikTok that my experiences 154 00:08:54,320 --> 00:08:57,199 Speaker 3: were consistent with ADHD. And she's not the only one. 155 00:08:57,240 --> 00:08:59,720 Speaker 3: We talked about this before. So many like her finally 156 00:08:59,760 --> 00:09:02,800 Speaker 3: had definitions for things that she was doing that 157 00:09:02,840 --> 00:09:06,600 Speaker 3: were on a clinical level, such as masking, or even 158 00:09:07,240 --> 00:09:10,720 Speaker 3: being misdiagnosed, concluding that women were quiet due to being 159 00:09:10,800 --> 00:09:13,960 Speaker 3: shy or even less intelligent, instead of seeing that they 160 00:09:13,960 --> 00:09:18,640 Speaker 3: were being inattentive due to ADHD and were incorrectly diagnosed 161 00:09:18,760 --> 00:09:22,720 Speaker 3: or were told that's not a diagnosis of ADHD. Which, 162 00:09:22,760 --> 00:09:25,520 Speaker 3: again, we've talked about previously. We've had guest Kate, 163 00:09:25,640 --> 00:09:29,080 Speaker 3: who was amazing, talking about her diagnosis of ADHD 164 00:09:29,440 --> 00:09:32,520 Speaker 3: and what that looks like for women and why it's 165 00:09:32,600 --> 00:09:37,400 Speaker 3: so hard even today. But the message is spreading, and 166 00:09:37,480 --> 00:09:39,520 Speaker 3: social media has been a big part of that.
TikTok 167 00:09:39,600 --> 00:09:43,480 Speaker 3: specifically has been a huge part of that. There's also 168 00:09:43,600 --> 00:09:48,520 Speaker 3: the awareness that helps destigmatize a mental health diagnosis. More 169 00:09:48,520 --> 00:09:50,880 Speaker 3: people are able to talk about their experiences and talk 170 00:09:50,920 --> 00:09:52,760 Speaker 3: about what they did, and even how they were finally 171 00:09:52,760 --> 00:09:55,280 Speaker 3: able to get people to listen to them, which helped 172 00:09:55,320 --> 00:09:59,200 Speaker 3: others along the way. There were several articles about how 173 00:09:59,200 --> 00:10:02,480 Speaker 3: it's destigmatized mental health for men, helping men 174 00:10:02,600 --> 00:10:05,320 Speaker 3: get help, or realizing that they need help, and what 175 00:10:05,360 --> 00:10:07,720 Speaker 3: that help can do for them. So those are some 176 00:10:07,800 --> 00:10:12,440 Speaker 3: amazing things that have occurred on TikTok specifically. But in 177 00:10:12,480 --> 00:10:16,040 Speaker 3: an article written about teenagers and TikTok from CNN, they 178 00:10:16,040 --> 00:10:18,920 Speaker 3: talk about the good and the bad, and some parents 179 00:10:18,920 --> 00:10:22,080 Speaker 3: feel it has helped them with their teens. And here's 180 00:10:22,080 --> 00:10:24,960 Speaker 3: a quote from that. Julie Fulter from Raleigh, North Carolina 181 00:10:25,040 --> 00:10:28,880 Speaker 3: said she began following ADHD influencers who were able to 182 00:10:28,920 --> 00:10:32,640 Speaker 3: better explain behaviors, impulsivity, and how the condition is related 183 00:10:32,640 --> 00:10:35,520 Speaker 3: to executive functioning, so she can help her daughter navigate 184 00:10:35,559 --> 00:10:40,720 Speaker 3: her diagnosis.
Meanwhile, another person from upstate New York feels 185 00:10:40,720 --> 00:10:43,559 Speaker 3: mixed about her daughter using social media for reasons related 186 00:10:43,600 --> 00:10:46,680 Speaker 3: to her autism diagnosis. She's doing a lot of self 187 00:10:46,679 --> 00:10:49,760 Speaker 3: discovery right now in so many areas, and social media 188 00:10:49,840 --> 00:10:51,800 Speaker 3: is a big part of that, she said. I know 189 00:10:51,920 --> 00:10:54,800 Speaker 3: social media gets a bad rap, but in her case, 190 00:10:54,920 --> 00:10:57,760 Speaker 3: it's hard to tell sometimes if the pros outweigh the 191 00:10:57,800 --> 00:11:01,360 Speaker 3: cons. And it goes on: And many adults appear to 192 00:11:01,480 --> 00:11:05,559 Speaker 3: credit social media with helping them identify lifelong mental health struggles. 193 00:11:05,720 --> 00:11:08,880 Speaker 3: One person, a thirty five year old professional photographer, says 194 00:11:08,960 --> 00:11:12,040 Speaker 3: she sought guidance from a professional after seeing videos pop 195 00:11:12,120 --> 00:11:26,800 Speaker 3: up on her TikTok For You page about ADHD. Okay, 196 00:11:27,040 --> 00:11:30,280 Speaker 3: this For You page really has me thinking. And I've 197 00:11:30,320 --> 00:11:33,840 Speaker 3: made this statement, and I'm not trying to be ableist 198 00:11:33,960 --> 00:11:35,520 Speaker 3: or any of that, I'm not making light of it, 199 00:11:35,559 --> 00:11:38,320 Speaker 3: but I'm like, maybe I am on the spectrum, because 200 00:11:38,360 --> 00:11:40,559 Speaker 3: at this point in time, I'm having a rough time 201 00:11:40,640 --> 00:11:44,800 Speaker 3: with these types of cognitive functioning and social functioning, and I'm 202 00:11:44,840 --> 00:11:50,000 Speaker 3: like, what is this?
But let's be very honest: when 203 00:11:50,040 --> 00:11:54,480 Speaker 3: it comes down to coming out of the pandemic, getting older, 204 00:11:54,720 --> 00:11:57,920 Speaker 3: getting tired, things change. And not that I couldn't be 205 00:11:58,000 --> 00:12:01,319 Speaker 3: on the spectrum, I could get that diagnosis, but 206 00:12:01,480 --> 00:12:04,400 Speaker 3: the more you hear things, the more you start kind 207 00:12:04,440 --> 00:12:09,160 Speaker 3: of convincing yourself, yes, that's me. Which for some is 208 00:12:09,280 --> 00:12:11,960 Speaker 3: very helpful; for others, it could go down a 209 00:12:12,040 --> 00:12:15,360 Speaker 3: dangerous route. Again, like I'm saying, like, I'm smelling things, 210 00:12:15,400 --> 00:12:18,440 Speaker 3: what is happening? That's why it seems to be okay. Annie, 211 00:12:18,440 --> 00:12:19,240 Speaker 3: would you agree? 212 00:12:19,520 --> 00:12:22,200 Speaker 1: I would, but you know I'm keeping an eye out. 213 00:12:22,440 --> 00:12:24,440 Speaker 1: I'm keeping this. Please keep an eye out. 214 00:12:25,720 --> 00:12:28,760 Speaker 3: But again, that same article about teenagers, they talk about 215 00:12:28,760 --> 00:12:31,520 Speaker 3: how there are ups and downs, but it seems with 216 00:12:31,600 --> 00:12:35,640 Speaker 3: even more risk with the self diagnosing. So from that article, 217 00:12:37,040 --> 00:12:39,599 Speaker 3: it says a growing number of teens are turning to 218 00:12:39,640 --> 00:12:43,440 Speaker 3: social media such as Instagram and TikTok for guidance, resources, 219 00:12:43,480 --> 00:12:46,320 Speaker 3: and support for their mental health, to find conditions that 220 00:12:46,320 --> 00:12:50,559 Speaker 3: they think match their own, a trend that has alarmed parents, therapists, and 221 00:12:50,640 --> 00:12:55,000 Speaker 3: school counselors.
According to interviews with CNN, some teens start 222 00:12:55,040 --> 00:12:58,560 Speaker 3: to follow creators who discuss their own mental health conditions, symptoms, 223 00:12:58,559 --> 00:13:02,199 Speaker 3: and treatments. Others have come across posts with symptom checklists 224 00:13:02,240 --> 00:13:05,359 Speaker 3: to help decide if they meet the criteria for a diagnosis. 225 00:13:05,559 --> 00:13:10,359 Speaker 3: And it continues: However, many parents and experts expressed concerns 226 00:13:10,400 --> 00:13:15,520 Speaker 3: over how self diagnosing and mislabeling could exacerbate teens' behaviors, 227 00:13:15,840 --> 00:13:19,199 Speaker 3: make them feel isolated, and be counterproductive in giving them 228 00:13:19,280 --> 00:13:22,920 Speaker 3: the help they need. In a worst case scenario, teens 229 00:13:22,920 --> 00:13:25,800 Speaker 3: could set themselves on a path to receiving medication for a 230 00:13:25,920 --> 00:13:29,040 Speaker 3: condition they do not have. And once teens search for 231 00:13:29,080 --> 00:13:33,280 Speaker 3: this mental health content, the algorithms may keep surfacing similar 232 00:13:33,320 --> 00:13:37,839 Speaker 3: videos and posts.
Some parents and therapists have found that 233 00:13:37,880 --> 00:13:40,640 Speaker 3: once teens decide they have a condition, it can be 234 00:13:40,720 --> 00:13:45,080 Speaker 3: hard to convince them otherwise. Again, in the article 235 00:13:45,400 --> 00:13:48,400 Speaker 3: there are some theories as to why this is growing bigger 236 00:13:48,440 --> 00:13:52,000 Speaker 3: for teens. It says some experts believe teens may be 237 00:13:52,080 --> 00:13:56,200 Speaker 3: over identifying with a specific label or diagnosis, even if 238 00:13:56,280 --> 00:13:59,439 Speaker 3: it is not a fully accurate representation of their struggles, 239 00:13:59,760 --> 00:14:03,280 Speaker 3: because a diagnosis can be a shield or justification 240 00:14:03,480 --> 00:14:05,760 Speaker 3: of behavior in social situations. 241 00:14:06,720 --> 00:14:07,160 Speaker 1: Quote. 242 00:14:07,280 --> 00:14:09,800 Speaker 3: With the mounting pressure that young people face to be 243 00:14:09,880 --> 00:14:14,480 Speaker 3: socially competitive, those teens with more significant insecurities may feel 244 00:14:14,480 --> 00:14:17,320 Speaker 3: that they will never measure up, said a psychologist in 245 00:14:17,320 --> 00:14:20,240 Speaker 3: New York. She goes on: a teen may rely on a 246 00:14:20,320 --> 00:14:25,000 Speaker 3: diagnosis to lower others' expectations of their abilities.
Of course, 247 00:14:25,520 --> 00:14:27,840 Speaker 3: this might be a boomer take, and I say this 248 00:14:28,320 --> 00:14:30,640 Speaker 3: nicely to all the boomers, but that idea of like, 249 00:14:30,760 --> 00:14:33,960 Speaker 3: in my day, we never had these things, and 250 00:14:34,080 --> 00:14:39,160 Speaker 3: so dismissing people with those diagnoses. Because as a person, 251 00:14:39,280 --> 00:14:42,480 Speaker 3: just a random person being out here in the world, 252 00:14:42,560 --> 00:14:47,000 Speaker 3: as well as working in the social work world and 253 00:14:47,040 --> 00:14:50,600 Speaker 3: with mental health, everyone has something. Like, I think it 254 00:14:50,640 --> 00:14:54,240 Speaker 3: would be incorrect to say that there is normalcy. 255 00:14:54,360 --> 00:14:57,040 Speaker 3: There is a diagnosis for everyone, whether it is 256 00:14:57,160 --> 00:15:03,560 Speaker 3: by social pressure or whether it is by mental health, biology, environment, 257 00:15:04,040 --> 00:15:06,680 Speaker 3: any of those things. Come on, we know PTSD 258 00:15:08,320 --> 00:15:10,520 Speaker 3: is a big thing, and especially a lot of us 259 00:15:10,560 --> 00:15:13,880 Speaker 3: are feeling that pressure just from the pandemic, from the recession. 260 00:15:14,200 --> 00:15:17,280 Speaker 3: All of these things have a name. There's a level 261 00:15:17,280 --> 00:15:19,960 Speaker 3: of anxiety that everybody is going through. There's a level 262 00:15:20,000 --> 00:15:22,800 Speaker 3: of depression that most people have gone through, especially during 263 00:15:22,840 --> 00:15:26,200 Speaker 3: the holidays. Like, it's there. And for teenagers who are also 264 00:15:26,280 --> 00:15:29,360 Speaker 3: watching social media on a constant, or on social media 265 00:15:29,400 --> 00:15:33,160 Speaker 3: on a constant basis, like, that level is pretty high.
So 266 00:15:33,440 --> 00:15:39,040 Speaker 3: I think that it would be disrespectful to dismiss teens 267 00:15:39,040 --> 00:15:41,280 Speaker 3: in general, or anybody, 268 00:15:41,360 --> 00:15:43,960 Speaker 3: saying you've been looking at too much social media, because that's 269 00:15:44,000 --> 00:15:46,120 Speaker 3: part of the problem to begin with, you know, when 270 00:15:46,120 --> 00:15:49,760 Speaker 3: we go one step forward, three steps back, type of conversations. 271 00:15:49,800 --> 00:15:51,520 Speaker 3: Mental health could be a part of that as well. 272 00:15:51,920 --> 00:15:54,800 Speaker 3: And that too much diagnosis, we had that 273 00:15:54,840 --> 00:15:57,440 Speaker 3: when medications were finally coming out and people started talking 274 00:15:57,440 --> 00:16:00,800 Speaker 3: about needing medication, about being too medicated. And I think 275 00:16:00,840 --> 00:16:03,360 Speaker 3: that's a true story. I've seen that, where kids were 276 00:16:03,520 --> 00:16:05,880 Speaker 3: being diagnosed with the simplest things, or being forced to 277 00:16:05,920 --> 00:16:08,880 Speaker 3: be diagnosed with the simplest things, so they could sedate kids, 278 00:16:09,600 --> 00:16:12,520 Speaker 3: which I thought was horrible. Like, I saw kids on 279 00:16:13,360 --> 00:16:16,520 Speaker 3: ADHD medication because they wouldn't go to sleep. I was like, 280 00:16:16,760 --> 00:16:19,760 Speaker 3: what is happening? Why are we putting them on a 281 00:16:20,040 --> 00:16:24,720 Speaker 3: prescribed medication for that? So there's definitely this level that 282 00:16:24,760 --> 00:16:27,360 Speaker 3: I've seen. But when it comes to social media, there 283 00:16:27,440 --> 00:16:29,000 Speaker 3: is this back and forth. It's like, how much is 284 00:16:29,040 --> 00:16:31,960 Speaker 3: too much?
Is this too much information given to kids 285 00:16:32,000 --> 00:16:34,800 Speaker 3: who are smart enough to know the system and work 286 00:16:34,840 --> 00:16:36,880 Speaker 3: the system? And when I say work the system, I'm one 287 00:16:36,880 --> 00:16:40,320 Speaker 3: of those people who can manipulate their way into a diagnosis. I've 288 00:16:40,360 --> 00:16:43,120 Speaker 3: done it. But I also did 289 00:16:43,200 --> 00:16:45,000 Speaker 3: it because I knew that there was something wrong and 290 00:16:45,040 --> 00:16:46,960 Speaker 3: people were not listening to me, and I knew the 291 00:16:47,040 --> 00:16:50,360 Speaker 3: lengths that you had to go to to get those medications. 292 00:16:50,560 --> 00:16:53,480 Speaker 3: So there's this level of, and it's not everybody, but 293 00:16:53,520 --> 00:16:56,760 Speaker 3: this balance, because of the system 294 00:16:56,800 --> 00:16:59,760 Speaker 3: that's been created. You have social media coming in, like, here, 295 00:16:59,760 --> 00:17:02,640 Speaker 3: let me help you. And that has fed into 296 00:17:02,640 --> 00:17:07,879 Speaker 3: the TikTok algorithm, which is making TikTok money, lots and 297 00:17:07,960 --> 00:17:11,399 Speaker 3: lots of money. That's also a reminder that there are also 298 00:17:11,680 --> 00:17:17,640 Speaker 3: the disingenuous, bad players who have found 299 00:17:22,119? no. 299 00:17:18,520 --> 00:17:21,520 Speaker 3: that. It's been around since snake oil salesmen. 300 00:17:22,119 --> 00:17:24,439 Speaker 3: They have found a new platform, and it has worked. 301 00:17:24,560 --> 00:17:26,520 Speaker 3: So if they can build trust in saying, I know 302 00:17:26,560 --> 00:17:28,280 Speaker 3: this is what you're going through because I'm going through 303 00:17:28,320 --> 00:17:31,119 Speaker 3: this and this has helped me, they can sell stuff.
304 00:17:31,480 --> 00:17:33,280 Speaker 3: Don't get me wrong, I think TikTok has done a 305 00:17:33,280 --> 00:17:36,160 Speaker 3: great job, especially the followers and the individuals and people 306 00:17:36,200 --> 00:17:38,399 Speaker 3: on there, where they call that out when they find 307 00:17:38,440 --> 00:17:41,200 Speaker 3: it out. It's beautiful to see. I don't want anybody 308 00:17:41,240 --> 00:17:43,080 Speaker 3: to go down in flames. But at the same time, 309 00:17:43,080 --> 00:17:46,800 Speaker 3: if you're feeding misinformation to people to make a profit 310 00:17:47,440 --> 00:17:50,280 Speaker 3: and could be hurting those individuals, yeah, I hope you 311 00:17:50,320 --> 00:17:53,920 Speaker 3: go down in flames, you know. So all of these 312 00:17:53,920 --> 00:17:58,040 Speaker 3: things are happening. But with this type of rise of teenagers 313 00:17:58,160 --> 00:18:01,320 Speaker 3: diagnosing themselves, there 314 00:18:01,400 --> 00:18:03,200 Speaker 3: was also a trend, I didn't know about this because 315 00:18:03,200 --> 00:18:06,359 Speaker 3: I guess I'm not as cool as I thought, 316 00:18:07,440 --> 00:18:11,840 Speaker 3: called undiagnosed, where the kids who have been diagnosed 317 00:18:11,920 --> 00:18:15,560 Speaker 3: would make jokes about how they were cured now. And 318 00:18:15,600 --> 00:18:18,760 Speaker 3: they even included those who had eating disorders, like 319 00:18:18,760 --> 00:18:21,320 Speaker 3: one girl joked, I ate a meal, I'm cured, 320 00:18:21,520 --> 00:18:25,280 Speaker 3: type of conversation. And it is not because they're being disrespectful.
321 00:18:25,520 --> 00:18:28,000 Speaker 3: They're trying to make a joke, as we have often 322 00:18:28,080 --> 00:18:32,520 Speaker 3: done ourselves, about their own condition, trying to joke 323 00:18:32,600 --> 00:18:36,080 Speaker 3: through it, essentially. But it kind of has fed into 324 00:18:36,080 --> 00:18:40,120 Speaker 3: this, like, misinformation to other teens, where it can lead 325 00:18:40,160 --> 00:18:43,160 Speaker 3: to a bad, harmful pattern where they backtrack and say, 326 00:18:43,200 --> 00:18:46,280 Speaker 3: I can solve this myself. Now, that's not always the case. Again, 327 00:18:46,400 --> 00:18:47,800 Speaker 3: a lot of them are doing this in like a 328 00:18:47,840 --> 00:18:51,240 Speaker 3: tongue in cheek way, but it's coming off as, hmm, 329 00:18:51,760 --> 00:18:57,800 Speaker 3: this actually may be a bad trend for other teens. Again, 330 00:18:57,960 --> 00:19:04,080 Speaker 3: there are also those who actually think TikTok has trivialized their 331 00:19:04,119 --> 00:19:07,480 Speaker 3: actual diagnosis.
So there was a writer who has been 332 00:19:07,480 --> 00:19:11,440 Speaker 3: diagnosed with autism who talks about that specifically, of like, okay, 333 00:19:11,440 --> 00:19:16,000 Speaker 3: everybody's joking now that because they don't like loud sounds, 334 00:19:17,000 --> 00:19:19,960 Speaker 3: that's autism, you know, and that's the spectrum, and not 335 00:19:20,040 --> 00:19:22,840 Speaker 3: that it can't be once again, but to put it 336 00:19:22,920 --> 00:19:25,919 Speaker 3: lightly in making a joke is very ableist, but trying to 337 00:19:26,080 --> 00:19:28,080 Speaker 3: accept it at the same time, because you see on 338 00:19:28,160 --> 00:19:31,200 Speaker 3: TikTok kind of that balance of like, okay, but how 339 00:19:31,280 --> 00:19:36,080 Speaker 3: much have we put into this content that is now 340 00:19:36,119 --> 00:19:39,159 Speaker 3: trivializing those who have struggled all of their lives with 341 00:19:39,280 --> 00:19:42,280 Speaker 3: that diagnosis, with actually being diagnosed and may 342 00:19:42,400 --> 00:19:46,840 Speaker 3: be on the further end of the spectrum of autism. We're 343 00:19:46,880 --> 00:19:48,879 Speaker 3: not going to talk about Sia in any way, but 344 00:19:48,920 --> 00:19:50,679 Speaker 3: if you kind of look at what she did with 345 00:19:50,800 --> 00:19:53,359 Speaker 3: that and how she did both of those things, not 346 00:19:53,400 --> 00:19:57,720 Speaker 3: only did she almost make it like graphically over the top, 347 00:19:58,200 --> 00:20:01,080 Speaker 3: where it was all the caricatures of people who have 348 00:20:01,119 --> 00:20:04,239 Speaker 3: autism or who are on the spectrum, to trivializing it, 349 00:20:04,320 --> 00:20:06,760 Speaker 3: to make it excuses for others. It's kind of just 350 00:20:06,800 --> 00:20:12,639 Speaker 3: like, hmm. But is this partly because social media has 351 00:20:12,680 --> 00:20:16,040 Speaker 3: also led into that as well?
And yes I have 352 00:20:16,160 --> 00:20:21,719 Speaker 3: been on autism, uh, the spectrum TikTok and they you know, 353 00:20:22,040 --> 00:20:25,720 Speaker 3: and ADHD TikTok, and the overlapping is, I mean, like 354 00:20:25,800 --> 00:20:28,320 Speaker 3: over the top. And then this whole conversation about being 355 00:20:28,359 --> 00:20:33,880 Speaker 3: neurodivergent versus, uh, neurotypical, which again came to my conversation 356 00:20:33,920 --> 00:20:35,760 Speaker 3: of like, but there's no such thing as really anybody 357 00:20:35,760 --> 00:20:39,360 Speaker 3: being neurotypical if we really look down into what society 358 00:20:39,440 --> 00:20:42,680 Speaker 3: has kind of created in itself, whether it's the anxiety 359 00:20:42,800 --> 00:20:46,399 Speaker 3: or the push or the depression, like, and PTSD in 360 00:20:46,440 --> 00:20:49,440 Speaker 3: itself and the trauma in itself. Like, it's kind of 361 00:20:49,440 --> 00:20:53,159 Speaker 3: like, does that actually exist? So I don't know. So 362 00:20:53,200 --> 00:20:56,000 Speaker 3: that term, the over-blanketing term, is an interesting term 363 00:20:56,040 --> 00:20:58,800 Speaker 3: to me. Again, I know I'm not, I'm not saying 364 00:20:58,840 --> 00:21:01,840 Speaker 3: I'm a professional analyst, but just seeing what it 365 00:21:01,960 --> 00:21:06,119 Speaker 3: is, if we come to the realization that the norm is 366 00:21:06,200 --> 00:21:11,880 Speaker 3: actually being neurodivergent, it seems, but the spectrum does change, 367 00:21:11,960 --> 00:21:14,640 Speaker 3: and then the respect that we give those terms is 368 00:21:14,680 --> 00:21:17,840 Speaker 3: really important.
And yeah, a lot of the articles that 369 00:21:17,920 --> 00:21:21,640 Speaker 3: I've seen, that we've talked about, say that terminology 370 00:21:21,680 --> 00:21:26,120 Speaker 3: is really, really, really important, mainly because it does differentiate 371 00:21:26,160 --> 00:21:29,480 Speaker 3: a true diagnosis versus a false diagnosis. I guess 372 00:21:29,520 --> 00:21:31,400 Speaker 3: one of the things they talk about is high functioning. 373 00:21:31,640 --> 00:21:35,120 Speaker 3: That is not a clinical term, which is also why 374 00:21:35,200 --> 00:21:36,920 Speaker 3: that's kind of been banned. And we talk about the 375 00:21:36,920 --> 00:21:41,200 Speaker 3: spectrum in itself. Yeah, again, a big reminder, people make 376 00:21:41,400 --> 00:21:44,280 Speaker 3: money off this content. They can make tons of money 377 00:21:44,800 --> 00:21:47,160 Speaker 3: just by having a certain amount of views and accruing 378 00:21:47,200 --> 00:21:51,600 Speaker 3: a lot of followers. Again, there are some amazing content creators 379 00:21:51,640 --> 00:21:54,520 Speaker 3: who do good things. There are a lot of bad players 380 00:21:54,520 --> 00:21:58,000 Speaker 3: as well. So what are some things that we can 381 00:21:58,040 --> 00:22:01,000 Speaker 3: do to lessen this issue? I could just get off TikTok. 382 00:22:02,520 --> 00:22:04,280 Speaker 3: I don't know. I really like the dog videos. And 383 00:22:04,359 --> 00:22:08,200 Speaker 3: right now, following Neil the Seal, an elephant seal down in Australia. 384 00:22:08,680 --> 00:22:11,320 Speaker 3: Have you seen anything about Neil the Seal, you know? 385 00:22:11,680 --> 00:22:16,840 Speaker 2: And not on TikTok, but yes. Yeah, isn't it interesting 386 00:22:16,840 --> 00:22:18,520 Speaker 2: when you're like, how do you find things? 387 00:22:19,080 --> 00:22:23,160 Speaker 3: How do you find things? It's been fantastic. That's just one creator.
388 00:22:23,280 --> 00:22:25,199 Speaker 3: I think he goes by J.C., J-A-Y 389 00:22:25,280 --> 00:22:30,040 Speaker 3: C-E-E, and I think he works with like the government 390 00:22:30,480 --> 00:22:33,639 Speaker 3: to try to keep Neil safe. But he's just like, Neil! 391 00:22:34,720 --> 00:22:36,920 Speaker 3: Those people need to go to work, get off their lawn. 392 00:22:37,000 --> 00:22:42,000 Speaker 3: Like it's just fantastic content. Fantastic content. Okay, but some 393 00:22:42,119 --> 00:22:44,399 Speaker 3: other ways to look at it outside of just getting 394 00:22:44,400 --> 00:22:46,320 Speaker 3: off of TikTok. And this came from the University of 395 00:22:46,359 --> 00:22:50,399 Speaker 3: Colorado's medical school. They said, look at credentials. Who is 396 00:22:50,440 --> 00:22:53,760 Speaker 3: giving advice or posting? What's their background? And I know 397 00:22:53,880 --> 00:22:56,600 Speaker 3: people can put whatever they want, because there have been several 398 00:22:56,640 --> 00:22:59,399 Speaker 3: callouts for people who said they're doctors or in 399 00:22:59,400 --> 00:23:01,639 Speaker 3: the medical field that it turns out they're not. But 400 00:23:01,800 --> 00:23:05,600 Speaker 3: actually try to see if you can research deep enough, 401 00:23:05,600 --> 00:23:09,320 Speaker 3: like, usually those who are in the profession will 402 00:23:09,359 --> 00:23:12,399 Speaker 3: show you credentials because they realize how important that is. 403 00:23:12,720 --> 00:23:16,080 Speaker 3: So look for that. Look at the data being presented.
404 00:23:16,320 --> 00:23:18,359 Speaker 3: Are they throwing a lot of words together, again, like 405 00:23:18,440 --> 00:23:21,000 Speaker 3: high functioning and all these things? Like, you've got to 406 00:23:21,000 --> 00:23:23,960 Speaker 3: talk about where this is coming from. Look for resources, 407 00:23:24,040 --> 00:23:27,040 Speaker 3: ask them for resources, and then follow that up by realizing 408 00:23:27,119 --> 00:23:30,119 Speaker 3: there's no such thing as quick fixes. I saw content about 409 00:23:30,160 --> 00:23:33,119 Speaker 3: someone saying this is the over-the-counter Adderall. 410 00:23:34,320 --> 00:23:50,600 Speaker 3: Y'all, be careful, just, just be careful. And they even 411 00:23:50,600 --> 00:23:55,119 Speaker 3: talked about how to fix or recalculate your For You page. 412 00:23:55,200 --> 00:23:57,919 Speaker 3: So if you're following some content creators that are specialized 413 00:23:57,960 --> 00:24:00,320 Speaker 3: in that, if you unfollow them, sometimes that'll help. 414 00:24:00,800 --> 00:24:01,080 Speaker 1: For me. 415 00:24:01,240 --> 00:24:04,120 Speaker 3: I don't follow those content creators, but I still get it. 416 00:24:04,200 --> 00:24:06,560 Speaker 3: There is a way, and I think you just need 417 00:24:06,560 --> 00:24:08,359 Speaker 3: to Google, or actually you can look it up in 418 00:24:08,359 --> 00:24:13,399 Speaker 3: TikTok, to completely wipe your For You algorithm, and 419 00:24:13,480 --> 00:24:16,000 Speaker 3: it starts over again. So there is a way to 420 00:24:16,040 --> 00:24:19,040 Speaker 3: do that. But I've discovered, and so has my partner, 421 00:24:19,280 --> 00:24:25,320 Speaker 3: just from talking about something, that stuff will pop up. Yeah, 422 00:24:24,640 --> 00:24:27,399 Speaker 3: and like, it's not that we will look at it, 423 00:24:27,480 --> 00:24:29,800 Speaker 3: we don't look up anything, it's just that we're 424 00:24:29,800 --> 00:24:32,480 Speaker 3: talking about it.
So that tells you we know we're 425 00:24:32,560 --> 00:24:35,480 Speaker 3: being watched. I'm not even that, I'm not even that cool, 426 00:24:35,560 --> 00:24:38,199 Speaker 3: stop watching me. But there's ways to do that. You 427 00:24:38,200 --> 00:24:41,240 Speaker 3: can also go through and report and say you're not 428 00:24:41,359 --> 00:24:44,640 Speaker 3: interested, because I've done that oftentimes, so you can get 429 00:24:44,640 --> 00:24:47,000 Speaker 3: off of those types of content. But yeah, there's this 430 00:24:47,040 --> 00:24:50,080 Speaker 3: whole back and forth about how well this is going 431 00:24:50,119 --> 00:24:53,520 Speaker 3: for our society. I'm calling this Doctor TikTok because, again, 432 00:24:53,600 --> 00:24:58,159 Speaker 3: like WebMD, I start falling into that rhythm of like, 433 00:24:58,200 --> 00:25:00,280 Speaker 3: maybe this is what's wrong with me. But at the 434 00:25:00,280 --> 00:25:02,920 Speaker 3: same time it may not be inaccurate, and it does 435 00:25:03,040 --> 00:25:05,440 Speaker 3: validate how I feel. I still think I'm not epileptic. 436 00:25:05,760 --> 00:25:06,520 Speaker 3: Thanks, TikTok. 437 00:25:07,280 --> 00:25:09,680 Speaker 1: Also, listeners, some listeners have written in to you. 438 00:25:10,000 --> 00:25:16,359 Speaker 3: Yeah, I feel like all the listeners are like, these are 439 00:25:16,359 --> 00:25:18,439 Speaker 3: the diagnoses we'd give you, you may want to go 440 00:25:18,440 --> 00:25:21,639 Speaker 3: see a doctor. And I'm like, you're right, you're right, 441 00:25:22,160 --> 00:25:24,960 Speaker 3: you're right. And that's just the reminder that if these 442 00:25:25,000 --> 00:25:29,920 Speaker 3: things are in question, it's okay to ask. Don't act 443 00:25:29,960 --> 00:25:31,679 Speaker 3: on it, don't try to self-medicate, don't do all 444 00:25:31,680 --> 00:25:34,440 Speaker 3: these things. Don't make it your excuse, necessarily.
But try 445 00:25:34,440 --> 00:25:36,520 Speaker 3: to get a diagnosis, going to a doctor saying these 446 00:25:36,520 --> 00:25:38,840 Speaker 3: are the following things. Maybe not say I got this 447 00:25:38,880 --> 00:25:41,280 Speaker 3: off of TikTok. I have a feeling the eye roll 448 00:25:41,320 --> 00:25:44,960 Speaker 3: will be... they'll probably know anyway, they'll probably know anyway, 449 00:25:45,160 --> 00:25:46,920 Speaker 3: but in general, just like, these are the things I'm 450 00:25:46,960 --> 00:25:49,640 Speaker 3: concerned with, these are the things I didn't know were normal, 451 00:25:49,840 --> 00:25:52,040 Speaker 3: can you help me? Or were not normal, rather. And in 452 00:25:52,440 --> 00:25:54,960 Speaker 3: your physical health, like being dizzy and passing out. 453 00:25:55,560 --> 00:25:56,800 Speaker 1: Yeah, I mean, that's it. 454 00:25:57,119 --> 00:25:59,760 Speaker 2: I, again, like you kept saying throughout this, I feel 455 00:25:59,800 --> 00:26:04,400 Speaker 2: like that's the balance of it, because pre-TikTok, I've 456 00:26:04,440 --> 00:26:08,119 Speaker 2: had multiple people in my life come up to me 457 00:26:08,160 --> 00:26:10,040 Speaker 2: and say, like, the episode you did about women and 458 00:26:10,040 --> 00:26:13,240 Speaker 2: autism changed my life, because we don't talk about it, 459 00:26:13,920 --> 00:26:16,720 Speaker 2: because it has been historically ignored or misdiagnosed. 460 00:26:17,680 --> 00:26:18,600 Speaker 1: And so it's both. 461 00:26:18,680 --> 00:26:20,800 Speaker 2: It's like, it is good that we're talking about it 462 00:26:20,800 --> 00:26:23,119 Speaker 2: more and we're seeing more things, and people are like, oh, okay, 463 00:26:23,119 --> 00:26:25,399 Speaker 2: maybe it's that.
Maybe it's that. I know somebody, 464 00:26:25,400 --> 00:26:27,960 Speaker 2: I think it was you, told me, like, about the 465 00:26:28,520 --> 00:26:30,440 Speaker 2: back of my neck, I have a very 466 00:26:30,480 --> 00:26:35,960 Speaker 2: strange thing about that with shirts and hair, and it 467 00:26:36,080 --> 00:26:39,080 Speaker 2: is like, oh, okay, so that's not like a completely normal thing. 468 00:26:40,280 --> 00:26:43,040 Speaker 2: But it is also, yes, I mean, I 469 00:26:43,280 --> 00:26:47,520 Speaker 2: always joke WebMD is like a horoscope. I 470 00:26:47,520 --> 00:26:49,560 Speaker 2: could read it and be like, I probably have this, 471 00:26:50,280 --> 00:26:53,040 Speaker 2: and it could be nothing close to what I have. Like, 472 00:26:53,080 --> 00:26:55,800 Speaker 2: so it is like, it's good, it's bringing awareness, 473 00:26:55,800 --> 00:26:57,760 Speaker 2: but it is important to know where it's coming from 474 00:26:58,880 --> 00:27:04,560 Speaker 2: and to do further research before you self-medicate 475 00:27:04,640 --> 00:27:07,480 Speaker 2: or self-diagnose. So yeah, it's just both. 476 00:27:07,760 --> 00:27:07,960 Speaker 1: You know. 477 00:27:08,640 --> 00:27:10,880 Speaker 3: Well, when you said the horoscope thing, actually there 478 00:27:10,960 --> 00:27:13,840 Speaker 3: was one article that talked about the horoscope effect. 479 00:27:15,359 --> 00:27:18,840 Speaker 1: Really? 480 00:27:18,600 --> 00:27:22,439 Speaker 3: Yes. What it's called is people read symptoms online or hear someone discuss the experience and 481 00:27:22,440 --> 00:27:24,680 Speaker 3: feel connected to it so much so that they 482 00:27:24,720 --> 00:27:28,120 Speaker 3: now believe they may have the same diagnosis. The bold 483 00:27:28,160 --> 00:27:32,120 Speaker 3: proclamations of, like, predicting it, or something along those lines.
484 00:27:32,160 --> 00:27:35,639 Speaker 3: But yeah, apparently that is a whole theory among psychologists 485 00:27:36,200 --> 00:27:39,119 Speaker 3: when it comes to people predicting the disorder they 486 00:27:39,119 --> 00:27:39,880 Speaker 3: tie to TikTok. 487 00:27:40,320 --> 00:27:43,320 Speaker 2: I just feel like a lot of times we all 488 00:27:43,359 --> 00:27:46,560 Speaker 2: do it. But there are some symptoms that are just 489 00:27:46,640 --> 00:27:49,439 Speaker 2: common amongst a lot of things, right? And for me, 490 00:27:49,520 --> 00:27:52,879 Speaker 2: one of them is my heart beats really fast. Not 491 00:27:52,960 --> 00:27:56,679 Speaker 2: all the time, but sometimes it's fast. Yeah, and that 492 00:27:56,880 --> 00:28:00,960 Speaker 2: is just a symptom amongst all things. And it could 493 00:28:01,119 --> 00:28:04,760 Speaker 2: just be, and I, fingers crossed, suspect it is, that I'm anxious. 494 00:28:05,320 --> 00:28:05,439 Speaker 3: Right? 495 00:28:06,480 --> 00:28:09,480 Speaker 1: But if I go on to WebMD, it's like, Jesus. 496 00:28:09,280 --> 00:28:16,240 Speaker 3: This, and you would do good. Yeah, but we can't 497 00:28:16,320 --> 00:28:20,320 Speaker 3: negate that, because more women die than men every 498 00:28:20,400 --> 00:28:24,080 Speaker 3: year of a heart attack or heart-related, uh, fatalities. 499 00:28:24,280 --> 00:28:27,560 Speaker 3: So part of this is like being told we're being paranoid, 500 00:28:27,640 --> 00:28:29,639 Speaker 3: but at the same time it's actually true, because no 501 00:28:29,680 --> 00:28:32,359 Speaker 3: one listens. And which part do we listen to? 502 00:28:32,440 --> 00:28:34,400 Speaker 3: And it should be better safe than sorry. But we 503 00:28:34,480 --> 00:28:38,080 Speaker 3: as women have been so shamed, yeah, about trying to be healthy, 504 00:28:38,680 --> 00:28:40,600 Speaker 3: that we, we are afraid to go to doctors half 505 00:28:40,640 --> 00:28:40,920 Speaker 3: the time.
506 00:28:41,160 --> 00:28:44,280 Speaker 2: Yes, and it costs money and it takes time. So 507 00:28:45,400 --> 00:28:48,840 Speaker 2: it's like really unfortunate. Some people's healthcare is really bad. Yes, 508 00:28:49,680 --> 00:28:53,240 Speaker 2: yes it is. But yeah, so I think it's like 509 00:28:53,280 --> 00:28:57,800 Speaker 2: more complicated than saying like one side is bad. I 510 00:28:57,840 --> 00:29:04,760 Speaker 2: don't know, right? It's just, it's, it's more nuanced than that, right? Yes. Well, 511 00:29:05,440 --> 00:29:08,160 Speaker 2: thank you for bringing this to us, Samantha. I believe we're going 512 00:29:08,240 --> 00:29:12,840 Speaker 2: to talk in an episode coming up about why you were 513 00:29:12,920 --> 00:29:16,560 Speaker 2: thinking about this, but our calendars are in flux, so 514 00:29:16,600 --> 00:29:17,000 Speaker 2: we'll see. 515 00:29:17,080 --> 00:29:18,680 Speaker 3: I think this is going to be ahead of that, 516 00:29:19,360 --> 00:29:20,040 Speaker 3: but yes, we'll. 517 00:29:19,880 --> 00:29:22,120 Speaker 1: Talk about why, why you were thinking about this. 518 00:29:22,240 --> 00:29:26,520 Speaker 2: Yes. All right, well, in the meantime, you can contact 519 00:29:26,560 --> 00:29:29,520 Speaker 2: us, listeners, if you have any thoughts about this. Our 520 00:29:29,560 --> 00:29:32,480 Speaker 2: email is Stephania mom Stuff at iHeartMedia dot com. You 521 00:29:32,480 --> 00:29:34,440 Speaker 2: can find us on Twitter at mom Stuff Podcast, or 522 00:29:34,480 --> 00:29:36,680 Speaker 2: on Instagram and TikTok at stuff mom never told you. 523 00:29:37,200 --> 00:29:39,960 Speaker 2: We have a TeePublic store, and we do have a book. Yes, 524 00:29:40,840 --> 00:29:45,040 Speaker 2: and thanks as always to our super producer Christina, our 525 00:29:45,080 --> 00:29:48,640 Speaker 2: executive producer Maya, and our contributor Joey.
Thank you, and 526 00:29:48,680 --> 00:29:50,920 Speaker 2: thanks to you for listening. Stuff Mom Never Told You is a 527 00:29:50,960 --> 00:29:53,360 Speaker 2: production of iHeartRadio. For more podcasts from iHeartRadio, 528 00:29:53,400 --> 00:29:55,240 Speaker 2: you can check out the iHeartRadio app, Apple Podcasts, 529 00:29:55,320 --> 00:29:57,200 Speaker 2: or wherever you listen to your favorite shows.