Speaker 1: Hey, this is Anney and Samantha, and welcome to Stuff Mom Never Told You, a production of iHeartRadio. Welcome to another Monday Mini.

Speaker 1: Samantha's already laughing at me because I'm demonstrating what I'm talking about right now, which is crying about media. If you need a hint, we just did our Last of Us reaction to episode two. So here I am. I have talked about this before, but I thought I would look into it, just to see, you know, what is going on here? Is it good? Is it bad? All of those things.

Speaker 1: So one of the main reasons that people cry at media is that you have a relationship with a fictional character. This is a parasocial relationship, or a one-directional relationship, because you know they're fictional; they don't know anything about you. But the thing is, people who have looked into this say that, actually, the brain can't always differentiate. You might still feel it as if it were a real relationship, and in some cases it can be. You know, I've talked before about how I've gone through these traumatic things and here was this piece of media that helped me get through it. So you can really form a connection. I also think, for me personally, and probably for a lot of people, you might assign a fictional character to someone, like, oh, that character reminds me of this person I know in real life, so you can form a connection in that way. We are going to get into it later; I think a lot of us know this can have unhealthy side effects, but it can have a lot of really healthy ones, including boosting your self-esteem, not feeling as lonely, or feeling a sense of belonging, especially if you're in a small town and there aren't people you necessarily connect to for one reason or another. These relationships can be really beneficial in that way, and then they become really important to you, so if you lose them, it can be really painful.
Speaker 1: And to that end, there have been studies about what happens when that relationship comes to an end for one reason or another, because it doesn't have to be a death. It could be that they take the character in a direction you really don't like or don't connect to anymore, but it is usually a death, and then you feel this really strong emotional response. And you can have these relationships with a character for a long time, depending on the media or depending on how you engaged with it, and that can be outside of, you know, even things like fan fiction; you can have a really strong relationship with a character. And so that's why, when they die or when they change in a way that is just irreconcilable to you, it can be really painful.

Speaker 1: And you might turn to something like fan fiction and write that ending differently, but everyone else knows them as dead, and you do too. Like, just because I wrote that fan fiction doesn't mean they're still alive. Eventually people are going to forget them, and no one's going to write fan fiction about them anymore. This is a real concern that I have.

Speaker 1: This also involves something called the paradox of tragedy, which I thought was really interesting. It's basically this: we don't like tragedy in real life, and a lot of us don't like crying, but some of us really do enjoy the vulnerability and shared emotion of a fictional tragedy. A lot of people connect this with the ancient Greek idea of catharsis, of getting together to witness something tragic, which maybe releases something you've had pent up. And as we've discussed before, some research does suggest that we feel better after crying. Some theorize that something called meta-emotions is involved: basically, we're glad that we can feel empathy or something like that toward other people, even if they're fictional.
Speaker 1: Other studies suggest that empathizing with and reading fictional characters can increase emotional intelligence, so reading them emotionally can help you read people in real life emotionally. Empathy is one of the five key parts of emotional intelligence, along with self-awareness, self-regulation, motivation, and social skills. High emotional intelligence is associated with a lot of positive things, including better leadership, better relationships, better conflict resolution, things like that. A good story also triggers oxytocin, which I know a lot of us think of as, you know, the happiness hormone, but if a good story is triggering it, that can be going on in the case of a sad story too, and it then enhances your sad emotions, especially around compassion and empathy.

Speaker 1: On top of that, in some ways it can feel like a safer release, even in dark times, so that's something I really relate to. It can be way easier for me to cry for a fictional character than for someone I actually know. And it doesn't mean I didn't care for that person I actually know; I'm probably crying for them through the fictional character. It's just, I think, a you're-trying-to-protect-yourself type thing. Since crying is stereotypically seen as a more feminine thing, we have often painted it as a weakness, but it can be a sign of strength. Obviously there are some exceptions, but in general, whether it's a good movie or TV show or game, it's good to let it out.

Speaker 1: I mean, as I mentioned, there are downsides: if people care more about fictional characters than real people, even people they don't know, or neglect their life because of the media, those are bad things, as is when you can't shake the sadness of it and it starts to impact your life. This kind of made me crack up, because they're like, if you're still dealing with it over a week later... and I was like, oh, we're talking years. But it's not as bad as it was.
Speaker 1: I also found this, and this is not what we're talking about today, but I did find it related to this conversation. There's this newish thing of people posting videos of themselves crying on social media, and there are a lot of theories about why that is: wanting connection, wanting pain to be witnessed, wanting to feel validated, being performative, wanting more clicks. There's a lot of conversation about it. But that can happen, and is happening, in a lot of instances when a fictional character dies or something painful happens in something popular.

Speaker 1: And this is where I did look it up. I've mentioned this before, but I really want a fandom therapist. I realized this is a thing, but it refers to a therapist using fandom to conduct therapy, which is not what I'm talking about. I need someone I can be like, do you remember in Star Wars, in this episode, where this happened and people reacted this way? I need to get into it. But there is a directory I found, which I cannot vouch for, that's supposed to be for if you want somebody who knows a lot about a particular fandom. So I don't know, maybe I should start one. I should be the change I want to see. A lot of stuff has been written about the power of fandom for therapy, so there has been science and research into this. It is a thing. So if you have experienced this, you are not alone. I'm going through it right now. I'm sorry my voice is a little, a little snotty.

Speaker 1: Well, listeners, let me know if you have any thoughts about this. You can email us at hello@stuffmomnevertoldyou.com. You can find us on Bluesky at @momstuffpodcast, or on Instagram and TikTok at @stuffmomnevertoldyou. We're also on YouTube. We have a tee public store, and we have a book you can get wherever you get your books.
Speaker 1: Thanks as always to our super producer Christina, our executive producer Maya, and our contributor Joey. Thank you, and thanks to you for listening. Stuff Mom Never Told You is a production of iHeartRadio. For more podcasts from iHeartRadio, you can check out the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.