1 00:00:15,436 --> 00:00:24,196 Speaker 1: Pushkin. As a super busy professor, it's rare that I 2 00:00:24,236 --> 00:00:26,636 Speaker 1: have time to truly get addicted to a new TV 3 00:00:26,716 --> 00:00:30,716 Speaker 1: show or podcast. To be really bingeworthy for me, a 4 00:00:30,796 --> 00:00:34,556 Speaker 1: new show has to be exciting, surprising, so captivating that 5 00:00:34,596 --> 00:00:37,876 Speaker 1: I simply can't stay away. And since I really like 6 00:00:38,036 --> 00:00:41,236 Speaker 1: learning new stuff, it usually has to teach me something important. 7 00:00:41,836 --> 00:00:43,956 Speaker 1: And that's the reason I decided to share this special 8 00:00:43,956 --> 00:00:47,836 Speaker 1: bonus episode with you today, because I recently have become 9 00:00:47,916 --> 00:00:51,156 Speaker 1: totally addicted to just such a podcast. It's a new 10 00:00:51,196 --> 00:00:55,716 Speaker 1: show called Bad Women, hosted by the amazing historian Hallie Rubenhold. 11 00:00:57,036 --> 00:00:59,756 Speaker 1: Bad Women is a show devoted to understanding more about 12 00:00:59,756 --> 00:01:01,716 Speaker 1: the lives of the women who were murdered by Jack 13 00:01:01,796 --> 00:01:05,716 Speaker 1: the Ripper. Now from that description, you might be thinking, Oh, 14 00:01:05,716 --> 00:01:07,916 Speaker 1: this is some new true crime thing that Laurie's gotten 15 00:01:07,916 --> 00:01:11,156 Speaker 1: obsessed with, but that's not actually what the show is about. 16 00:01:11,756 --> 00:01:14,836 Speaker 1: What Bad Women really tackles is the mystery behind why 17 00:01:14,876 --> 00:01:17,316 Speaker 1: the Ripper's victims wound up in his path in the 18 00:01:17,396 --> 00:01:20,516 Speaker 1: first place.
What are the sociological factors that put them 19 00:01:20,516 --> 00:01:23,156 Speaker 1: in harm's way, and what is it about our minds 20 00:01:23,196 --> 00:01:26,156 Speaker 1: that prevents us from empathizing with these women and countless 21 00:01:26,156 --> 00:01:29,996 Speaker 1: others who face poverty and injustice in the modern world today? 22 00:01:30,116 --> 00:01:32,516 Speaker 1: I want to talk to Hallie about these mysteries and 23 00:01:32,596 --> 00:01:35,396 Speaker 1: what they mean for our own psychology and even for 24 00:01:35,436 --> 00:01:38,516 Speaker 1: our own happiness. I also wanted Hallie to tell me 25 00:01:38,636 --> 00:01:42,156 Speaker 1: the strategies she uses to face these awful injustices head on. 26 00:01:42,996 --> 00:01:45,556 Speaker 1: I wanted to understand how someone could study these dark 27 00:01:45,596 --> 00:01:49,676 Speaker 1: topics and stay optimistic, because that in particular seems like 28 00:01:49,676 --> 00:01:53,996 Speaker 1: a strategy that we all need, especially nowadays. But before 29 00:01:53,996 --> 00:01:55,676 Speaker 1: we jump in, I also wanted you to get a 30 00:01:55,716 --> 00:01:58,596 Speaker 1: sense of the show and why it is so captivating 31 00:01:58,636 --> 00:02:01,356 Speaker 1: and catchy, and so here's a quick listen to the 32 00:02:01,396 --> 00:02:14,996 Speaker 1: Bad Women trailer. It's a bad habit we have. We 33 00:02:15,156 --> 00:02:18,996 Speaker 1: tell the tale of the murderer and not the murdered. 34 00:02:21,236 --> 00:02:24,836 Speaker 1: The clock on Whitechapel Church was striking half past two 35 00:02:25,156 --> 00:02:29,596 Speaker 1: when Ellen Holland watched her friend Polly Nichols sway off 36 00:02:29,636 --> 00:02:35,996 Speaker 1: into the darkness. Polly was drunk, penniless and broken.
She 37 00:02:36,116 --> 00:02:39,036 Speaker 1: was inconsequential in the minds of most people she met, 38 00:02:39,716 --> 00:02:42,676 Speaker 1: but she was about to cross paths with someone who 39 00:02:42,756 --> 00:02:50,196 Speaker 1: would give her a grisly, unenviable place in history. In 40 00:02:50,236 --> 00:02:53,276 Speaker 1: the autumn of eighteen eighty eight, Polly and four other 41 00:02:53,316 --> 00:02:56,596 Speaker 1: women were brutally killed in a slum neighborhood of London. 42 00:02:57,476 --> 00:03:01,636 Speaker 1: Their unsolved murders were so violent, so cruel, that their 43 00:03:01,756 --> 00:03:05,556 Speaker 1: killer earned a nickname that is still known the world over. 44 00:03:07,156 --> 00:03:12,076 Speaker 1: I am down on whores and I shan't quit ripping them. Jack the Ripper. 45 00:03:17,916 --> 00:03:20,916 Speaker 1: But in the greatest cold case in history, few of 46 00:03:20,996 --> 00:03:24,836 Speaker 1: us have stopped to question the basic facts. One fact 47 00:03:25,036 --> 00:03:27,036 Speaker 1: that you know about Jack the Ripper: that he never 48 00:03:27,076 --> 00:03:36,436 Speaker 1: got caught. And who did he kill? Prostitutes. I'm Hallie 49 00:03:36,476 --> 00:03:40,036 Speaker 1: Rubenhold. As a historian interested in the stories of women, 50 00:03:40,436 --> 00:03:43,036 Speaker 1: I'd assumed the lives of the five victims had been 51 00:03:43,116 --> 00:03:47,836 Speaker 1: thoroughly researched long ago. I was wrong. When I dug 52 00:03:47,836 --> 00:03:51,716 Speaker 1: into the records, I began to reveal rich and interesting lives. 53 00:03:52,116 --> 00:03:55,196 Speaker 1: Most of the time women will leave if they're being beaten 54 00:03:55,196 --> 00:03:58,196 Speaker 1: to a pulp or he's put out an eye. Lives 55 00:03:58,196 --> 00:04:02,196 Speaker 1: blighted by problems and prejudices most women today would recognize.
56 00:04:02,476 --> 00:04:06,516 Speaker 1: There wasn't a great deal of sympathy for alcoholics, so 57 00:04:06,676 --> 00:04:10,116 Speaker 1: one had to sign a temperance pledge saying I will 58 00:04:10,236 --> 00:04:13,436 Speaker 1: not drink. If only it were so simple. I identified 59 00:04:13,436 --> 00:04:17,276 Speaker 1: with the women, sympathized with the tough life choices they made, 60 00:04:17,596 --> 00:04:20,436 Speaker 1: and admired their determination. She gets on the boat and 61 00:04:20,436 --> 00:04:24,076 Speaker 1: comes to England. She seeks wider horizons. I came to 62 00:04:24,156 --> 00:04:27,636 Speaker 1: know them and like them. She wants more than she's 63 00:04:27,796 --> 00:04:32,516 Speaker 1: been born into. But I also discovered something else, something new, 64 00:04:33,076 --> 00:04:35,756 Speaker 1: something troubling. It was just so obvious to me the 65 00:04:35,876 --> 00:04:38,196 Speaker 1: very first time I looked at this file. How could 66 00:04:38,196 --> 00:04:40,516 Speaker 1: we have gotten this wrong for this many years? 67 00:04:40,676 --> 00:04:43,156 Speaker 1: And something that chips away at the foundations of the 68 00:04:43,236 --> 00:04:46,396 Speaker 1: Ripper myth. Now, I'm sure all of you here today 69 00:04:46,436 --> 00:04:49,516 Speaker 1: know that Jack's victims were ladies of the night, 70 00:04:49,676 --> 00:04:52,476 Speaker 1: weren't they? They were forced to choose. The myth is still 71 00:04:52,556 --> 00:04:56,036 Speaker 1: served up by tour guides to visitors who flock to the 72 00:04:56,116 --> 00:05:00,916 Speaker 1: murder scenes each evening. And now, ladies, you might be thinking, well, 73 00:05:01,596 --> 00:05:06,956 Speaker 1: I'd never think that, would I? Well, sadly, 74 00:05:07,116 --> 00:05:11,516 Speaker 1: the answer is yes.
Polly, Annie, Elizabeth, Catherine, and Mary 75 00:05:11,596 --> 00:05:15,516 Speaker 1: Jane were not killed while selling sex. It was other, 76 00:05:15,756 --> 00:05:18,316 Speaker 1: no less troubling factors that put them in the path 77 00:05:18,396 --> 00:05:21,836 Speaker 1: of their murderer. In this podcast, I'll tell you how 78 00:05:21,876 --> 00:05:24,876 Speaker 1: I know that and why it still matters very much 79 00:05:25,116 --> 00:05:29,716 Speaker 1: even today. Hallie has upset the world of Ripperology by 80 00:05:29,796 --> 00:05:34,876 Speaker 1: her general attitude towards Ripperologists. I'll also explain why 81 00:05:34,916 --> 00:05:38,156 Speaker 1: my research has enraged so many people who claim to 82 00:05:38,196 --> 00:05:41,636 Speaker 1: be experts in the Ripper case. The attacks have just 83 00:05:41,716 --> 00:05:45,476 Speaker 1: been relentless and malicious. But actually I think they just 84 00:05:45,636 --> 00:05:48,436 Speaker 1: don't really want other people talking about the murdered women 85 00:05:48,676 --> 00:05:54,676 Speaker 1: and challenging their views in any way. If you want 86 00:05:54,716 --> 00:05:56,956 Speaker 1: to know how we got the Ripper story so wrong, 87 00:05:57,356 --> 00:06:00,756 Speaker 1: what those mistakes tell us about ourselves, and why putting 88 00:06:00,756 --> 00:06:04,916 Speaker 1: the record straight makes some people so very angry, join me, 89 00:06:05,316 --> 00:06:09,716 Speaker 1: Hallie Rubenhold, for Bad Women: The Ripper Retold, starting 90 00:06:09,756 --> 00:06:33,076 Speaker 1: October fifth, wherever you get your podcasts. So, Hallie, thank 91 00:06:33,116 --> 00:06:35,516 Speaker 1: you so much for being on the show. Thank you 92 00:06:35,556 --> 00:06:37,956 Speaker 1: for having me.
I mean, when I first heard that 93 00:06:37,996 --> 00:06:40,756 Speaker 1: Pushkin was doing a podcast about the women who were 94 00:06:40,836 --> 00:06:43,316 Speaker 1: murdered by Jack the Ripper, I kind of had this 95 00:06:43,356 --> 00:06:45,956 Speaker 1: assumption it was like the standard true crime podcast, and 96 00:06:45,996 --> 00:06:48,116 Speaker 1: then when I listened to it, I realized that that's 97 00:06:48,116 --> 00:06:50,436 Speaker 1: not really what the show is actually about. And so, 98 00:06:50,636 --> 00:06:53,676 Speaker 1: you know, talk about this misconception. Well, yes, I mean 99 00:06:53,716 --> 00:06:56,636 Speaker 1: it is about a lot more than that. It's largely 100 00:06:56,716 --> 00:07:00,996 Speaker 1: about the lives of the five victims of Jack the Ripper, 101 00:07:01,116 --> 00:07:04,716 Speaker 1: which have been completely overlooked, and we look at a 102 00:07:04,716 --> 00:07:08,276 Speaker 1: lot of issues that come out of their lives. So 103 00:07:08,796 --> 00:07:13,396 Speaker 1: issues about addiction, for example, issues about homelessness, issues about 104 00:07:13,556 --> 00:07:16,756 Speaker 1: violence against women, all of these things, because they're so 105 00:07:16,876 --> 00:07:21,396 Speaker 1: integrated into the story and the way we tell the 106 00:07:21,396 --> 00:07:24,196 Speaker 1: traditional story of Jack the Ripper. And one of the 107 00:07:24,196 --> 00:07:26,116 Speaker 1: things that I found so striking about this is that 108 00:07:26,156 --> 00:07:28,876 Speaker 1: all of these injustices you've just mentioned, you know, violence 109 00:07:28,916 --> 00:07:32,556 Speaker 1: against women, you know, poverty, addiction, these are things that 110 00:07:32,596 --> 00:07:35,636 Speaker 1: we're facing today and they're the kinds of things that 111 00:07:35,676 --> 00:07:38,636 Speaker 1: we just haven't really dealt with.
And so, without giving 112 00:07:38,676 --> 00:07:40,756 Speaker 1: away too many spoilers, give me a sense of how 113 00:07:40,796 --> 00:07:43,316 Speaker 1: some of these things played out in Victorian times and 114 00:07:43,316 --> 00:07:46,476 Speaker 1: with some of the women who were the Ripper's victims. Well, 115 00:07:46,516 --> 00:07:49,796 Speaker 1: I mean, people very much in the past were trapped 116 00:07:49,796 --> 00:07:53,036 Speaker 1: in a poverty cycle similar to poverty cycles that people 117 00:07:53,076 --> 00:07:56,236 Speaker 1: in the present are trapped in. So that if you 118 00:07:56,276 --> 00:08:00,316 Speaker 1: were born into a family that doesn't have a lot 119 00:08:00,356 --> 00:08:03,316 Speaker 1: of money, doesn't have a lot of resources, your opportunities 120 00:08:03,316 --> 00:08:05,396 Speaker 1: in life are going to be very limited. And this 121 00:08:05,516 --> 00:08:08,476 Speaker 1: was even more so the case for working class women 122 00:08:08,796 --> 00:08:12,676 Speaker 1: in the nineteenth century. Poor women didn't have access to 123 00:08:12,796 --> 00:08:16,076 Speaker 1: education in the way that people of wealth had it, 124 00:08:16,516 --> 00:08:21,276 Speaker 1: or men. Women couldn't actually make careers for themselves in 125 00:08:21,316 --> 00:08:25,556 Speaker 1: the professions because the professions weren't open to women, because 126 00:08:25,636 --> 00:08:29,556 Speaker 1: it was believed that a woman's only role in society 127 00:08:29,956 --> 00:08:33,676 Speaker 1: was to be a mother, a wife, a carer. She 128 00:08:33,756 --> 00:08:36,756 Speaker 1: wasn't a breadwinner, she couldn't have a life of her own. 129 00:08:36,956 --> 00:08:40,716 Speaker 1: And because of that, if something went terribly wrong in 130 00:08:40,756 --> 00:08:42,756 Speaker 1: a woman's life and she didn't have a man to 131 00:08:42,836 --> 00:08:46,956 Speaker 1: support her, she had no way of earning an income.
132 00:08:47,196 --> 00:08:51,396 Speaker 1: She fell into this poverty trap and had to rely 133 00:08:51,596 --> 00:08:56,036 Speaker 1: on going to the workhouse, begging, taking very menial jobs 134 00:08:56,076 --> 00:08:59,476 Speaker 1: that weren't designed to pay her very much. And so 135 00:08:59,636 --> 00:09:01,836 Speaker 1: things were much more difficult if you were a woman. 136 00:09:02,476 --> 00:09:04,916 Speaker 1: And so, I mean these injustices you talk about, I mean, 137 00:09:04,956 --> 00:09:07,196 Speaker 1: the workhouse just sounds so scary and awful. You 138 00:09:07,276 --> 00:09:10,196 Speaker 1: do such a good job of making it sound so terrifying 139 00:09:10,236 --> 00:09:12,996 Speaker 1: in your podcast. But of all of these injustices, the 140 00:09:13,036 --> 00:09:15,436 Speaker 1: thing that I found most striking was the fact that 141 00:09:15,516 --> 00:09:17,756 Speaker 1: all of these women faced this kind of final injustice, 142 00:09:17,916 --> 00:09:20,236 Speaker 1: which is they died at the hand of some horrible 143 00:09:20,356 --> 00:09:22,996 Speaker 1: serial killer, and everyone thinks it was like their fault, 144 00:09:23,276 --> 00:09:25,116 Speaker 1: you know, Oh they were just prostitutes, or oh they 145 00:09:25,116 --> 00:09:27,716 Speaker 1: were addicts and so on. So talk about this kind 146 00:09:27,756 --> 00:09:32,236 Speaker 1: of final injustice, right. Yeah. In the Victorian era, the 147 00:09:32,276 --> 00:09:35,996 Speaker 1: belief that somehow these women deserved what they got was 148 00:09:36,076 --> 00:09:39,156 Speaker 1: part of a narrative which was designed to keep people 149 00:09:39,196 --> 00:09:42,796 Speaker 1: in their place. So it was believed that bad women 150 00:09:43,356 --> 00:09:48,316 Speaker 1: got what they deserved.
So here were women who contravened 151 00:09:48,556 --> 00:09:50,716 Speaker 1: the norm for what a woman was supposed to do 152 00:09:50,796 --> 00:09:54,476 Speaker 1: in her life. These were women who had no families, 153 00:09:54,596 --> 00:09:57,116 Speaker 1: or they'd left their families behind. They didn't have a 154 00:09:57,196 --> 00:09:59,796 Speaker 1: proper roof over their heads, they didn't have a husband 155 00:09:59,836 --> 00:10:02,436 Speaker 1: to look after them or a father, and often they 156 00:10:02,436 --> 00:10:06,396 Speaker 1: were reliant upon drink. They were failures as women. And 157 00:10:06,476 --> 00:10:09,796 Speaker 1: because they were failures as women, it was believed, well, 158 00:10:09,916 --> 00:10:11,916 Speaker 1: this is what happens when you're not a good woman. 159 00:10:12,356 --> 00:10:15,756 Speaker 1: You get killed in a terrible, terrible way. And so 160 00:10:16,036 --> 00:10:19,396 Speaker 1: basically this narrative propped up a lot of belief systems 161 00:10:19,436 --> 00:10:23,036 Speaker 1: in the Victorian era at that time. The problem is 162 00:10:23,356 --> 00:10:26,996 Speaker 1: that we have never really questioned this, and I think 163 00:10:27,116 --> 00:10:30,836 Speaker 1: in a lot of mythologizing about the past, we just 164 00:10:30,876 --> 00:10:35,636 Speaker 1: don't question these sets of ideas that we are really bequeathed. 165 00:10:35,836 --> 00:10:38,156 Speaker 1: You know, we just repeat them, we parrot them, and 166 00:10:38,236 --> 00:10:42,236 Speaker 1: in parroting them, we kind of acquiesce in a lot of 167 00:10:42,236 --> 00:10:46,876 Speaker 1: the morality and the belief systems that pre existed, which 168 00:10:46,956 --> 00:10:49,476 Speaker 1: are of no use to us anymore.
And you know, 169 00:10:49,596 --> 00:10:53,116 Speaker 1: I would be lying if I said that we live 170 00:10:53,276 --> 00:10:57,756 Speaker 1: in a society where it's not believed that bad women 171 00:10:58,076 --> 00:11:01,036 Speaker 1: are the ones who get punished. Every time a woman 172 00:11:01,196 --> 00:11:05,876 Speaker 1: goes out at night and is murdered, the first question 173 00:11:05,956 --> 00:11:08,756 Speaker 1: that is asked is, well, why was she out, what 174 00:11:08,876 --> 00:11:12,996 Speaker 1: was she wearing? What did she do to incur this? And 175 00:11:13,076 --> 00:11:15,836 Speaker 1: that shouldn't be the question we're asking. The funny thing, 176 00:11:15,916 --> 00:11:17,996 Speaker 1: or the interesting thing for me as a psychologist, is 177 00:11:17,996 --> 00:11:19,916 Speaker 1: that the fact that we ask this question might be 178 00:11:20,236 --> 00:11:23,116 Speaker 1: in part because of some of these biases of the mind. Right, Like, 179 00:11:23,356 --> 00:11:25,436 Speaker 1: it's true that this kind of fit with the Victorian 180 00:11:25,516 --> 00:11:27,836 Speaker 1: narrative at the time of kind of keeping women in line, 181 00:11:27,876 --> 00:11:30,396 Speaker 1: but this might also just be a deep and pretty 182 00:11:30,436 --> 00:11:32,556 Speaker 1: awful part of our human nature, like it tends to 183 00:11:32,556 --> 00:11:35,116 Speaker 1: fit with a couple of the psychological biases that we 184 00:11:35,156 --> 00:11:37,276 Speaker 1: talk about in cognitive science. One of them, which is 185 00:11:37,276 --> 00:11:40,916 Speaker 1: so interesting, is this bias known as the fundamental attribution error.
186 00:11:41,236 --> 00:11:43,436 Speaker 1: And the fundamental attribution error is that even though when 187 00:11:43,436 --> 00:11:46,156 Speaker 1: I think about my behavior, I usually try to explain 188 00:11:46,196 --> 00:11:49,356 Speaker 1: it using specific situations like oh, I drank too much 189 00:11:49,356 --> 00:11:50,956 Speaker 1: because I was having a bad day, or you know, 190 00:11:50,956 --> 00:11:52,756 Speaker 1: I forgot my keys in the car because you know, 191 00:11:52,796 --> 00:11:55,036 Speaker 1: I was like really stressed or something. When we look 192 00:11:55,036 --> 00:11:58,156 Speaker 1: at other people's behavior, we assume that their behavior is 193 00:11:58,196 --> 00:12:00,756 Speaker 1: the result of their personality traits or their moral traits, 194 00:12:00,796 --> 00:12:03,556 Speaker 1: not situations. Right, So if you had a friend, you know, 195 00:12:03,796 --> 00:12:05,476 Speaker 1: a co worker who comes into work and says, oh, 196 00:12:05,476 --> 00:12:07,356 Speaker 1: my gosh, you know, my car just got broken into. 197 00:12:07,396 --> 00:12:09,996 Speaker 1: I'd left my purse in it. You might say like, oh, she's 198 00:12:09,996 --> 00:12:12,076 Speaker 1: so absent minded, why would she leave her purse in 199 00:12:12,076 --> 00:12:14,276 Speaker 1: the car, or like, oh, that just kind of fits 200 00:12:14,316 --> 00:12:16,236 Speaker 1: with her, like she just doesn't treat her stuff well 201 00:12:16,316 --> 00:12:18,436 Speaker 1: or something like that. You wouldn't think, Oh, maybe her 202 00:12:18,476 --> 00:12:20,596 Speaker 1: kid was sick, or maybe she was really stressed, or 203 00:12:20,636 --> 00:12:22,956 Speaker 1: maybe she had a bad day. Right? When we attribute 204 00:12:22,956 --> 00:12:24,836 Speaker 1: other people's behaviors to things, we don't tend to think 205 00:12:24,836 --> 00:12:27,396 Speaker 1: in terms of the circumstances.
We tend to think in 206 00:12:27,476 --> 00:12:30,396 Speaker 1: terms of their personality traits. And I see this writ 207 00:12:30,436 --> 00:12:32,596 Speaker 1: large in the stories you talk about, right. I just 208 00:12:32,596 --> 00:12:34,556 Speaker 1: listened to your episode with Polly, and it was kind 209 00:12:34,596 --> 00:12:37,156 Speaker 1: of like, oh, well, Polly, you know, she's such a 210 00:12:37,156 --> 00:12:39,316 Speaker 1: glutton and she liked to drink, And it wasn't like, no, 211 00:12:39,476 --> 00:12:41,876 Speaker 1: Polly had these awful family issues, like Polly had this 212 00:12:41,916 --> 00:12:44,076 Speaker 1: horrible marriage, and so talk about what it's been like 213 00:12:44,236 --> 00:12:47,636 Speaker 1: to bring those circumstances to light, knowing that people's natural 214 00:12:47,636 --> 00:12:50,436 Speaker 1: tendency might not even be to think about them. It's 215 00:12:50,516 --> 00:12:55,836 Speaker 1: really important to try to understand what the circumstances were 216 00:12:55,996 --> 00:12:58,436 Speaker 1: that these people were coming from. This is where I 217 00:12:58,476 --> 00:13:02,036 Speaker 1: think empathy comes in. And using empathy as a historian, 218 00:13:02,156 --> 00:13:05,436 Speaker 1: or any type of scientist I think, is sometimes frowned 219 00:13:05,516 --> 00:13:08,156 Speaker 1: upon and people recoil at this because they think that 220 00:13:08,316 --> 00:13:10,676 Speaker 1: you're putting too much of yourself into it. But that 221 00:13:10,796 --> 00:13:13,916 Speaker 1: isn't the case. I think we can be empathetic and 222 00:13:14,676 --> 00:13:20,756 Speaker 1: still be very impartial.
But when we are telling human stories, 223 00:13:20,836 --> 00:13:24,116 Speaker 1: when we are looking at human struggles, we must try 224 00:13:24,156 --> 00:13:28,396 Speaker 1: and understand what the sets of circumstances were that allowed 225 00:13:28,436 --> 00:13:31,236 Speaker 1: for something to happen. And this is so important because 226 00:13:31,236 --> 00:13:32,796 Speaker 1: I think one of the things we know from cognitive 227 00:13:32,836 --> 00:13:35,316 Speaker 1: science is that if you actually look at the reasons 228 00:13:35,356 --> 00:13:39,196 Speaker 1: behind people's behavior, often it's not as much their personality 229 00:13:39,236 --> 00:13:41,516 Speaker 1: traits or their kind of moral failings as we think. 230 00:13:41,796 --> 00:13:44,476 Speaker 1: Often it's the situation. The whole history of the last 231 00:13:44,516 --> 00:13:47,156 Speaker 1: few decades of psychology has shown us that, hey, if 232 00:13:47,156 --> 00:13:49,476 Speaker 1: you're in a situation where everybody is doing something, you're 233 00:13:49,476 --> 00:13:50,996 Speaker 1: probably going to do it too. If you're in a 234 00:13:50,996 --> 00:13:54,596 Speaker 1: situation where you just feel really horrible time pressure, right, 235 00:13:54,596 --> 00:13:56,556 Speaker 1: you're going to act more immorally. We talk about this 236 00:13:56,596 --> 00:13:59,196 Speaker 1: on my podcast in terms of feeling time famine causes 237 00:13:59,196 --> 00:14:01,036 Speaker 1: you to do all this awful stuff. And if you're 238 00:14:01,076 --> 00:14:04,476 Speaker 1: in situations of poverty, if you're in situations where you're 239 00:14:04,596 --> 00:14:07,356 Speaker 1: in danger, like, you're going to do things that you 240 00:14:07,556 --> 00:14:10,076 Speaker 1: yourself right now not in those situations, might not do. 
241 00:14:10,356 --> 00:14:12,676 Speaker 1: And so I think that there's this idea that we 242 00:14:12,756 --> 00:14:15,636 Speaker 1: kind of morally condemn people for taking actions that we 243 00:14:15,716 --> 00:14:18,076 Speaker 1: ourselves would have taken in those situations. And then the 244 00:14:18,236 --> 00:14:21,236 Speaker 1: striking thing about Bad Women is that it just articulates how 245 00:14:21,276 --> 00:14:23,996 Speaker 1: awful some of these situations really were. I mean, give 246 00:14:24,036 --> 00:14:25,596 Speaker 1: me a sense of some of the things that these 247 00:14:25,596 --> 00:14:29,276 Speaker 1: women were facing, like just on a day to day basis. Gosh, well, 248 00:14:29,316 --> 00:14:32,516 Speaker 1: I think we really take for granted how easy our 249 00:14:32,556 --> 00:14:36,396 Speaker 1: lives are physically. The Victorian era was absolutely terrible if 250 00:14:36,396 --> 00:14:39,756 Speaker 1: you were poor. The living environment was terrible, often completely 251 00:14:39,836 --> 00:14:44,356 Speaker 1: infested with vermin. People were physically uncomfortable all the time. 252 00:14:44,476 --> 00:14:47,236 Speaker 1: You know, today, we take a painkiller if we've got 253 00:14:47,276 --> 00:14:50,076 Speaker 1: a headache, if something's wrong. You know, they didn't have 254 00:14:50,156 --> 00:14:52,916 Speaker 1: access to that, and it didn't exist, so there's a 255 00:14:52,956 --> 00:14:55,996 Speaker 1: certain level of ambient pain all the time. A lot 256 00:14:56,036 --> 00:15:00,436 Speaker 1: of these women we know had diseases. Annie Chapman had tuberculosis, 257 00:15:00,916 --> 00:15:05,236 Speaker 1: Catherine Eddowes had kidney disease. Elizabeth Stride, for example, was 258 00:15:05,276 --> 00:15:10,236 Speaker 1: suffering from tertiary syphilis. And the circumstances.
Their lives were 259 00:15:10,436 --> 00:15:16,156 Speaker 1: so awful, so bleak, so without hope that of course 260 00:15:16,276 --> 00:15:20,556 Speaker 1: they're going to be drinking to forget as well, and hunger. 261 00:15:21,116 --> 00:15:23,516 Speaker 1: I mean, these people did not know where their next 262 00:15:23,516 --> 00:15:26,516 Speaker 1: meal was coming from. They didn't know what was going 263 00:15:26,556 --> 00:15:28,796 Speaker 1: to happen, really, from hour to hour, where they were 264 00:15:28,796 --> 00:15:30,476 Speaker 1: going to be laying their head. One of the most 265 00:15:30,516 --> 00:15:34,036 Speaker 1: evocative things about your podcast is, like, the smells. It's 266 00:15:34,156 --> 00:15:37,716 Speaker 1: like poop everywhere, and it's, like, stinky. It's such 267 00:15:37,716 --> 00:15:40,156 Speaker 1: a I mean, honestly, it's actually such an evocative part 268 00:15:40,196 --> 00:15:42,876 Speaker 1: of your podcast. It actually, like, takes you there in 269 00:15:42,916 --> 00:15:45,596 Speaker 1: this wonderful way. So it's so funny, because I'm so 270 00:15:45,636 --> 00:15:49,156 Speaker 1: immersed in it, I don't... it's so normalized for me 271 00:15:49,276 --> 00:15:53,196 Speaker 1: that, wait a minute, everybody has chamber pots. There's just, 272 00:15:53,196 --> 00:15:57,516 Speaker 1: like, poop at people's beds. Yeah. Absolutely, I mean honestly, 273 00:15:57,676 --> 00:15:59,636 Speaker 1: even if I was just facing like the stench of 274 00:15:59,716 --> 00:16:01,876 Speaker 1: feces all the time, I feel like I definitely would 275 00:16:01,916 --> 00:16:06,276 Speaker 1: start drinking. Washing their bodies as well.
A lot 276 00:16:06,356 --> 00:16:10,396 Speaker 1: of times people did not have access to laundry facilities, 277 00:16:10,796 --> 00:16:14,556 Speaker 1: to bathing facilities, you know, and sometimes when they had 278 00:16:14,556 --> 00:16:18,356 Speaker 1: access to bathing facilities, it was cold water, so people 279 00:16:18,676 --> 00:16:21,396 Speaker 1: smelled terrible. And the fact that these women were going 280 00:16:21,436 --> 00:16:23,876 Speaker 1: through so much pain, I think, leads to the next 281 00:16:23,876 --> 00:16:26,236 Speaker 1: bias that I see so much in your telling of 282 00:16:26,316 --> 00:16:29,396 Speaker 1: these women's stories, right, the next psychological bias, which is 283 00:16:29,436 --> 00:16:32,716 Speaker 1: what researchers call the just world bias. And so 284 00:16:32,796 --> 00:16:35,036 Speaker 1: the idea behind the just world bias is we would like 285 00:16:35,116 --> 00:16:37,276 Speaker 1: to assume that the world is a just place, that, 286 00:16:37,476 --> 00:16:40,756 Speaker 1: you know, truly awful bad things don't 287 00:16:40,796 --> 00:16:43,356 Speaker 1: happen to good people, right, and that if we ourselves 288 00:16:43,396 --> 00:16:45,396 Speaker 1: are good, if we work hard and do the right thing, 289 00:16:45,516 --> 00:16:48,076 Speaker 1: we can live a life filled with joy and pleasure 290 00:16:48,076 --> 00:16:49,956 Speaker 1: and these kinds of things. These are deep beliefs that 291 00:16:49,996 --> 00:16:51,596 Speaker 1: like keep us getting up in the morning and not 292 00:16:51,676 --> 00:16:53,796 Speaker 1: kind of terrified about what the world is.
And what 293 00:16:53,836 --> 00:16:56,316 Speaker 1: that means is that it makes us do good things right, 294 00:16:56,316 --> 00:16:58,316 Speaker 1: like it makes us fight injustice, It makes us be 295 00:16:58,396 --> 00:17:00,276 Speaker 1: good people because we don't want to get our comeuppance, 296 00:17:00,276 --> 00:17:01,956 Speaker 1: and so we kind of want to believe that if 297 00:17:01,956 --> 00:17:03,796 Speaker 1: we do the right thing, you know, good things will 298 00:17:03,796 --> 00:17:07,236 Speaker 1: happen to us. But it also leads to this insidious rationalization, 299 00:17:07,476 --> 00:17:09,716 Speaker 1: which is like when you see a bad thing happen 300 00:17:09,756 --> 00:17:13,916 Speaker 1: to a person, your first instinct is to assume, well, 301 00:17:14,276 --> 00:17:17,316 Speaker 1: you know, maybe there was some reason there, right, maybe 302 00:17:17,316 --> 00:17:20,116 Speaker 1: it just didn't happen by chance. A friend tells you like, 303 00:17:20,156 --> 00:17:22,276 Speaker 1: oh my gosh, my sister just found out she has 304 00:17:22,356 --> 00:17:25,756 Speaker 1: liver cancer. Your brain instantly goes to, there must be 305 00:17:25,796 --> 00:17:28,476 Speaker 1: some reason that this is a just desert, right? That, 306 00:17:28,596 --> 00:17:30,756 Speaker 1: you know, the world doesn't work by making terrible things 307 00:17:30,756 --> 00:17:33,636 Speaker 1: happen to people. There should be some reason there. It's like, well, 308 00:17:33,676 --> 00:17:35,796 Speaker 1: I wonder if she like drank too much or didn't 309 00:17:35,836 --> 00:17:38,396 Speaker 1: take care of herself. Right, And this I saw just 310 00:17:38,516 --> 00:17:40,996 Speaker 1: writ large, not just in your women's stories, but just 311 00:17:41,076 --> 00:17:43,436 Speaker 1: in how people reacted to the fact that you were 312 00:17:43,516 --> 00:17:46,276 Speaker 1: humanizing them and telling their stories.
And so does this 313 00:17:46,356 --> 00:17:48,476 Speaker 1: kind of resonate, this kind of just world bias, the 314 00:17:48,556 --> 00:17:50,836 Speaker 1: kind of thing you saw in the retelling of these stories? 315 00:17:51,236 --> 00:17:54,196 Speaker 1: I mean, it's very current with the murder of women. 316 00:17:54,356 --> 00:17:58,236 Speaker 1: So in the United States, Gabrielle Petito, what was going on? 317 00:17:58,356 --> 00:18:01,076 Speaker 1: What was she doing? People were asking those questions, 318 00:18:01,316 --> 00:18:04,716 Speaker 1: you know, what did they do to deserve this? You know, 319 00:18:04,756 --> 00:18:08,636 Speaker 1: the belief being, of course, that they brought it on themselves. 320 00:18:09,636 --> 00:18:15,796 Speaker 1: We are so far from actually taking that step back, 321 00:18:17,156 --> 00:18:24,156 Speaker 1: not asking those questions, but just reserving judgment and listening 322 00:18:24,436 --> 00:18:29,236 Speaker 1: and examining the other factors that were at play here. 323 00:18:29,716 --> 00:18:32,996 Speaker 1: But what's triggered, it seems, is this kind of knee 324 00:18:33,076 --> 00:18:38,836 Speaker 1: jerk response about women being at fault for their own murders. 325 00:18:39,356 --> 00:18:41,796 Speaker 1: So for example, you know, here I am coming in 326 00:18:42,196 --> 00:18:45,716 Speaker 1: and saying that, actually, the story that you know about 327 00:18:45,756 --> 00:18:49,796 Speaker 1: Jack the Ripper, it wasn't these women's faults that this happened. 328 00:18:50,196 --> 00:18:54,196 Speaker 1: By changing it, people get very upset about that because 329 00:18:54,236 --> 00:18:58,876 Speaker 1: they're clinging to this belief system that's still with us today.
330 00:18:59,236 --> 00:19:01,716 Speaker 1: And it really seems to be a belief system that, well, 331 00:19:01,756 --> 00:19:04,036 Speaker 1: first of all, there's evidence that it might be really 332 00:19:04,196 --> 00:19:06,396 Speaker 1: ingrained in the sense of emerging really early. In fact, 333 00:19:06,476 --> 00:19:09,636 Speaker 1: there's evidence that even little kids show this, you know, 334 00:19:09,716 --> 00:19:12,356 Speaker 1: belief in a just world bias. In fact, if you 335 00:19:12,396 --> 00:19:14,596 Speaker 1: tell little kids kid versions of these stories, we wouldn't 336 00:19:14,596 --> 00:19:16,556 Speaker 1: talk about, you know, Jack the Ripper or serial killer 337 00:19:16,636 --> 00:19:18,836 Speaker 1: murders to little kids, but simple things like Joe was 338 00:19:18,876 --> 00:19:21,196 Speaker 1: walking to school and he got pooped on by a bird, 339 00:19:21,716 --> 00:19:24,476 Speaker 1: the kids will start justifying, well, you know, maybe Joe 340 00:19:24,556 --> 00:19:27,076 Speaker 1: did something wrong or he's a bad person. Right, he 341 00:19:27,116 --> 00:19:29,716 Speaker 1: can't just be unlucky. There has to be some reason 342 00:19:29,836 --> 00:19:32,476 Speaker 1: for this, right? And this emerges in four year old kids, 343 00:19:32,476 --> 00:19:34,196 Speaker 1: so it makes sense that it's still with us as 344 00:19:34,236 --> 00:19:36,236 Speaker 1: adults. But another thing we know about this bias 345 00:19:36,276 --> 00:19:37,796 Speaker 1: is kind of exactly what you said. This is a 346 00:19:37,916 --> 00:19:40,636 Speaker 1: knee jerk response.
Right, this is some unconscious thing our 347 00:19:40,676 --> 00:19:43,156 Speaker 1: brains do really fast, and one of the powers of 348 00:19:43,196 --> 00:19:46,516 Speaker 1: this is that our brains start unconsciously looking for evidence 349 00:19:46,636 --> 00:19:49,076 Speaker 1: that there might be some reason there, that it's 350 00:19:49,116 --> 00:19:51,356 Speaker 1: just, or, you know, like with the cancer example, 351 00:19:51,356 --> 00:19:52,876 Speaker 1: you're like, I wonder if she drank. Like you don't 352 00:19:52,916 --> 00:19:54,836 Speaker 1: know anything about this sister, like you're just like making 353 00:19:54,876 --> 00:19:57,036 Speaker 1: stuff up about her, because you assume there has to 354 00:19:57,076 --> 00:19:59,116 Speaker 1: be something there. And that means we go through just 355 00:19:59,196 --> 00:20:02,356 Speaker 1: a ton of mental gymnastics to keep these distortions up. 356 00:20:02,356 --> 00:20:05,156 Speaker 1: And I think you know you've seen like these mental 357 00:20:05,236 --> 00:20:07,436 Speaker 1: gymnastics at work, you know. So talk about some of 358 00:20:07,516 --> 00:20:10,276 Speaker 1: the challenges you've faced from people who are like, no, no, no, 359 00:20:10,316 --> 00:20:13,716 Speaker 1: you're just getting these women's stories wrong. Some people, and it's 360 00:20:13,756 --> 00:20:18,716 Speaker 1: some people, not all of the people, are incredibly nasty 361 00:20:18,956 --> 00:20:22,476 Speaker 1: and incredibly vicious with what they say. I mean, there 362 00:20:22,556 --> 00:20:25,036 Speaker 1: is this investment in this story.
You know, this is 363 00:20:25,156 --> 00:20:28,156 Speaker 1: very personal to some people who feel they've invested their 364 00:20:29,076 --> 00:20:32,356 Speaker 1: lives in researching Jack the Ripper. And here comes this 365 00:20:32,396 --> 00:20:36,356 Speaker 1: person from the outside saying, well, actually, this evidence that 366 00:20:36,396 --> 00:20:39,596 Speaker 1: you've always assumed was true isn't actually true, and it's 367 00:20:39,796 --> 00:20:42,156 Speaker 1: not true because of this, this and this. And so, 368 00:20:42,356 --> 00:20:45,356 Speaker 1: instead of actually taking that on board, 369 00:20:45,756 --> 00:20:49,556 Speaker 1: the only way they can process what it is that 370 00:20:49,596 --> 00:20:52,956 Speaker 1: I've written is to say that I've lied, that I've 371 00:20:53,036 --> 00:20:57,196 Speaker 1: hidden evidence, that I'm dishonest, that I'm a pathological liar. 372 00:20:57,676 --> 00:21:01,436 Speaker 1: And that's the only way that they can actually 373 00:21:02,156 --> 00:21:06,396 Speaker 1: carry on with their biases. That really surprised me. Yeah, 374 00:21:06,516 --> 00:21:09,356 Speaker 1: human minds kind of suck, as we say a lot. Yes, 375 00:21:09,516 --> 00:21:12,436 Speaker 1: and this is, you know, when things challenge our core beliefs, 376 00:21:12,556 --> 00:21:16,476 Speaker 1: it's amazing how much effort we'll go through to kind 377 00:21:16,476 --> 00:21:20,116 Speaker 1: of rationalize that. But I mean I get this too occasionally, right? 378 00:21:20,156 --> 00:21:22,236 Speaker 1: Like I'm doing a podcast on happiness, and I occasionally 379 00:21:22,316 --> 00:21:25,196 Speaker 1: get critics who say, you know, my voice sucks or 380 00:21:25,276 --> 00:21:28,636 Speaker 1: I'm dumb, or how dare I mention this particular thing, right?
381 00:21:28,716 --> 00:21:30,716 Speaker 1: Or how dare I not mention someone in like X 382 00:21:30,716 --> 00:21:33,116 Speaker 1: circumstance when I know we have an episode coming up 383 00:21:33,156 --> 00:21:35,636 Speaker 1: about X circumstance in like three weeks, if they just waited. 384 00:21:35,796 --> 00:21:37,716 Speaker 1: What does that kind of critique feel like to you? 385 00:21:37,756 --> 00:21:39,796 Speaker 1: I know what it does to me. It just feels awful. 386 00:21:40,236 --> 00:21:42,836 Speaker 1: How have you kind of processed it? I've tried to 387 00:21:42,876 --> 00:21:46,916 Speaker 1: process it with empathy, actually, and actually reasoning myself through it, 388 00:21:46,956 --> 00:21:50,156 Speaker 1: which is, look, people want to be angry. By being 389 00:21:50,196 --> 00:21:52,756 Speaker 1: in public life at all, you're going to be a 390 00:21:52,876 --> 00:21:57,836 Speaker 1: lightning rod for everybody's issues, and a lot of people 391 00:21:57,836 --> 00:22:01,596 Speaker 1: who attack you, it's their issues. It actually almost has 392 00:22:01,636 --> 00:22:05,556 Speaker 1: nothing to do with you. And I see that as 393 00:22:05,556 --> 00:22:08,356 Speaker 1: a kind of way to navigate through some of this 394 00:22:08,556 --> 00:22:12,436 Speaker 1: real nastiness. But at the same time, it's like being 395 00:22:12,436 --> 00:22:15,156 Speaker 1: a little boat on a rocky sea, you know, and 396 00:22:15,196 --> 00:22:18,676 Speaker 1: you're just trying to find that even keel. That takes 397 00:22:18,676 --> 00:22:21,196 Speaker 1: a lot of energy. Yeah, it can be really exhausting. 398 00:22:21,196 --> 00:22:23,956 Speaker 1: And I think a particular feature of the kind of 399 00:22:23,956 --> 00:22:26,436 Speaker 1: trolling that you've been getting is that my understanding is 400 00:22:26,436 --> 00:22:28,836 Speaker 1: that most of it happens online. Right.
It's not like 401 00:22:28,876 --> 00:22:31,596 Speaker 1: these people see you, you know, in a shop and say, hey, 402 00:22:31,596 --> 00:22:33,876 Speaker 1: Hallie, face to face, you know, here are my critiques about 403 00:22:33,876 --> 00:22:37,436 Speaker 1: your podcast. It's kind of this short, character-limited, anonymous 404 00:22:37,436 --> 00:22:40,036 Speaker 1: tweet on the internet. Right. And this is something we 405 00:22:40,076 --> 00:22:43,116 Speaker 1: also know scientifically, is that the internet takes away our 406 00:22:43,156 --> 00:22:46,796 Speaker 1: face to face interactions, like our natural empathic urges come 407 00:22:46,876 --> 00:22:49,356 Speaker 1: up when we're talking to people face to face, right? 408 00:22:49,676 --> 00:22:51,316 Speaker 1: And that's one of the reasons that people can be 409 00:22:51,356 --> 00:22:54,716 Speaker 1: so mean online, is that the normal empathic urges that 410 00:22:54,716 --> 00:22:57,196 Speaker 1: would creep up, like, I shouldn't say something so harmful 411 00:22:57,236 --> 00:22:59,436 Speaker 1: to someone. It doesn't feel like you're saying it to 412 00:22:59,476 --> 00:23:01,596 Speaker 1: a person because you're doing it on the Internet. On 413 00:23:01,876 --> 00:23:04,396 Speaker 1: the Happiness Lab podcast, we've heard from my colleague Jamil Zaki, 414 00:23:04,476 --> 00:23:07,356 Speaker 1: who studies empathy directly, and he talks about how he 415 00:23:07,636 --> 00:23:11,276 Speaker 1: worries that the Internet in general is undermining our natural empathy. 416 00:23:11,316 --> 00:23:13,676 Speaker 1: And I think that this is especially true in this 417 00:23:13,716 --> 00:23:16,156 Speaker 1: space of kind of trolling and things.
But one of 418 00:23:16,196 --> 00:23:17,716 Speaker 1: the reasons I love your podcast so much, and one 419 00:23:17,716 --> 00:23:19,156 Speaker 1: of the reasons it has been fun talking to you 420 00:23:19,196 --> 00:23:22,036 Speaker 1: today, is I can see your natural empathy kind of 421 00:23:22,076 --> 00:23:23,876 Speaker 1: shining through, right. You know, these are people who are 422 00:23:23,876 --> 00:23:25,796 Speaker 1: saying really hateful things about you, and you're like, well, 423 00:23:26,036 --> 00:23:27,756 Speaker 1: I'm trying to have empathy for them. This is the 424 00:23:27,836 --> 00:23:29,476 Speaker 1: kind of thing that I think can be so powerful 425 00:23:29,476 --> 00:23:32,276 Speaker 1: about a podcast like yours, right, is it's kind of 426 00:23:32,316 --> 00:23:35,756 Speaker 1: like naturally allowing us to sort of flex our empathy muscles. 427 00:23:35,956 --> 00:23:39,236 Speaker 1: If we can empathize with some women who died many, 428 00:23:39,276 --> 00:23:42,196 Speaker 1: many years ago, who've been in circumstances that are very 429 00:23:42,276 --> 00:23:44,716 Speaker 1: unlike ours, it's kind of a way to kind of 430 00:23:44,716 --> 00:23:47,516 Speaker 1: boost our empathy for other people today. I mean, have 431 00:23:47,596 --> 00:23:50,276 Speaker 1: you found that, you know, really understanding what it was 432 00:23:50,316 --> 00:23:53,076 Speaker 1: like to be these women in awful circumstances has helped 433 00:23:53,116 --> 00:23:57,356 Speaker 1: you empathize more with people today? Absolutely.
I mean, writing 434 00:23:57,436 --> 00:24:01,676 Speaker 1: so much and researching so much about female homelessness, and 435 00:24:02,476 --> 00:24:04,756 Speaker 1: one day I was just walking down the Strand in 436 00:24:04,836 --> 00:24:09,036 Speaker 1: London and walked by a woman, I'd say maybe 437 00:24:09,156 --> 00:24:11,916 Speaker 1: early thirties, and she had a child with her who 438 00:24:11,996 --> 00:24:15,316 Speaker 1: was about seven years old, and she was begging for money, 439 00:24:15,396 --> 00:24:19,956 Speaker 1: and I just stopped and I was so overwhelmed by 440 00:24:21,196 --> 00:24:24,636 Speaker 1: looking at her and thinking, my god, you could be 441 00:24:24,796 --> 00:24:27,396 Speaker 1: any one of these women I have just spent so 442 00:24:27,516 --> 00:24:30,236 Speaker 1: much time with. And I stopped and I talked to her, 443 00:24:30,236 --> 00:24:32,236 Speaker 1: and I gave her some money, and she told me 444 00:24:32,276 --> 00:24:35,876 Speaker 1: a story about how she fell into arrears on her 445 00:24:35,876 --> 00:24:39,276 Speaker 1: rent and her landlord threw her out with her child. 446 00:24:40,156 --> 00:24:44,516 Speaker 1: And I was staggered, because this was literally a story 447 00:24:45,156 --> 00:24:49,596 Speaker 1: right out of the nineteenth century, and right there, right 448 00:24:49,716 --> 00:24:53,396 Speaker 1: in that same place. And so I think writing about 449 00:24:53,476 --> 00:24:58,876 Speaker 1: these women, well, writing about poverty, writing about individuals' experiences, 450 00:24:59,676 --> 00:25:04,556 Speaker 1: really makes you see more of the universal human experience. 451 00:25:05,156 --> 00:25:06,996 Speaker 1: And I think this is one of the powerful things 452 00:25:06,996 --> 00:25:10,116 Speaker 1: about listening to these historical stories, right? You know, you 453 00:25:10,196 --> 00:25:12,916 Speaker 1: had that experience with, you know, the homeless woman.
But 454 00:25:12,956 --> 00:25:15,236 Speaker 1: I'll be honest, like, actually, the same thing happened to me. I 455 00:25:15,356 --> 00:25:17,116 Speaker 1: binge listened to a lot of your show when I 456 00:25:17,156 --> 00:25:19,556 Speaker 1: was on a long drive, and when I finished my drive, 457 00:25:19,596 --> 00:25:21,236 Speaker 1: I parked my car and I was walking back to 458 00:25:21,316 --> 00:25:24,156 Speaker 1: my house. I ran into someone who was homeless, this 459 00:25:24,236 --> 00:25:27,116 Speaker 1: homeless guy who was asking for some money. And you know, 460 00:25:27,396 --> 00:25:29,756 Speaker 1: he's a guy who's often on that same street corner. 461 00:25:29,876 --> 00:25:32,076 Speaker 1: And after listening to your show, I kind of not 462 00:25:32,116 --> 00:25:33,756 Speaker 1: only gave him money, but did the same thing you did, 463 00:25:33,756 --> 00:25:35,716 Speaker 1: which was, like, spoke to him for a couple of 464 00:25:35,876 --> 00:25:38,076 Speaker 1: minutes, like, how's your day, like, how are things going? And 465 00:25:38,116 --> 00:25:40,236 Speaker 1: I was compelled to do it because, you know, now 466 00:25:40,276 --> 00:25:42,516 Speaker 1: he wasn't just someone I was walking by. You know, 467 00:25:42,716 --> 00:25:45,156 Speaker 1: this was a human who could have been me. Right, 468 00:25:45,436 --> 00:25:47,516 Speaker 1: there's this idea that, like, you know, you tend to 469 00:25:47,556 --> 00:25:50,516 Speaker 1: see this person not as just, oh, this is an other, but really, 470 00:25:50,556 --> 00:25:53,316 Speaker 1: this is someone that I've connected with in this interesting way. 471 00:25:53,356 --> 00:25:55,716 Speaker 1: And so have you had other people who've had this 472 00:25:55,796 --> 00:25:58,436 Speaker 1: reaction to kind of listening to your podcast, where you 473 00:25:58,476 --> 00:26:02,636 Speaker 1: know they're getting a sense of experiencing more empathy afterwards?
Yeah, 474 00:26:02,876 --> 00:26:05,596 Speaker 1: people are telling me that, you know, they're crying as 475 00:26:05,636 --> 00:26:10,636 Speaker 1: they're listening to it. I mean, my feeling is, good. Good. 476 00:26:10,676 --> 00:26:14,636 Speaker 1: We need to connect, we need it, it's so important, you know, 477 00:26:14,916 --> 00:26:22,196 Speaker 1: human experience. Understanding human experience is so fundamental to understanding ourselves. 478 00:26:22,516 --> 00:26:25,796 Speaker 1: And I worry sometimes with a lot of true crime, 479 00:26:26,276 --> 00:26:28,876 Speaker 1: we tend to focus on the story of the perpetrator. 480 00:26:29,276 --> 00:26:32,836 Speaker 1: We don't focus on the stories of the victims, and 481 00:26:33,116 --> 00:26:39,396 Speaker 1: victimology is not seen as being as sexy as suspectology. And ironically, 482 00:26:39,396 --> 00:26:42,396 Speaker 1: you know, you mentioned listeners crying, but those tears 483 00:26:42,396 --> 00:26:44,796 Speaker 1: are actually a path towards well being. This is something 484 00:26:44,796 --> 00:26:48,156 Speaker 1: we know from the research, right, that connecting with other people, 485 00:26:48,236 --> 00:26:51,196 Speaker 1: empathizing with other people, even if they have stories of pain, 486 00:26:51,636 --> 00:26:54,836 Speaker 1: can sometimes make us feel more socially connected. They can 487 00:26:54,916 --> 00:26:57,836 Speaker 1: ultimately lead to us helping other people, you know, giving, 488 00:26:57,916 --> 00:27:00,116 Speaker 1: you know, the dollar to the homeless person, and then 489 00:27:00,156 --> 00:27:03,436 Speaker 1: that can allow you to feel better, right? So, ironically, 490 00:27:03,516 --> 00:27:06,516 Speaker 1: kind of connecting to other people's pain is actually a 491 00:27:06,556 --> 00:27:08,836 Speaker 1: path to happiness. The problem, though, is that it doesn't 492 00:27:08,916 --> 00:27:11,556 Speaker 1: like feel great in the moment.
And I think that's 493 00:27:11,556 --> 00:27:14,036 Speaker 1: what we're going to tackle when the Happiness Lab returns 494 00:27:14,036 --> 00:27:17,396 Speaker 1: in a moment, which is, when you compassionately embrace the 495 00:27:17,436 --> 00:27:20,196 Speaker 1: suffering of other people, it kind of sucks. And so 496 00:27:20,236 --> 00:27:22,276 Speaker 1: what are some strategies we can use to deal with 497 00:27:22,316 --> 00:27:24,356 Speaker 1: the injustice of the world, to face it head on, 498 00:27:24,796 --> 00:27:26,636 Speaker 1: but to protect our well being when we do it? 499 00:27:26,756 --> 00:27:28,676 Speaker 1: That's what we'll talk about when the Happiness Lab returns 500 00:27:28,716 --> 00:27:36,796 Speaker 1: in a second. I'm embarrassed to admit that even as 501 00:27:36,836 --> 00:27:39,516 Speaker 1: I started listening to your podcast, I just assumed that 502 00:27:39,636 --> 00:27:43,116 Speaker 1: the five women were, you know, prostitutes, drunkards, you know, 503 00:27:43,156 --> 00:27:45,436 Speaker 1: people that we could have written off, right? And it 504 00:27:45,716 --> 00:27:48,436 Speaker 1: took me some work to really listen to their stories 505 00:27:48,476 --> 00:27:50,076 Speaker 1: and kind of come to terms with the fact that, 506 00:27:50,356 --> 00:27:51,956 Speaker 1: you know, they were just like me.
You know, it 507 00:27:52,036 --> 00:27:53,716 Speaker 1: wasn't this kind of thing that I could just kind 508 00:27:53,716 --> 00:27:55,396 Speaker 1: of write them off, like I have to have empathy 509 00:27:55,436 --> 00:27:57,116 Speaker 1: for them too. And I think that that sort of 510 00:27:57,156 --> 00:27:59,636 Speaker 1: fits with a lot of people's experience, right? When you 511 00:27:59,676 --> 00:28:02,876 Speaker 1: finally start empathizing with a person who's going through pain, 512 00:28:02,956 --> 00:28:05,796 Speaker 1: when you start finally empathizing with a person who's going 513 00:28:05,796 --> 00:28:09,436 Speaker 1: through tough times, you feel better, it feels good, feels 514 00:28:09,436 --> 00:28:11,996 Speaker 1: like you're enacting justice. You feel more connected to this person. 515 00:28:12,476 --> 00:28:14,596 Speaker 1: And so one of the strategies I like to suggest 516 00:28:14,596 --> 00:28:17,076 Speaker 1: to my students who want to promote their empathy is 517 00:28:17,116 --> 00:28:20,276 Speaker 1: to really take time to, like, work to see things 518 00:28:20,316 --> 00:28:23,196 Speaker 1: from other people's perspectives, to, like, work to see things 519 00:28:23,236 --> 00:28:26,356 Speaker 1: from other people's circumstances. And so, you know, talk about 520 00:28:26,396 --> 00:28:28,116 Speaker 1: how history does that for us, and how you've been 521 00:28:28,156 --> 00:28:30,996 Speaker 1: able to do that with these women's stories.
I am 522 00:28:31,036 --> 00:28:34,076 Speaker 1: a social historian, and so the type of history that 523 00:28:34,156 --> 00:28:39,556 Speaker 1: I examine is really the nitty gritty of human experience 524 00:28:39,596 --> 00:28:43,476 Speaker 1: in the past, and I think that automatically brings you 525 00:28:43,596 --> 00:28:48,556 Speaker 1: into a place of empathy, because when you understand the 526 00:28:48,756 --> 00:28:52,796 Speaker 1: granular details of what it meant to be alive in 527 00:28:52,876 --> 00:28:57,116 Speaker 1: a particular time and place, you suddenly see things very differently. 528 00:28:57,556 --> 00:28:59,996 Speaker 1: And I think that's one of the important things about history. 529 00:28:59,996 --> 00:29:03,516 Speaker 1: You know, psychologically, we often think that empathy involves sort 530 00:29:03,556 --> 00:29:06,276 Speaker 1: of perspective taking, right, where I just kind of, you know, 531 00:29:06,396 --> 00:29:09,476 Speaker 1: sit in my modern environment with no, you know, chamber 532 00:29:09,556 --> 00:29:11,796 Speaker 1: pots or feces around, and I think, oh, what must 533 00:29:11,796 --> 00:29:13,556 Speaker 1: it have been like, you know, to be, you know, 534 00:29:13,556 --> 00:29:16,756 Speaker 1: a woman then. And often that perspective taking is kind 535 00:29:16,796 --> 00:29:19,676 Speaker 1: of wrong, right, because I haven't really thought through the circumstances. 536 00:29:19,956 --> 00:29:21,716 Speaker 1: And that's why a lot of the science suggests that 537 00:29:21,916 --> 00:29:26,196 Speaker 1: a better way to empathize is through perspective getting, right? 538 00:29:26,236 --> 00:29:28,716 Speaker 1: You literally ask people what are their stories, right? Which 539 00:29:28,716 --> 00:29:30,876 Speaker 1: I can do in the modern day.
I can interview 540 00:29:30,916 --> 00:29:33,556 Speaker 1: someone who has a different circumstance than me and ask 541 00:29:33,596 --> 00:29:36,236 Speaker 1: what that circumstance is like and hear it directly from them. 542 00:29:36,516 --> 00:29:38,636 Speaker 1: And I think that's the power of the historical approach, 543 00:29:38,676 --> 00:29:41,076 Speaker 1: is ultimately, you know, what social historians have to do 544 00:29:41,156 --> 00:29:43,356 Speaker 1: is to, like, find those narratives, you know, even if 545 00:29:43,356 --> 00:29:46,276 Speaker 1: it wasn't a journal, right? Like, find those circumstances and 546 00:29:46,356 --> 00:29:49,436 Speaker 1: really pull out what would their perspective have been, you know, 547 00:29:49,476 --> 00:29:51,996 Speaker 1: given these circumstances. And I think it can be really 548 00:29:51,996 --> 00:29:54,236 Speaker 1: powerful, because you're like, oh, when I perspective take, I 549 00:29:54,276 --> 00:29:55,516 Speaker 1: was like, well, I wouldn't be out on the street 550 00:29:55,556 --> 00:29:57,836 Speaker 1: at night, you know, to get murdered. But then I'm like, oh, actually, 551 00:29:58,116 --> 00:30:00,116 Speaker 1: when I get their perspective and see what it's really like, 552 00:30:00,356 --> 00:30:04,276 Speaker 1: I think really quite differently. Yeah, exactly, and that is 553 00:30:04,316 --> 00:30:09,716 Speaker 1: really important. Also, when you understand other human experience, both 554 00:30:09,756 --> 00:30:12,276 Speaker 1: in the past and the present, of people you know, 555 00:30:12,356 --> 00:30:15,916 Speaker 1: of people you don't know, it helps you become a 556 00:30:15,956 --> 00:30:19,836 Speaker 1: more empathetic person. It helps you see things you wouldn't 557 00:30:19,916 --> 00:30:24,076 Speaker 1: normally see.
And I'm really evangelical when it comes 558 00:30:24,116 --> 00:30:28,796 Speaker 1: to promoting social history as the way forward in historical study, 559 00:30:28,916 --> 00:30:32,076 Speaker 1: because it tells us so much about the present, and 560 00:30:32,116 --> 00:30:35,356 Speaker 1: it helps us unpick so many of our habits, so 561 00:30:35,356 --> 00:30:37,276 Speaker 1: many of our bad habits, and so many of our 562 00:30:37,276 --> 00:30:40,396 Speaker 1: belief systems, and it helps us weed through, you know, 563 00:30:40,436 --> 00:30:44,636 Speaker 1: what can we use today and what belongs in the past. Still, 564 00:30:44,676 --> 00:30:46,916 Speaker 1: it also allows us to kind of build up our 565 00:30:46,996 --> 00:30:50,356 Speaker 1: compassion muscles. I mean, there's tons of evidence showing that, 566 00:30:50,436 --> 00:30:52,876 Speaker 1: you know, compassion is kind of a skill that we 567 00:30:52,876 --> 00:30:55,916 Speaker 1: can build up over time, right, but it takes some practice, 568 00:30:56,316 --> 00:30:59,716 Speaker 1: and that practice really does involve thinking through the bad 569 00:30:59,756 --> 00:31:01,636 Speaker 1: things that are going on with other people. You know, 570 00:31:01,676 --> 00:31:05,356 Speaker 1: there's evidence, for example, that it involves actively trying to 571 00:31:05,556 --> 00:31:08,676 Speaker 1: wish well and give compassion to people who might not 572 00:31:08,716 --> 00:31:11,236 Speaker 1: be so nice to you, our critics or internet trolls, 573 00:31:11,276 --> 00:31:13,436 Speaker 1: people who say mean stuff. The research shows that 574 00:31:13,556 --> 00:31:15,596 Speaker 1: these are muscles that we can build up.
And it 575 00:31:15,636 --> 00:31:17,596 Speaker 1: sounds like this is something that you've been working on too, 576 00:31:17,676 --> 00:31:20,396 Speaker 1: especially when it comes to some of the online critics 577 00:31:20,436 --> 00:31:24,316 Speaker 1: and some of the haters too. As much as I 578 00:31:24,396 --> 00:31:28,436 Speaker 1: find it annoying and irritating, and actually some of the 579 00:31:28,876 --> 00:31:31,676 Speaker 1: things they say, it's just so ridiculous that it's laughable, 580 00:31:31,796 --> 00:31:34,036 Speaker 1: you know. And I do laugh at these things, but 581 00:31:34,196 --> 00:31:37,396 Speaker 1: I think I have tried to understand, you know, where 582 00:31:37,396 --> 00:31:41,156 Speaker 1: are these people coming from. I've tried to understand why 583 00:31:41,196 --> 00:31:44,156 Speaker 1: there's so much hatred. You know, whether or not they 584 00:31:44,196 --> 00:31:47,516 Speaker 1: would ever be as kind to me, I think, is 585 00:31:47,596 --> 00:31:51,036 Speaker 1: kind of irrelevant. I think when we want to understand 586 00:31:51,116 --> 00:31:53,956 Speaker 1: somebody and we want to understand a problem, it's always 587 00:31:53,956 --> 00:31:56,876 Speaker 1: good to put yourself in somebody else's shoes and try to 588 00:31:56,916 --> 00:31:59,396 Speaker 1: figure out where they're coming from. And I think in 589 00:31:59,476 --> 00:32:03,476 Speaker 1: many ways that sort of sometimes takes a lot of 590 00:32:03,516 --> 00:32:08,636 Speaker 1: the poison out of what you see being aimed at 591 00:32:08,636 --> 00:32:12,756 Speaker 1: you, or the way you see any particular situation.
Yeah, 592 00:32:12,836 --> 00:32:16,316 Speaker 1: on my podcast, we talked with the meditation teacher Tara Brach, 593 00:32:16,436 --> 00:32:19,716 Speaker 1: and she told this story, or metaphor, where when you're 594 00:32:19,756 --> 00:32:21,956 Speaker 1: walking by, you know, a dog that's kind of barking 595 00:32:21,956 --> 00:32:23,516 Speaker 1: at you, and you go to pet it and it sort 596 00:32:23,516 --> 00:32:25,556 Speaker 1: of snaps at you and snarls, you could be like, oh, 597 00:32:25,596 --> 00:32:27,556 Speaker 1: what a, you know, what a jerk, this horrible dog. 598 00:32:27,916 --> 00:32:29,556 Speaker 1: But then you look down and you might notice that 599 00:32:29,556 --> 00:32:32,436 Speaker 1: the dog's foot is in a trap or that it's hurt, right, 600 00:32:32,436 --> 00:32:35,716 Speaker 1: and that gives you a completely different emotional perspective on 601 00:32:35,796 --> 00:32:39,196 Speaker 1: this creature. Right, now you're not dealing with another agent 602 00:32:39,276 --> 00:32:41,676 Speaker 1: who's mean and awful and trying to be a jerk 603 00:32:41,716 --> 00:32:44,076 Speaker 1: to you. You're dealing with this creature who's hurt, and 604 00:32:44,076 --> 00:32:46,396 Speaker 1: it so instantly turns on compassion. And it sounds like, 605 00:32:46,716 --> 00:32:49,076 Speaker 1: by taking the perspective of some of these folks who 606 00:32:49,076 --> 00:32:51,276 Speaker 1: are kind of haters of some of your work, you've 607 00:32:51,276 --> 00:32:52,796 Speaker 1: been able to do a little bit of that. You know, 608 00:32:52,796 --> 00:32:54,956 Speaker 1: they're barking dogs, but they're barking dogs who might 609 00:32:54,996 --> 00:32:57,316 Speaker 1: be hurt to some extent. Yeah, exactly. I mean, I 610 00:32:57,316 --> 00:33:00,236 Speaker 1: think I would.
I would also point out that constantly 611 00:33:00,276 --> 00:33:05,716 Speaker 1: taking that position can get really emotionally exhausting also, where 612 00:33:05,956 --> 00:33:09,436 Speaker 1: you are always... what ends up happening, I sometimes 613 00:33:09,516 --> 00:33:12,916 Speaker 1: feel, is that you end up doubting your own position 614 00:33:13,396 --> 00:33:16,316 Speaker 1: because you're giving so much of yourself and so much 615 00:33:16,316 --> 00:33:19,956 Speaker 1: of your empathy out to other people. And I think 616 00:33:19,996 --> 00:33:22,996 Speaker 1: for me, and certainly maybe this is even a question 617 00:33:23,076 --> 00:33:27,316 Speaker 1: for you, is how do you draw the line? Where 618 00:33:27,396 --> 00:33:30,356 Speaker 1: are the boundaries? Yeah, well, one of the things, you know, 619 00:33:30,676 --> 00:33:33,236 Speaker 1: we learn in compassion research is that there can be 620 00:33:33,276 --> 00:33:36,916 Speaker 1: this compassion fatigue, and so using strategies, literally almost 621 00:33:36,916 --> 00:33:38,436 Speaker 1: like you go to the gym and do squats to, 622 00:33:38,476 --> 00:33:41,116 Speaker 1: like, build up your leg muscles, using strategies to 623 00:33:41,276 --> 00:33:44,196 Speaker 1: literally build up your compassion muscles can be quite powerful. 624 00:33:44,276 --> 00:33:46,356 Speaker 1: And one of the ones you see in the research 625 00:33:46,476 --> 00:33:49,956 Speaker 1: is a technique called loving kindness meditation, which sounds really cheesy, 626 00:33:50,156 --> 00:33:54,436 Speaker 1: but it's a meditation practice where you, like, literally sit 627 00:33:54,716 --> 00:33:58,396 Speaker 1: and extend compassion to other people.
The prompts are usually: 628 00:33:58,596 --> 00:34:00,756 Speaker 1: think about someone who's really close to you and say, 629 00:34:01,156 --> 00:34:03,556 Speaker 1: may you be happy, may you be well, may you 630 00:34:03,636 --> 00:34:06,556 Speaker 1: be safe, may you find joy, right, which, if you 631 00:34:06,596 --> 00:34:08,196 Speaker 1: do it right, can kind of give you a sense 632 00:34:08,196 --> 00:34:11,236 Speaker 1: of compassion. Sometimes people report feeling something warm in their 633 00:34:11,276 --> 00:34:13,596 Speaker 1: heart or in their heart space. But then what the 634 00:34:13,636 --> 00:34:16,956 Speaker 1: practice is, is that over time you kind of extend 635 00:34:17,196 --> 00:34:21,116 Speaker 1: that compassion and that loving kindness to more and more 636 00:34:21,156 --> 00:34:24,756 Speaker 1: difficult people, you know, maybe like a stranger, and then ultimately, 637 00:34:25,076 --> 00:34:26,916 Speaker 1: you know, for me, like that guy who made the 638 00:34:26,956 --> 00:34:29,956 Speaker 1: super mean comment on Twitter about the podcast. Right, you know, 639 00:34:30,076 --> 00:34:32,756 Speaker 1: may you be happy, may you be safe, may you 640 00:34:32,796 --> 00:34:35,876 Speaker 1: find joy, right? And what the process does is, first 641 00:34:35,876 --> 00:34:38,516 Speaker 1: of all, it allows that feeling, that emotion, 642 00:34:38,556 --> 00:34:40,676 Speaker 1: to extend to that person. But then it kind of 643 00:34:40,716 --> 00:34:42,596 Speaker 1: gets you naturally to do the thing that I think 644 00:34:42,636 --> 00:34:44,836 Speaker 1: we were just talking about, which is like, okay, what's 645 00:34:44,876 --> 00:34:47,476 Speaker 1: their perspective?
They're just a human who's trying to find joy, 646 00:34:47,556 --> 00:34:49,716 Speaker 1: who's, like, getting through it the same way we all are, 647 00:34:49,716 --> 00:34:51,356 Speaker 1: where they're kind of figuring it out as they go, 648 00:34:51,916 --> 00:34:53,956 Speaker 1: and so it can be a powerful practice. But what 649 00:34:53,996 --> 00:34:57,516 Speaker 1: the evidence suggests is that practices like that can reduce 650 00:34:57,556 --> 00:34:59,796 Speaker 1: the thing you're talking about, which is sort of compassion 651 00:34:59,836 --> 00:35:03,796 Speaker 1: fatigue or sort of burnout fatigue, especially in careers where 652 00:35:03,796 --> 00:35:06,956 Speaker 1: you have to really be empathic. And so the studies 653 00:35:06,996 --> 00:35:09,476 Speaker 1: have been done on people like palliative care workers 654 00:35:09,556 --> 00:35:12,636 Speaker 1: and first responders, but I think social historians who are 655 00:35:12,636 --> 00:35:14,956 Speaker 1: dealing with the deep injustice of the world could probably 656 00:35:14,996 --> 00:35:18,596 Speaker 1: benefit from some loving kindness meditation too. Yeah, I think 657 00:35:18,676 --> 00:35:22,236 Speaker 1: you're onto something there. And another strategy I 658 00:35:22,276 --> 00:35:24,836 Speaker 1: think we talk about on the podcast a lot is just, 659 00:35:25,276 --> 00:35:28,716 Speaker 1: you know, finding ways to regulate your negative emotions, right? 660 00:35:28,876 --> 00:35:30,836 Speaker 1: I mean, I think this is true for the online trolls, 661 00:35:30,836 --> 00:35:32,916 Speaker 1: but I imagine, as a social historian, it's more true 662 00:35:32,956 --> 00:35:35,596 Speaker 1: for, like, you know, facing the injustice of the world 663 00:35:35,636 --> 00:35:37,436 Speaker 1: head on.
You know, there's a reason we have this 664 00:35:37,516 --> 00:35:41,556 Speaker 1: just world bias, because it sucks to realize that terrible, terrible, 665 00:35:41,596 --> 00:35:44,596 Speaker 1: awful things can happen to people who don't deserve them, right. 666 00:35:44,636 --> 00:35:47,036 Speaker 1: And that comes with a deep sadness, it comes with anger, 667 00:35:47,076 --> 00:35:50,236 Speaker 1: it comes with frustration, and so we need ways to 668 00:35:50,276 --> 00:35:53,836 Speaker 1: regulate those emotions, and you know, the science shows that 669 00:35:53,876 --> 00:35:56,236 Speaker 1: the way we do it is to actually sit with them. 670 00:35:56,396 --> 00:35:58,396 Speaker 1: You know, Tara Brach, who I mentioned, has this lovely 671 00:35:58,436 --> 00:36:01,796 Speaker 1: practice that she calls RAIN, which is a meditation practice, 672 00:36:01,916 --> 00:36:05,556 Speaker 1: R A I N. You recognize the emotions that you're experiencing, 673 00:36:05,636 --> 00:36:08,196 Speaker 1: you allow them, you investigate what they feel like in 674 00:36:08,236 --> 00:36:10,196 Speaker 1: your body, and then you kind of do something to 675 00:36:10,316 --> 00:36:13,236 Speaker 1: nurture yourself. And there's lots of evidence that those kinds 676 00:36:13,236 --> 00:36:15,156 Speaker 1: of practices where you kind of recognize, oh, I'm feeling 677 00:36:15,156 --> 00:36:17,316 Speaker 1: really frustrated by, you know, the fact that the world 678 00:36:17,436 --> 00:36:19,836 Speaker 1: is so unfair, I'm feeling really sad about what happened 679 00:36:19,916 --> 00:36:22,316 Speaker 1: to these women back then and what happens to women today. 680 00:36:22,436 --> 00:36:24,476 Speaker 1: You kind of allow those feelings, you say, I'm just 681 00:36:24,476 --> 00:36:26,276 Speaker 1: going to sit with them and then let them play 682 00:36:26,276 --> 00:36:29,276 Speaker 1: out in your body.
And that's because the research shows 683 00:36:29,276 --> 00:36:31,716 Speaker 1: that emotions are kind of like a wave. You know, 684 00:36:31,716 --> 00:36:33,116 Speaker 1: it's going to go up and you feel more and 685 00:36:33,156 --> 00:36:35,436 Speaker 1: more frustrated, but then you sit with it. Over time, 686 00:36:35,436 --> 00:36:37,476 Speaker 1: it's just going to dissipate. 687 00:36:37,836 --> 00:36:39,356 Speaker 1: But then you do the N, which is something to 688 00:36:39,436 --> 00:36:41,956 Speaker 1: nurture yourself, which is like, it's exhausting to deal with 689 00:36:41,956 --> 00:36:43,796 Speaker 1: the injustice of the world, it's exhausting to deal with 690 00:36:43,796 --> 00:36:46,036 Speaker 1: online haters. You know, what can you do that's really 691 00:36:46,156 --> 00:36:48,636 Speaker 1: nurturing to kind of take care of yourself? So it sounds 692 00:36:48,636 --> 00:36:50,836 Speaker 1: like practices like RAIN might also be, you know, a 693 00:36:50,836 --> 00:36:53,076 Speaker 1: good thing for social historians to invest in to 694 00:36:53,116 --> 00:36:55,556 Speaker 1: kind of get through some of the yucky stuff out there. Yeah, 695 00:36:55,596 --> 00:36:59,676 Speaker 1: I think that sounds really useful for anybody who is 696 00:37:00,036 --> 00:37:03,036 Speaker 1: surrounded by really unpleasant stuff all the time. And 697 00:37:03,316 --> 00:37:04,956 Speaker 1: I was even going to say, you know, this day 698 00:37:04,956 --> 00:37:07,676 Speaker 1: and age we're living in right now with COVID, and 699 00:37:08,036 --> 00:37:12,316 Speaker 1: how the news cycle can be extremely overwhelming, and you know, 700 00:37:12,356 --> 00:37:15,556 Speaker 1: every day when the numbers of people dying are published, 701 00:37:15,836 --> 00:37:18,716 Speaker 1: that's really quite overwhelming.
And I think that sounds like 702 00:37:18,756 --> 00:37:22,036 Speaker 1: a really good strategy really for all of us. And 703 00:37:22,116 --> 00:37:23,836 Speaker 1: so I think this is a nice way to end, 704 00:37:23,836 --> 00:37:25,036 Speaker 1: you know, and one of the reasons I was so 705 00:37:25,116 --> 00:37:27,396 Speaker 1: happy to talk to you. You know, our natural instinct, 706 00:37:27,436 --> 00:37:30,556 Speaker 1: our brain's instinct, is to pretend that bad things don't 707 00:37:30,556 --> 00:37:33,516 Speaker 1: happen to good people, right. There just must be bad women, right? 708 00:37:33,516 --> 00:37:35,636 Speaker 1: There's not bad things happening to good women, there's just 709 00:37:35,716 --> 00:37:38,236 Speaker 1: bad women. It just simplifies things. And it's one of 710 00:37:38,276 --> 00:37:40,276 Speaker 1: the reasons I've loved how much your podcast has kind 711 00:37:40,316 --> 00:37:42,956 Speaker 1: of brought to light, like, hey, these are biases, you know, 712 00:37:43,036 --> 00:37:45,076 Speaker 1: we need to kind of deal with them and empathize 713 00:37:45,076 --> 00:37:47,516 Speaker 1: with people who are going through some tough times, you know, 714 00:37:47,516 --> 00:37:49,556 Speaker 1: but also that the science of happiness maybe can give 715 00:37:49,596 --> 00:37:52,276 Speaker 1: us some good strategies to deal with our emotions as 716 00:37:52,276 --> 00:37:54,596 Speaker 1: we do that. Hallie, thank you so much for chatting 717 00:37:54,596 --> 00:37:57,036 Speaker 1: with me today. This was awesome. Thank you, Laurie. It's 718 00:37:57,076 --> 00:37:58,236 Speaker 1: been really enlightening.