1 00:00:01,080 --> 00:00:03,400 Speaker 1: My name is Lily Maddon and I'm a proud Arrernte 2 00:00:03,640 --> 00:00:08,400 Speaker 1: Bundjalung Kalkadoon woman from Gadigal Country. The Daily Aus acknowledges 3 00:00:08,480 --> 00:00:10,639 Speaker 1: that this podcast is recorded on the lands of the 4 00:00:10,720 --> 00:00:14,240 Speaker 1: Gadigal people and pays respect to all Aboriginal and Torres 5 00:00:14,240 --> 00:00:17,159 Speaker 1: Strait Islander nations. We pay our respects to the 6 00:00:17,200 --> 00:00:19,960 Speaker 1: First Peoples of these countries, both past and present. 7 00:00:20,960 --> 00:00:24,240 Speaker 2: Here's some content advice for the podcast that's about to come. 8 00:00:24,720 --> 00:00:27,080 Speaker 2: In this episode, we're going to be discussing the meaning 9 00:00:27,160 --> 00:00:29,800 Speaker 2: of trigger warnings, where they come from, and whether or 10 00:00:29,880 --> 00:00:33,479 Speaker 2: not they should be applied in our content. Good morning 11 00:00:33,479 --> 00:00:36,160 Speaker 2: and welcome to The Daily Aus. It's Wednesday, the fourth 12 00:00:36,159 --> 00:00:39,199 Speaker 2: of October. I'm Zara. I'm Billy, I'm the editor at 13 00:00:39,200 --> 00:00:39,880 Speaker 2: The Daily Aus. 14 00:00:40,159 --> 00:00:40,319 Speaker 3: Now. 15 00:00:40,360 --> 00:00:42,680 Speaker 2: Over the last few years, we've seen the rise of 16 00:00:42,720 --> 00:00:46,440 Speaker 2: trigger warnings in the content that we consume everywhere. We 17 00:00:46,560 --> 00:00:48,600 Speaker 2: at The Daily Aus have used them in this podcast 18 00:00:48,680 --> 00:00:51,239 Speaker 2: plenty of times, and we've also used them in a 19 00:00:51,280 --> 00:00:54,760 Speaker 2: lot of our written content. But are trigger warnings actually 20 00:00:54,920 --> 00:00:56,280 Speaker 2: protecting people? 21 00:00:56,360 --> 00:00:59,280 Speaker 3: And this is the evidence around trigger warnings.
They trigger people. 22 00:01:00,080 --> 00:01:03,400 Speaker 3: They make anticipatory anxiety go through the roof. 23 00:01:03,560 --> 00:01:06,680 Speaker 2: In today's Deep Dive, we speak to the Global Director 24 00:01:06,760 --> 00:01:10,360 Speaker 2: of Men's Health Research at Movember, Doctor Zach Seidler, and 25 00:01:10,440 --> 00:01:13,280 Speaker 2: yes he is my brother, but he also happens to 26 00:01:13,319 --> 00:01:16,400 Speaker 2: be an expert in the field of mental health. Before 27 00:01:16,400 --> 00:01:19,040 Speaker 2: we get to that chat, Billy, what is making headlines? 28 00:01:19,440 --> 00:01:22,039 Speaker 4: The Australian Reserve Bank has kept the cash rate at 29 00:01:22,040 --> 00:01:25,480 Speaker 4: four point one percent for the fourth consecutive month. The 30 00:01:25,520 --> 00:01:28,720 Speaker 4: decision followed the board's first meeting under the leadership of 31 00:01:28,760 --> 00:01:32,320 Speaker 4: the new RBA Governor Michele Bullock, who said while inflation 32 00:01:32,480 --> 00:01:35,520 Speaker 4: was still too high, future rises would depend on new 33 00:01:35,600 --> 00:01:37,479 Speaker 4: data and risks in the economy. 34 00:01:39,240 --> 00:01:42,160 Speaker 2: A new study has found Australian parents are some of 35 00:01:42,200 --> 00:01:45,560 Speaker 2: the most risk averse in the world. New data from 36 00:01:45,600 --> 00:01:49,560 Speaker 2: Deakin University and Coventry University in the UK found that 37 00:01:49,640 --> 00:01:53,800 Speaker 2: four in five Australian parents put limits on risky activities 38 00:01:53,840 --> 00:01:57,520 Speaker 2: such as play fighting or climbing trees, and that's out 39 00:01:57,560 --> 00:02:00,320 Speaker 2: of fear that their child will be injured. The study 40 00:02:00,360 --> 00:02:03,120 Speaker 2: puts Aussie parents ahead of New Zealand, Canada and the 41 00:02:03,200 --> 00:02:06,840 Speaker 2: UK when it comes to playtime risk aversion.
It encouraged 42 00:02:06,840 --> 00:02:10,400 Speaker 2: parents to loosen the reins to support physical and mental development. 43 00:02:11,680 --> 00:02:14,919 Speaker 4: Work has begun to convert Hitler's former Austrian home into 44 00:02:14,960 --> 00:02:18,200 Speaker 4: a police station. It's hoped the new purpose for the home, 45 00:02:18,240 --> 00:02:21,360 Speaker 4: where Hitler was born in eighteen eighty nine, will deter 46 00:02:21,520 --> 00:02:24,960 Speaker 4: gatherings of neo-Nazis at the site. The property has been 47 00:02:25,040 --> 00:02:27,960 Speaker 4: under the control of the Austrian government for several decades. 48 00:02:29,200 --> 00:02:31,960 Speaker 2: And today's good news: a one hundred and four year 49 00:02:32,000 --> 00:02:34,480 Speaker 2: old woman in the US hopes to become the oldest 50 00:02:34,480 --> 00:02:38,720 Speaker 2: person to ever skydive. Dorothy Hoffner performed a skydive near 51 00:02:38,800 --> 00:02:41,639 Speaker 2: Chicago this week, which she hopes will be certified as 52 00:02:41,639 --> 00:02:45,080 Speaker 2: a record-breaking dive. Hoffner told the onlooking crowd that 53 00:02:45,160 --> 00:02:52,239 Speaker 2: her age was just a number. Okay. So, Billy, the 54 00:02:52,280 --> 00:02:55,440 Speaker 2: reason that we wanted to do this interview with expert 55 00:02:55,520 --> 00:02:59,040 Speaker 2: slash my brother Zach was because you were asked to 56 00:02:59,040 --> 00:03:02,080 Speaker 2: comment on how The Daily Aus does trigger warnings. 57 00:03:02,160 --> 00:03:06,080 Speaker 4: Right. Yeah, so I participated in an SBS show called Insight, 58 00:03:06,200 --> 00:03:08,320 Speaker 4: and it was all about whether trigger warnings are 59 00:03:08,520 --> 00:03:12,200 Speaker 4: helpful or harmful.
And my perspective was that at 60 00:03:12,280 --> 00:03:15,200 Speaker 4: The Daily Aus we do use trigger warnings, and that has 61 00:03:15,240 --> 00:03:19,400 Speaker 4: been informed by the audience. But it became clear throughout 62 00:03:19,440 --> 00:03:22,800 Speaker 4: the discussion on the show that there is mounting evidence 63 00:03:22,880 --> 00:03:26,320 Speaker 4: that they potentially are not helpful, and I thought it 64 00:03:26,400 --> 00:03:28,920 Speaker 4: was really important that we have a discussion about that 65 00:03:29,080 --> 00:03:32,440 Speaker 4: at The Daily Aus. Your brother is a clinical psychologist, 66 00:03:32,520 --> 00:03:35,520 Speaker 4: and I know he feels very passionately about it, and 67 00:03:35,600 --> 00:03:37,840 Speaker 4: he's spoken to me about it before, and I thought 68 00:03:38,240 --> 00:03:41,280 Speaker 4: we should get him on and have a really honest discussion. 69 00:03:41,760 --> 00:03:43,400 Speaker 2: And I think the best part of it is that 70 00:03:43,520 --> 00:03:46,880 Speaker 2: we learn as we go and we don't pretend to 71 00:03:46,920 --> 00:03:49,400 Speaker 2: be experts, and I think this discussion is the best 72 00:03:49,440 --> 00:03:52,760 Speaker 2: example of that. So, without further ado, Zach and Billy. 73 00:03:52,640 --> 00:03:54,840 Speaker 4: Doctor Zach Seidler, welcome to The Daily Aus. 74 00:03:55,000 --> 00:03:55,600 Speaker 3: Great to be here, Billy. 75 00:03:55,640 --> 00:03:58,280 Speaker 4: Now, you're a clinical psychologist and you've done a 76 00:03:58,360 --> 00:04:01,720 Speaker 4: lot of research in this space. What is your position 77 00:04:01,760 --> 00:04:02,680 Speaker 4: on trigger warnings? 78 00:04:02,760 --> 00:04:04,560 Speaker 3: So I think it won't be a surprise to you, 79 00:04:04,600 --> 00:04:07,320 Speaker 3: given the amount of DMs that I've sent to you 80 00:04:07,440 --> 00:04:10,120 Speaker 3: and The Daily Aus about my feelings.
81 00:04:09,680 --> 00:04:10,800 Speaker 1: You're our troll. 82 00:04:11,240 --> 00:04:14,200 Speaker 3: I'm the troll, number one troll. So the way that 83 00:04:14,240 --> 00:04:16,880 Speaker 3: I see trigger warnings and the way that the research 84 00:04:16,960 --> 00:04:20,119 Speaker 3: sees them is that there is no evidence that they work. 85 00:04:20,440 --> 00:04:24,000 Speaker 3: There is evidence that they harm, and so depending on 86 00:04:24,040 --> 00:04:26,840 Speaker 3: the context, we need to be really careful about just 87 00:04:26,960 --> 00:04:30,520 Speaker 3: throwing trigger warnings out the front of all of this content, 88 00:04:30,640 --> 00:04:34,159 Speaker 3: especially on social media, and then throwing up our hands 89 00:04:34,160 --> 00:04:38,159 Speaker 3: and kind of saying, that's my responsibility done. And it's 90 00:04:38,200 --> 00:04:39,880 Speaker 3: not to say that The Daily Aus has done that. 91 00:04:40,200 --> 00:04:43,560 Speaker 3: But the way that this has now become a commonplace, 92 00:04:44,240 --> 00:04:48,440 Speaker 3: pseudo-scientific placeholder at the front of content is really 93 00:04:48,560 --> 00:04:51,480 Speaker 3: surprising and strange to me and suggests that we need 94 00:04:51,520 --> 00:04:53,920 Speaker 3: to have a bit of a conversation about what is 95 00:04:53,960 --> 00:04:55,600 Speaker 3: going on and what we can do in future. 96 00:04:56,000 --> 00:04:58,039 Speaker 4: I think the thing that I struggle with is that 97 00:04:58,120 --> 00:05:01,839 Speaker 4: our approach has been informed by the audience, who tell us 98 00:05:01,920 --> 00:05:05,200 Speaker 4: that they do want trigger warnings. And I understand that 99 00:05:05,240 --> 00:05:08,360 Speaker 4: the evidence is changing, but what do we do when 100 00:05:08,360 --> 00:05:10,080 Speaker 4: the expectation hasn't?
101 00:05:10,880 --> 00:05:15,880 Speaker 3: So my answer to that is: if your audience told 102 00:05:15,920 --> 00:05:19,039 Speaker 3: you to put up more content about UFO sightings, would 103 00:05:19,040 --> 00:05:22,400 Speaker 3: you suddenly listen to them? If the vast majority of 104 00:05:22,440 --> 00:05:25,800 Speaker 3: them said, I swear to God, I've seen UFOs, you 105 00:05:25,920 --> 00:05:29,000 Speaker 3: need to put this up. You don't follow the audience's 106 00:05:29,080 --> 00:05:29,880 Speaker 3: every whim. 107 00:05:30,000 --> 00:05:32,360 Speaker 4: Yeah, but I think the difference is that they're telling 108 00:05:32,440 --> 00:05:35,240 Speaker 4: us that they think that we are harming them or 109 00:05:35,240 --> 00:05:38,799 Speaker 4: triggering them if we don't give them that warning. 110 00:05:39,080 --> 00:05:42,120 Speaker 3: For sure. So there are two issues here. The first is that 111 00:05:42,279 --> 00:05:46,520 Speaker 3: your audience is largely, you know, young females, and I 112 00:05:46,520 --> 00:05:49,400 Speaker 3: think that what's very important to consider contextually is the 113 00:05:49,400 --> 00:05:51,480 Speaker 3: fact that most of them have grown up with trigger 114 00:05:51,480 --> 00:05:53,960 Speaker 3: warnings everywhere. Yeah, so it's an expectation because it's all 115 00:05:54,000 --> 00:05:57,160 Speaker 3: they've had. All I'm asking for is for you and 116 00:05:57,240 --> 00:06:00,280 Speaker 3: everyone else who's creating content to respect the audience enough 117 00:06:00,320 --> 00:06:04,320 Speaker 3: to go: actually, we're going to follow the evidence and 118 00:06:04,480 --> 00:06:07,359 Speaker 3: ask and educate and bring you along on that journey 119 00:06:07,560 --> 00:06:10,520 Speaker 3: around whether or not this is something worth doing.
Secondly, 120 00:06:11,160 --> 00:06:15,359 Speaker 3: and this is from a clinical perspective, avoidance is really 121 00:06:15,400 --> 00:06:18,320 Speaker 3: harmful when it comes to things like PTSD, for instance. Yeah, 122 00:06:18,320 --> 00:06:22,560 Speaker 3: the best treatment around PTSD or something similar around trauma 123 00:06:22,680 --> 00:06:26,200 Speaker 3: is exposure: putting yourself in a situation where you're 124 00:06:26,200 --> 00:06:28,680 Speaker 3: going to be exposed to the situation again so that 125 00:06:28,720 --> 00:06:31,599 Speaker 3: you learn that it's not harmful. All of that is 126 00:06:31,640 --> 00:06:35,039 Speaker 3: to say, you cannot put up a trigger warning and 127 00:06:35,080 --> 00:06:38,760 Speaker 3: then put up harmful content. Yeah, the trigger warning doesn't 128 00:06:38,800 --> 00:06:41,000 Speaker 3: suddenly get rid of the responsibility for you to put 129 00:06:41,040 --> 00:06:44,120 Speaker 3: up safe, respectful content. In fact, what is happening, I 130 00:06:44,120 --> 00:06:48,080 Speaker 3: think, is the trigger warnings are providing a blanket "okay, 131 00:06:48,160 --> 00:06:51,000 Speaker 3: we've done our job now" for much of the media. Instead, 132 00:06:51,200 --> 00:06:53,359 Speaker 3: if we get rid of trigger warnings, we spend the 133 00:06:53,360 --> 00:06:56,719 Speaker 3: amount of time that is needed to actually ensure that 134 00:06:56,760 --> 00:07:00,000 Speaker 3: the content is safe. When we're talking about suicide, rape, 135 00:07:00,080 --> 00:07:03,080 Speaker 3: domestic violence, we use the right language, and then we 136 00:07:03,160 --> 00:07:06,240 Speaker 3: pump information at the end; we make really sure to 137 00:07:06,279 --> 00:07:08,880 Speaker 3: provide a lot of information that is going to support 138 00:07:08,920 --> 00:07:11,120 Speaker 3: those young people to get the help that they need.
139 00:07:11,480 --> 00:07:13,960 Speaker 3: But the fact that they're coming out and saying this 140 00:07:14,240 --> 00:07:16,640 Speaker 3: content is harmful, well, there are two things there. A 141 00:07:16,640 --> 00:07:19,200 Speaker 3: trigger warning is not making it less harmful, so we 142 00:07:19,240 --> 00:07:21,400 Speaker 3: need to consider what is harmful about the content and 143 00:07:21,440 --> 00:07:22,960 Speaker 3: how to go about eradicating it. 144 00:07:23,080 --> 00:07:24,440 Speaker 4: But it's giving them the choice. 145 00:07:24,480 --> 00:07:26,600 Speaker 3: But it's not giving them the choice. The trigger warning 146 00:07:26,600 --> 00:07:28,600 Speaker 3: doesn't give them the choice. What it does is it says, 147 00:07:29,040 --> 00:07:32,000 Speaker 3: and this is the evidence around trigger warnings, they trigger people. 148 00:07:32,720 --> 00:07:36,520 Speaker 3: They make anticipatory anxiety go through the roof. It makes 149 00:07:36,520 --> 00:07:39,240 Speaker 3: people go oh, and it gives you mental imagery. It 150 00:07:39,240 --> 00:07:41,800 Speaker 3: makes you start to go, what's coming, what's it going 151 00:07:41,840 --> 00:07:43,440 Speaker 3: to look like, how's it going to work? And it 152 00:07:43,600 --> 00:07:47,880 Speaker 3: increases anxiety. Rather than doing this thing, which I think 153 00:07:47,920 --> 00:07:51,000 Speaker 3: society needs to do, which is not over-fragilize society, 154 00:07:51,280 --> 00:07:55,600 Speaker 3: not, you know, suggest this over-therapization of everybody. Rather, 155 00:07:55,960 --> 00:07:58,400 Speaker 3: we need to go: you are resilient, you are able 156 00:07:58,440 --> 00:08:01,240 Speaker 3: to cope with this. We are putting in safe content, 157 00:08:01,480 --> 00:08:04,080 Speaker 3: and we're going to make sure that you get the 158 00:08:04,120 --> 00:08:06,520 Speaker 3: support that you need at the end.
But it's 159 00:08:06,600 --> 00:08:09,000 Speaker 3: not right for the media to decide, I don't believe, 160 00:08:09,320 --> 00:08:10,960 Speaker 3: what is triggering and what is not. 161 00:08:11,480 --> 00:08:14,680 Speaker 4: But I think, as the editor, if someone has gone 162 00:08:14,760 --> 00:08:18,000 Speaker 4: through trauma, who am I to tell them, well, I 163 00:08:18,040 --> 00:08:20,160 Speaker 4: actually know what's best for you, and it's for you 164 00:08:20,240 --> 00:08:23,680 Speaker 4: to be exposed to this content. I feel so uncomfortable 165 00:08:24,160 --> 00:08:25,960 Speaker 4: taking that approach, do you know what I mean? 166 00:08:26,080 --> 00:08:28,240 Speaker 3: One hundred percent. And I think that that's really to be 167 00:08:28,440 --> 00:08:31,640 Speaker 3: respected and valued. That question, and you questioning yourself around 168 00:08:31,640 --> 00:08:34,559 Speaker 3: what is safe and not safe content, is the pertinent 169 00:08:34,679 --> 00:08:38,320 Speaker 3: question here. But in deciding what is safe and not 170 00:08:38,440 --> 00:08:42,280 Speaker 3: safe for them around trigger warnings, you're making that call. 171 00:08:42,800 --> 00:08:44,720 Speaker 3: You're making that call about what is safe and what 172 00:08:44,840 --> 00:08:47,240 Speaker 3: is not safe. You're deciding what deserves the trigger warning 173 00:08:47,280 --> 00:08:49,680 Speaker 3: and what doesn't. I understand, when it comes to mental 174 00:08:49,760 --> 00:08:51,880 Speaker 3: health and wellbeing, that you feel like you have a 175 00:08:51,920 --> 00:08:55,960 Speaker 3: responsibility to look after them. And my response to that 176 00:08:56,080 --> 00:08:58,360 Speaker 3: is: you do, and the best way to look after 177 00:08:58,400 --> 00:09:00,560 Speaker 3: them is to make safe content and to get rid 178 00:09:00,640 --> 00:09:01,360 Speaker 3: of trigger warnings.
179 00:09:01,440 --> 00:09:03,960 Speaker 4: Is there nuance to it? Like, are there ever scenarios 180 00:09:04,000 --> 00:09:05,800 Speaker 4: where you think that they could be helpful? 181 00:09:05,880 --> 00:09:09,079 Speaker 3: I think when it comes to Aboriginal and Torres Strait Islanders, 182 00:09:09,080 --> 00:09:12,480 Speaker 3: when there have been depictions of somebody who has died, 183 00:09:12,480 --> 00:09:15,960 Speaker 3: it's very clear there are cultural regulations and suggestions about 184 00:09:15,960 --> 00:09:19,520 Speaker 3: how to go about that. When it comes to warnings 185 00:09:19,720 --> 00:09:24,000 Speaker 3: more broadly around rape, domestic violence, assault, that type of thing, 186 00:09:24,320 --> 00:09:25,880 Speaker 3: I think what we need to do is not have 187 00:09:25,960 --> 00:09:29,800 Speaker 3: blanket rules of yes on everything. Rather, we need more research, 188 00:09:29,800 --> 00:09:32,880 Speaker 3: which is a very reliable answer for any psychologist to say, 189 00:09:33,160 --> 00:09:38,560 Speaker 3: to go: in what contexts, for which people, are these useful? Instead, 190 00:09:38,840 --> 00:09:42,440 Speaker 3: at the moment we're just going scattergun everywhere: every piece 191 00:09:42,480 --> 00:09:45,480 Speaker 3: of content deserves this. And I think that you might 192 00:09:45,559 --> 00:09:48,360 Speaker 3: be safeguarding one person, I'm not suggesting that you're not, 193 00:09:48,800 --> 00:09:52,080 Speaker 3: but you may well not be looking after another thousand 194 00:09:52,160 --> 00:09:54,240 Speaker 3: or ten thousand people. And I think that we need 195 00:09:54,280 --> 00:09:57,280 Speaker 3: to realize that that one person should be getting the supports 196 00:09:57,320 --> 00:10:00,160 Speaker 3: that they need. But the trigger warning actually doesn't 197 00:10:00,200 --> 00:10:02,680 Speaker 3: tell them what to do.
It doesn't provide them with 198 00:10:02,720 --> 00:10:03,719 Speaker 3: any support. 199 00:10:03,480 --> 00:10:05,400 Speaker 4: But it does at the end of the piece, because 200 00:10:05,520 --> 00:10:08,680 Speaker 4: whenever there's a trigger warning, you also provide the helpline. 201 00:10:08,920 --> 00:10:11,640 Speaker 3: Perfect, and that's exactly what is required. Let's have more 202 00:10:11,679 --> 00:10:14,160 Speaker 3: of that, let's build in safety. But the trigger warning 203 00:10:14,200 --> 00:10:18,319 Speaker 3: doesn't do anything for them in that interim period, whether 204 00:10:18,400 --> 00:10:20,920 Speaker 3: they watch it or not. It doesn't offer 205 00:10:21,040 --> 00:10:24,120 Speaker 3: any choice around the way that the content is going 206 00:10:24,200 --> 00:10:26,400 Speaker 3: to be offered to them. Does that make sense? You 207 00:10:26,440 --> 00:10:28,320 Speaker 3: put in a trigger warning, you say we're going to 208 00:10:28,320 --> 00:10:30,040 Speaker 3: talk about rape, and then you do whatever you want 209 00:10:30,080 --> 00:10:32,520 Speaker 3: in the middle, and then you offer supports at the end. 210 00:10:32,559 --> 00:10:34,600 Speaker 4: But no, because I don't think it's fair to say 211 00:10:34,679 --> 00:10:37,720 Speaker 4: that just because we use trigger warnings means that then 212 00:10:37,840 --> 00:10:41,480 Speaker 4: in the whole post we just do irresponsible reporting; like, 213 00:10:41,520 --> 00:10:45,160 Speaker 4: we would never report on anything irresponsibly. But it just 214 00:10:45,320 --> 00:10:48,280 Speaker 4: means that the audience has a choice. 215 00:10:48,480 --> 00:10:51,640 Speaker 3: But what I'm suggesting is that they don't 216 00:10:51,640 --> 00:10:53,640 Speaker 3: have a choice about what you report already.
217 00:10:53,960 --> 00:10:56,000 Speaker 4: And it feels like, it sounds maybe like I'm not 218 00:10:56,000 --> 00:10:57,920 Speaker 4: listening to you. I know, that's fair, but it's more 219 00:10:58,040 --> 00:11:00,200 Speaker 4: just like, this is how it's been for so long, and 220 00:11:00,240 --> 00:11:02,560 Speaker 4: I think it's a really important conversation about how we 221 00:11:02,679 --> 00:11:06,359 Speaker 4: change it. But I'm obviously just stuck on the choice. 222 00:11:05,920 --> 00:11:10,680 Speaker 3: But it's a choice. It's the choice around what type 223 00:11:10,679 --> 00:11:13,760 Speaker 3: of content they're choosing to watch. It's the choice around 224 00:11:13,760 --> 00:11:16,120 Speaker 3: what type of content you decide to put a trigger 225 00:11:16,160 --> 00:11:19,040 Speaker 3: warning on. There are so many arbitrary, subjective decisions that 226 00:11:19,080 --> 00:11:21,360 Speaker 3: are being made here, and that's what I think should 227 00:11:21,360 --> 00:11:24,760 Speaker 3: be eradicated. Rather, there should be a really clear, evidence 228 00:11:24,800 --> 00:11:28,480 Speaker 3: based protocol: this stuff is very clearly going to 229 00:11:28,480 --> 00:11:31,120 Speaker 3: trigger this population who we know is watching this, so 230 00:11:31,160 --> 00:11:33,960 Speaker 3: we're going to build it in. Otherwise, you're just building 231 00:11:34,040 --> 00:11:36,880 Speaker 3: up anticipatory anxiety for people who were never going to 232 00:11:36,960 --> 00:11:40,080 Speaker 3: have it before, and for other people who might actually 233 00:11:40,080 --> 00:11:42,400 Speaker 3: benefit from watching this content, they're going to avoid it 234 00:11:42,400 --> 00:11:45,600 Speaker 3: because they're avoiding anxiety, which is understandable.
And I think, 235 00:11:45,880 --> 00:11:48,640 Speaker 3: and this is hard for any psychologist to say, a 236 00:11:48,640 --> 00:11:51,320 Speaker 3: bit of anxiety is useful. It's useful for them to 237 00:11:51,360 --> 00:11:54,400 Speaker 3: watch the content if it is safely reported, for them 238 00:11:54,440 --> 00:11:57,560 Speaker 3: to embrace their own resilience and realize that they 239 00:11:57,600 --> 00:11:58,280 Speaker 3: can get through it. 240 00:11:58,679 --> 00:12:01,160 Speaker 4: I have a question about A Star Is Born. I remember 241 00:12:01,160 --> 00:12:03,680 Speaker 4: when that came out. In the movie there is a 242 00:12:03,720 --> 00:12:07,400 Speaker 4: scene about suicide. There was a discussion about whether that 243 00:12:07,480 --> 00:12:10,080 Speaker 4: movie should have come with a trigger warning, because it 244 00:12:10,120 --> 00:12:14,120 Speaker 4: is quite confronting and they do go into some details. 245 00:12:14,120 --> 00:12:17,040 Speaker 4: And obviously they're not a media organization, so they don't 246 00:12:17,360 --> 00:12:21,360 Speaker 4: have the same obligation of responsible reporting. What do you 247 00:12:21,360 --> 00:12:23,360 Speaker 4: think about that movie? Like, should something like that have 248 00:12:23,400 --> 00:12:24,120 Speaker 4: a trigger warning? 249 00:12:24,360 --> 00:12:27,640 Speaker 3: So my response to that is: the media has really 250 00:12:27,679 --> 00:12:30,360 Speaker 3: clear regulations, and there are the Mindframe guidelines, which 251 00:12:30,400 --> 00:12:33,560 Speaker 3: everyone should follow, around how to talk about suicide. Film 252 00:12:33,600 --> 00:12:37,079 Speaker 3: and television have very similar guidelines. The fact that they 253 00:12:37,080 --> 00:12:39,400 Speaker 3: don't feel the responsibility to live up to them is 254 00:12:39,440 --> 00:12:43,160 Speaker 3: a totally different ballgame.
When it comes to showing depictions 255 00:12:43,160 --> 00:12:47,040 Speaker 3: of suicide in plays, in film, in television, they should 256 00:12:47,080 --> 00:12:50,720 Speaker 3: be in consultation with clinicians, with researchers, about how to 257 00:12:50,760 --> 00:12:52,719 Speaker 3: go about doing that. There are plenty of shows that 258 00:12:52,760 --> 00:12:55,560 Speaker 3: I've worked with in Australia who have asked for consultation, 259 00:12:55,760 --> 00:12:58,319 Speaker 3: who've gotten lived experience in to make sure that this stuff 260 00:12:58,360 --> 00:13:02,160 Speaker 3: is safe. That depiction, which shows method and location, is 261 00:13:02,200 --> 00:13:02,760 Speaker 3: not safe. 262 00:13:02,840 --> 00:13:05,960 Speaker 4: Do you remember that movie very clearly, and did it 263 00:13:06,000 --> 00:13:07,839 Speaker 4: follow responsible guidelines? 264 00:13:07,960 --> 00:13:10,280 Speaker 3: Not at all. And so no trigger warning was going to be safe there. 265 00:13:10,360 --> 00:13:11,800 Speaker 3: What it was going to do was amp 266 00:13:11,800 --> 00:13:13,120 Speaker 3: me up for the next hour and a half, and 267 00:13:13,160 --> 00:13:14,559 Speaker 3: then I was going to see it and I was 268 00:13:14,600 --> 00:13:18,280 Speaker 3: still going to feel shitty. So that's a perfect example 269 00:13:18,400 --> 00:13:21,360 Speaker 3: of: if you actually build in safe content, and you 270 00:13:21,480 --> 00:13:25,000 Speaker 3: make sure, and this is across media, television, anything, you 271 00:13:25,120 --> 00:13:27,320 Speaker 3: make sure that you are doing what is right by 272 00:13:27,320 --> 00:13:29,920 Speaker 3: the guidelines and the evidence, you shouldn't need a trigger 273 00:13:29,960 --> 00:13:32,360 Speaker 3: warning, because the trigger warning would not have safeguarded that 274 00:13:32,400 --> 00:13:32,920 Speaker 3: piece at all.
275 00:13:33,240 --> 00:13:36,360 Speaker 4: So we've just had a whole conversation about trigger warnings. 276 00:13:37,080 --> 00:13:39,880 Speaker 4: We've talked about suicide, for example. What do you think 277 00:13:39,880 --> 00:13:41,920 Speaker 4: that we should do at the start of this podcast? 278 00:13:42,120 --> 00:13:43,760 Speaker 3: I feel like I should record: this is not a 279 00:13:43,760 --> 00:13:48,199 Speaker 3: trigger warning, enjoy the podcast. But I think that at 280 00:13:48,200 --> 00:13:49,920 Speaker 3: the top of this podcast, I think that you should 281 00:13:49,960 --> 00:13:53,520 Speaker 3: say: we're going to be discussing the meaning of trigger warnings, 282 00:13:53,559 --> 00:13:55,559 Speaker 3: where they come from, and whether or not they should 283 00:13:55,559 --> 00:13:58,680 Speaker 3: be applied in all of our content. So suggesting that 284 00:13:58,720 --> 00:14:01,880 Speaker 3: this is going to be an educational situation rather 285 00:14:01,960 --> 00:14:03,120 Speaker 3: than a triggering one. 286 00:14:03,440 --> 00:14:05,679 Speaker 4: But isn't that still, even if it's not using the 287 00:14:05,760 --> 00:14:08,000 Speaker 4: term trigger warning, still a trigger warning? 288 00:14:08,200 --> 00:14:10,040 Speaker 3: It is. So, yeah, I think that there is a 289 00:14:10,080 --> 00:14:13,440 Speaker 3: way of talking about it that goes, we're going to... 290 00:14:13,920 --> 00:14:16,880 Speaker 3: It's so interesting, because even the second you 291 00:14:16,920 --> 00:14:19,560 Speaker 3: start to go, in this podcast we're going to discuss, 292 00:14:19,720 --> 00:14:22,480 Speaker 3: it immediately takes you to: we're going to have a trigger 293 00:14:22,480 --> 00:14:26,160 Speaker 3: warning here. Yeah, I've got to think about it. I don't 294 00:14:26,160 --> 00:14:26,880 Speaker 3: know what you would do. 295 00:14:27,080 --> 00:14:28,800 Speaker 4: Maybe we use the term content advice.
296 00:14:29,000 --> 00:14:32,680 Speaker 3: Yeah, yeah. Well, yeah: here is some content advice for 297 00:14:32,720 --> 00:14:34,080 Speaker 3: the podcast that's about to come. 298 00:14:34,280 --> 00:14:36,840 Speaker 4: Yeah. Lastly, we haven't even discussed the fact that you're 299 00:14:36,920 --> 00:14:40,840 Speaker 4: Zara's brother. I imagine that you two have had lots of 300 00:14:40,840 --> 00:14:43,720 Speaker 4: discussions about this. Why has The Daily Aus been 301 00:14:43,840 --> 00:14:45,240 Speaker 4: using trigger warnings from the start? 302 00:14:45,440 --> 00:14:48,040 Speaker 3: It's a great question. She doesn't listen to me, very clearly. 303 00:14:48,080 --> 00:14:51,120 Speaker 3: We've got older brother syndrome, which is terrifying. But 304 00:14:51,320 --> 00:14:54,240 Speaker 3: I think that I'm really excited to be here today 305 00:14:54,240 --> 00:14:56,720 Speaker 3: and to be able to discuss this. It's something that's 306 00:14:56,760 --> 00:14:58,960 Speaker 3: evolving, and I think that's the most important thing. And 307 00:14:59,000 --> 00:15:01,920 Speaker 3: you never want to, you know, jump in preemptively and 308 00:15:01,960 --> 00:15:04,080 Speaker 3: end up doing something unsafe, but let's just 309 00:15:04,120 --> 00:15:06,600 Speaker 3: have the conversation, because that's the way to really respect 310 00:15:06,800 --> 00:15:10,280 Speaker 3: the intelligence and the experience of your audience. 311 00:15:09,960 --> 00:15:12,000 Speaker 4: Doctor Zach Seidler, thank you so much for joining 312 00:15:12,080 --> 00:15:12,640 Speaker 4: The Daily Aus. 313 00:15:12,720 --> 00:15:13,360 Speaker 3: I'll see you soon. 314 00:15:13,520 --> 00:15:13,840 Speaker 1: Cheers. 315 00:15:16,960 --> 00:15:19,400 Speaker 2: Thanks for joining us on The Daily Aus.
If you 316 00:15:19,440 --> 00:15:21,480 Speaker 2: need someone to talk to, you can give Lifeline a 317 00:15:21,480 --> 00:15:25,560 Speaker 2: call on thirteen eleven fourteen. We'll be back again tomorrow morning, 318 00:15:25,640 --> 00:15:35,360 Speaker 2: but until then, have a brilliant day.