[00:00:00] Speaker 1: Hi team, it's Harps. It's Nicole, not "Little". It's definitely not "Little", it's "Laddell". So don't fuck that up, right, because she gets very angry. And I did just quadruple-check that before we went live, after knowing her for twenty years.

[00:00:16] Speaker 1: You know when you have a thought and you think, maybe, what if I've been getting someone's pronunciation wrong for twenty years? But I hadn't, so that's good. But imagine if your name wasn't Nicole but rather Nikola or something like that.

[00:00:31] Speaker 3: No, then it would probably be Nikola.

[00:00:35] Speaker 2: Well, no, it's got an E on the end.

[00:00:40] Speaker 1: Nicola Little, or it could be... let's fuck that off. That's a ridiculous conversation. How are you?

[00:00:46] Speaker 3: I'm good, I'm good. How are you?

[00:00:48] Speaker 2: I'm good. How's the, how's the PhD landscape? Are you coming up for oxygen every now and then? Are you drowning in a sea of academia? Okay, what is the update?

[00:01:03] Speaker 5: Well, today, because you know it changes, but today and this week, I'm at that point where my to-do list is so long that I'm spending more time writing a to-do list than I'm actually doing anything. And then you kind of get halfway through something and then you think, shit, I've got to do something else, and then, you know when you just get kind of frozen in that overwhelm. So that's where I'm at this week. But I'm pretty sure that tomorrow morning I'll get up and do some project management and get myself organized again and I'll be back on track.

[00:01:37] Speaker 3: But yeah, that's where it's at the moment.

[00:01:40] Speaker 1: I have this little ritual every morning, which is pursuant to your comments just then, you're onto it, and you've probably heard me say this.

[00:01:46] Speaker 2: But I don't say it exactly like this, but it's a form of this.
[00:01:51] Speaker 1: In terms of what I want to get done today, what needs to have my attention, in terms of what I want to do, be, create, change today... that's my little thing. What needs to have my attention? What's the best use of my time? What's the best use of my emotional energy? What's the best use of my focus? And it kind of, yeah, because you can start the day with twenty different things that you could actually pay attention to, and then at the end of the day you've done nothing. Nothing.

[00:02:18] Speaker 5: Well, yeah, that's pretty much how the last couple of weeks have gone. And I'm at that stage where there's, you know, five different balls in the air. So I've got, I'm interviewing participants for one study, and I'm getting ethics approval for the next study. I've been asked to be part of the Gender-Based Violence Task Force at the uni, so I've got to do some stuff for that. And so, yeah, so it's all good, but...

[00:02:49] Speaker 3: It's a lot.

[00:02:50] Speaker 2: It is.

[00:02:51] Speaker 1: Well, let's talk. I mean, we always go, what are we going to talk about, and we just end up chatting. Well, let's talk a little bit about efficiency. I feel like, you know, we've spoken before on this show about the busy, the busy badge. How are you? Oh, what's up, busy? How you going, busy?

[00:03:08] Speaker 2: How have you been, busy? How are things, busy?

[00:03:10] Speaker 1: It's like, it's like a social kind of a trophy, and we take pride in how busy we are, which doesn't necessarily correlate with efficiency or productivity or effectiveness. You know, I think that trying to figure out, like, what are my priorities, what needs to, like, what are the... what's the one, two, three...

[00:03:35] Speaker 2: Four?

[00:03:36] Speaker 1: ...things today that really need to be prioritized? Because it is so easy, especially if you are, like, probably you and me, which is, you know, you've got work, you've got study, you've got a life, you've got training, you've got lots of other commitments. We're both in a similar position at the moment. I'm just stumbling towards the finish line, as everyone knows. But there are just, every day... I could actually do forty hours of work a day if there were forty hours in a day and I had the energy, which I do not. But I could easily fill forty hours, because every night when I go to bed there's still a bunch of stuff, not half done, but there's still stuff to do tomorrow. And you think, even down to, like, sometimes, and this is bad, but sometimes two days will go by and I go, I haven't thought about my mum or dad for two days. I mean, they're always floating around my consciousness, you know. Yeah, I think I'm a fucking terrible son, you know. Or there's that thing where, like, I got a new dryer, a clothes dryer, the other day, and when I say the other day, four weeks ago, and I put the old one outside my side door next to one of my motorbikes, and I went, I'll get rid of that in the next couple of days. Well, that's still sitting there next to the motorbike, and it may have things growing on it very soon. Because, you know, when you go, yeah, that's not a good use of my time today, that's not a problem, that can just sit there. And I really want to make sure it's not still sitting there at Christmas as some kind of ornament in the garden. But yeah, I think that being prudent with our time and our energy, based on what we need to get done, is kind of pretty paramount when you're trying to, especially, do a PhD like you, or create some kind of outcome, you know, be it in business or something else.
[00:05:31] Speaker 5: Yeah, and that's interesting, because I've noticed a habit that I've had, and I think I've had it for a long time, but I catch myself doing it now and I stop myself. It's that when I start to get a little bit overwhelmed, for some stupid reason I start adding things to the list just to make myself feel worse. It's like, oh shit, I haven't even washed my car for months. But I didn't care that I hadn't washed my car, but suddenly that, on top of everything else, is a priority. And so I don't know why we do that, but I manage to catch myself doing it now, because it's almost like you're deliberately catastrophizing it and making it bigger, almost to justify some level of anxiety. So I catch myself doing that now, and I go, do you really care that your car's not washed?

[00:06:24] Speaker 3: I don't really care about it. Does it really matter, in the big scheme of things, that your car's not clean?

[00:06:31] Speaker 1: No, really. And in terms of, in terms of my priorities right now, is that going to move the needle on anything that really matters? No. So, you know, let's make sure it's got air in the tyres and petrol, let's make sure it's not falling apart. But if it's got a bit of dust and dirt and bird poop, it's probably, probably not going to derail your life.

[00:06:53] Speaker 3: Yeah.

[00:06:54] Speaker 1: For those who don't know Nicole, let me do just a quick rewind. So we've known each other for a long time. She lives in the Sunshine State. You live in the Sunshine State; I don't know why I'm talking about you in the third person. And that, do you know what, you probably don't, but that chat that we had where you were basically unpacking where your life was falling apart, and you just, you ended up starting uni. I don't know what number that was, we might post it in the show notes, but the last thing on your mind back then was really that you would become an academic.
[00:07:31] Speaker 1: I think it's like... and by the way, it works out you're really good at academia. You're a really good student, you're a really good teacher, you're a really good researcher. But it was almost fortuitous that you just ended up in this space, now, at fifty-two, kind of halfway, or a bit, I don't know, vaguely halfway through your PhD. And you started your undergrad degree, which became an undergrad and an honors degree, when you were forty-three, in twenty sixteen, and then you started your PhD too.

[00:08:05] Speaker 2: Well, you know, last year, in March. So almost like a year and a half ago, right.

[00:08:13] Speaker 1: But I love that, because so many of our listeners are somewhere in the ballpark of kind of thirty-five to sixty, and it's really nice to hear somebody who's maybe not... shout-out to our twenty-three-year-olds, we love our twenty-three-year-olds, but it's nice to hear somebody who's not twenty-three, who's excelling at fifty-two. And I know you wouldn't bring this up, but I'm bringing it up. You just got, you just did... did you just win an award, or get a grant, or did you just get a whole bunch of money that you can use to clean your car? What happened?

[00:08:48] Speaker 6: Yeah. So my current self is thanking my past self profusely, because one afternoon I just kind of sat here and looked for external scholarships.

[00:08:59] Speaker 5: I came across this scholarship that was opening in July this year. This was probably about six or twelve months ago, and I put it in my diary to follow up when it opened. So it came up in my calendar, I had a look at it, and I said to my supervisor, do you think I should apply for this? And she said, yeah, you would probably meet the criteria. And so I put in the application, not really giving it too much thought, just hoping something would happen. And it was a ten-thousand-dollar scholarship.
[00:09:33] Speaker 5: I got the phone call the week before last to say that I was the lucky recipient.

[00:09:37] Speaker 1: Wow, congrats, that's amazing. Yeah. How long did you feel, if you did, like you didn't really belong in academia, or like something of a fraud? Maybe you never felt that, though I'm sure you did at the start. Because this is a very common experience: people go into a new place, doing a new thing, and they feel unqualified, they feel unworthy, they feel all of these, you know, fear-based emotions. How long before you felt kind of like, I know I can do this and I belong here, this is where I'm meant to be?

[00:10:15] Speaker 5: Once I... it kind of has just, like... the feeling of competence just gets stronger and stronger, like, as the evidence starts to show up that I'm okay in this space. But I think, I think I was still... imposter syndrome. Well, I still do imposter syndrome. But I think I was still kind of doubting myself, especially through honors, because that's a really, that's a tough year. But I think once I got into the PhD, like, I got accepted, and, you know, the feedback that I got was... because, you know, when you apply for a PhD they put you in an order of merit, and because I worked in the research center for ages, I kind of got the feedback that I was quite high in the order of merit to be accepted into the PhD. Which really just blows me away, because the type of research we do, we're not really big stats-heavy, and stats was never my favorite stuff. But, you know, if something comes up, look, I know where to go to find the answer and to work it out. But yeah, so to answer your question, I think once I got accepted into the PhD, I thought, oh well, they think I can do it, so maybe I can. Because that's not just, you know, someone going... you know, someone that I know giving me a position. It's based on a whole heap of criteria, and an order of merit, and I did really well. So you have to earn your place, and I'd earned my place. So I think that was when I really started to realize that this was something that I could do well.

[00:12:24] Speaker 1: Isn't it funny how we tend to almost value what others think of us more than what we... we go, do you think I could do this? Oh yeah, yeah. And I'm like, oh, I must be able to, because Nicole said I could. And even worse than that, looking, in general terms, for approval and acceptance from people who actually don't even give a fuck about us, rather than, rather than approval from ourselves. You know, I'm so good at not approving of me. I'm so good, even now. But I have awareness around it, so I recognize what's going on. Yeah... do you think that... you and I were talking, sorry to interrupt, but you and I were talking about... oh, let's talk about that. Why do you think that is? Why do we pay attention to what others think of us more than we do? Like, why do we need approval from maybe people who don't know us, and sometimes people who, like, maybe don't hate us, but don't really care about us? But nonetheless we value that.

[00:13:35] Speaker 5: Well, I don't know. I think there's, there... it'll be a combination of a heap of things, and personal experience as well. I think there is definitely a healthy amount of self-doubt that we have to have. Because I think if we thought that we knew it all, then there's nothing to learn, and we're just obnoxious and hard to be around.
[00:13:57] Speaker 2: So, okay, I hit the brakes. Does it have to be self-doubt, or can it just be self-awareness?

[00:14:03] Speaker 5: I don't know. Awareness, yeah, or a level of... I want to say humility, but it's like that... one of my favorite theories is the Dunning-Kruger effect. You know, when you're an "expert", but you're not an expert, because you don't know enough about the subject to know that you don't know enough.

[00:14:23] Speaker 1: Oh, you know, this is what I get from people, right. I'm not an expert at anything, but, you know, if I do know a lot about anything, it's about exercise programming, right? And when people who don't know shit about exercise come and tell me stuff... and that sounds very arrogant, I don't mean it to... they don't know how little they know, and they're talking, they're talking to someone with four decades of experience owning gyms, working in gyms, writing programs, working with elite athletes. And then they start to tell me about this thing that they don't even understand themselves. And I don't want to shoot them to bits, but it's, it's a really... and I'm sure I've done that the other way around, to other people who know more than me.

[00:15:09] Speaker 3: Yeah, I think we've all been there.

[00:15:11] Speaker 5: But yeah, I think when you have that awareness that you don't... like, I think that's a valuable thing to have. Like, the more I learned, the more I realized how much I don't know. So yeah. So why do we... so your question was, why do we do that? I think part of it is nobody wants to be the know-it-all either. Like, so for me personally, I feel like, and I think a lot of people can relate to this, I probably was aware somewhere in my past that I was pretty clever. Yeah, but maybe being pretty clever didn't really get me... liked, or friends, or loved, or appreciated, or people would, you know...
[00:16:08] Speaker 5: So I think I had a fear, when I was younger, of coming across as a bit overconfident, right. Because nobody...

[00:16:17] Speaker 2: Like... so you learned to play small?

[00:16:20] Speaker 5: Yeah, I learned to play small, and I think I did that. And I know I'm not the only... I know that's not a unique thing. But there's something, there's a, there's a sweet spot of confidence that isn't arrogance. There's a sweet spot where you can, like... I now know that I've earned my place at the table, but I don't have to... there's no badge that goes along with it. There's just this, this quiet confidence, you know, that I can bring something to a conversation or whatever. But I was afraid to have that before. I don't know, and maybe it's because I don't look for the external validation as much as I used to.

[00:17:09] Speaker 1: Yeah. I think that, for me, a bit of insecurity, a bit of self-doubt, a bit of imposter syndrome keeps me inspired and motivated. Like, it keeps me hungry, it keeps me wanting to get better and do better. And I know that sounds cliche, but it actually does. Yeah, I would feel weird if I thought, I'm there now, like, I'm good. Yeah, I'm the smartest in the room, I'm this, you know. But you can simultaneously go, no, look, I'm pretty good at that, I've got some skills. That's not arrogance, that's just awareness, because, as you said, there's evidence, there's data. It's like that whole thing, which I've said before on this show, you know, where I can go... you know, like, I'm coming to Queensland this week, so Thursday night I'm doing a workshop, Saturday I'm doing a corporate gig.
[00:18:08] Speaker 1: And you know, when I stand up to do the corporate gig, or even the public workshop, I don't know what that will be, but it's probably somewhere in the thousands of times that I've done a version of that. And without any doubt, there will still be some fear, some anxiety, some underlying whispering voice that goes, well, this could be the time that you absolutely fuck it all up. And you know, there's that fearful self-doubt, or that fear-driven kind of inner dialogue, that's in some ways destructive and self-sabotaging. So that's the emotion. While also knowing, intellectually, well, I must be pretty good at this, because people pay to come, and they've come before, and/or I get booked and rebooked by the same organizations. So these two things, like the intellectual understanding that I'm not shit, can coexist with the feeling, ah, I am shit, you know. Like, the one doesn't have to disappear. So, I don't know. I think maybe you're right. A little bit of self-doubt, or a little bit of insecurity... I think as long as it's not fucking crippling or overwhelming, yeah, maybe it keeps us grounded.

[00:19:25] Speaker 5: Well, a classic example: next month I'm going to the Gold Coast to a conference, a Stop Domestic Violence conference, so the audience will be the DV sector. And my research is a novel approach to the problem. So I'm introducing systems thinking in an area where it's not really been introduced before, and I'm talking to practitioners that are on the ground, and I'm talking about a topic that every single person in that room will be extremely passionate about and emotionally tied to in some way. And I've not been on the ground in the same capacity as they have. So I'm very nervous about... I wouldn't say nervous, but I'm very mindful of not coming in and going, I've got the solution with this systems thinking, and of really kind of understanding the complexity of the problem, because I'm going to be in a room full of practitioners, basically. And so I am just quietly shitting myself, because I want to make sure that I, you know, introduce this in a way that is sensitive to their experiences as practitioners dealing with some pretty heavy and complex issues. So yeah. I have the confidence in knowing that what I bring to the table, you know, might be useful, or it might open up some conversation. But I also want to make sure that I am respectful in the way, and conscious in the way, that I deliver it, so that I'm not coming in and going, I've got all the answers to this really, really complex problem. So yeah, that's going to be...

[00:21:15] Speaker 1: Well, I think your safety net is your research and your results and your data. It's like, that's not an opinion, that's results. That's not what you think, that's what the research tells us, or your research tells us. So you're able to present science, you know. You're not getting up there saying, I'm not sure about all this, but let me tell you what I think, or let me tell you my experiences or my opinions. So this is a good opportunity for us to learn a little bit about your research. Just give us, like, an aerial overview of what it is, what you're exploring, and then let us go from there.

[00:22:00] Speaker 5: So we are... so I'll tell you a little bit about what systems thinking is first. So systems thinking, or systems science, is a way of making sense of complexity.
[00:22:12] Speaker 5: So the models and frameworks within systems thinking and systems science are applied to complex issues. And instead of looking at... so when I say systems, a good example is, like, the healthcare system, the defense force, you know, the rail transport systems. Anything that we operate within, anything that has rules and regulations and policies and structure and process and all that. So that's a system, right? And so rather than looking at certain components of a system, systems thinking looks at the whole system as the unit of analysis, and we look for interactions across the system, and we look for leverage points where we can implement change for maximum impact for, you know, the least amount of effort or expense. So the system that I'm looking at for my research is the use of technology, or how technology is used, for coercive control in the context of intimate partner violence.

[00:23:34] Speaker 4: Stop, stop. All right, sorry, I'm just, like... how technology... hang on... tech is used for coercive control. Yeah. Sorry, everybody, I just don't want to skip over this, because I think it's...

[00:23:48] Speaker 5: Okay. So, technology-facilitated coercive control.

[00:23:53] Speaker 3: So how... give us, give us an example.

[00:23:57] Speaker 5: It can be anything from threatening text messages to an intimate partner, all the way to surveillance, GPS monitoring, hacking into accounts and internet banking, doxxing, which is, you know, putting information online, stalking, online stalking... anything, anything that uses technology to perpetrate coercive control.

[00:24:28] Speaker 1: Like threatening to share intimate videos and all that. That was in the news recently, some... a couple of people, where their partners were putting up... they broke up, and then... That's a really fascinating, and I would think bigger than people would think, area. I mean...

[00:24:49] Speaker 5: Oh, massive. Yeah, I... I didn't... yeah, like, even I was surprised. The technology, we cannot function without it in our everyday lives, and so it just stands to reason that... so we use technology as an extension of our normal behaviors. So if somebody is coercively controlling, or is perpetrating abuse in a relationship, then technology is naturally going to be part of that in today's world. And it escalates when relationships end, or if, you know, if a victim-survivor leaves an abusive relationship, then sometimes what they experienced within the relationship, person to person... the technology just becomes the, you know, the choice, yeah, to perpetrate abuse. And the problem with it is, it becomes... so perpetrators then have this, or victim-survivors have this, sense of omnipresence. So they can't leave the house without, you know... because there's tracking devices on you. You can get a new phone, but as soon as you log in with your Google... you know, if the tracking software is tied to your Google account or your iCloud, then the device itself, it doesn't matter. It's the software you're using. And, you know, trying to find out where the, where it is located is really difficult. So it's a big problem. And so my PhD is looking specifically at the systemic responses to it. So how are we responding when someone is experiencing... we call it TFCC for short, so tech-facilitated coercive control. So when someone is experiencing TFCC, how are we systemically responding to it, and where are the leverage points in the system where we can enact some change, for some impactful interventions, for better responses, to keep people safe?

[00:27:14] Speaker 1: And, I mean, this is... and I guess with AI, this is like a wave just cascading. This is like a tsunami of technology that's approaching an unsuspecting town of people, you know, just making their sandwich at the kitchen bench.
[00:27:35] Speaker 1: They don't know what's coming. But I feel like, with AI, that's only going to make the problem bigger: the control, and the ability to do horrible shit, easier in a way for people with an agenda, and then tracking those people who are doing that horrible shit... Like, yeah, I feel like the volume's going up on this. What are... like, what is in place currently, or what are some of the things that are in place, or maybe being explored, to even begin to mitigate or deal with this stuff?

[00:28:13] Speaker 5: Well, at the moment... and you know, so I'm interviewing victim-survivors at the moment for the first study, because what I want to do first is find out what their experiences are while seeking help. So not so much their experience specifically around the abuse, although you can't talk about the response without talking about the abuse, but I'm trying to get specifically what sort of help they've received. And it's really varied. And you know, there are some services out there that cost money. There are some that are... if you're referred to the right service, you can get grants and things like that to help you. But you know, there are some forensic, digital forensic, services that will come out and do a sweep of your home, or, you know, sweep your car. Police have pretty limited knowledge around the depth of it. So, you know, a policeman can come out and shine a torch around your car and try and find a tracking device, but you really need specialized people to come and really go through your devices and stuff like that. So there is that. But the trouble, what I'm finding talking to victim-survivors now, is that they either don't have access to that, or they don't know about it, or... yeah, it really depends on who they talk to. It's pot luck. If you get the right person to talk to, who's going to refer you to the right services, then you might get some service.
[00:29:42] Speaker 5: But I've got one, you know, one participant who spends all of her income on court fees, because she's, you know, going through the court system, the family and criminal court system, with her perpetrator. So she hasn't got the funds to... and he's still monitoring her, but she hasn't got the funds to... So it's a, it's a big problem.

[00:30:07] Speaker 3: Yeah.

[00:30:08] Speaker 1: And also, I mean, then you throw into that the terror and the sleepless nights and the anxiety, I mean, obviously. And yeah, it's just, like, it's multi-layered. There are so many, I would imagine, there are so many variables to this. And yeah... and that's right, the... I would imagine, and this is no poor reflection on the police, but firstly, they're under-resourced in every state in Australia, I feel like. Definitely in Victoria. And it's, where do you start with this? Because there's no... this is, you know, if somebody's punching someone in the face, it's immediate. Oh, there's a policeman or woman there, we can deal with it, you know, here's somebody. But with stuff like this, there's not even a kind of quick fix, just a complicated, slow fix, I would think.

[00:31:04] Speaker 5: Well, and we need to move away from the idea of responding to an incident, which is what, you know... like, a couple has a fight, someone gets physical or violent, the police respond to that incident. We need to be responding to the pattern of behavior. Because sometimes, you know... and it's complicated. There's, you know, there's situational couple violence, where, you know, there's actually no coercive control happening in that relationship, but both parties had a few too many to drink and they've had an incident, and there's no ongoing controlling behavior. So, you know, it's difficult for police when they respond to a call-out.
[00:31:57] Speaker 5: Are they responding to just a situational couple-violence incident, or are they stepping into, you know, a complex issue of control across the whole relationship? So, yeah.

[00:32:11] Speaker 1: Can I ask... and I know it's baby steps, and it's early days, and you've just kind of, you've just put the key in the car and you're just hitting second gear on the bloody academic highway, right, I know that. But can you tell us what you do know now that you didn't? Like, have there been any outcomes? What is your research... and it doesn't need to be conclusive, or extrapolated across all kinds of spectrums, but, like, what can you tell us that you now know that you didn't, you know, a year and a half ago?

[00:32:50] Speaker 5: Yeah. So, from a systems thinking perspective, my literature review involved having a look... we were looking at factors that influence tech-facilitated abuse, and applying systems thinking frameworks, what we found was that a lot of the research, or most of the research, has identified factors that influence the behavior at what we call the lower levels of the system. So it's kind of like at that perpetrator-victim interface. So they're looking at attachment issues, or personality traits, and things like that. So most, or a lot, of the literature is looking at those lower levels, and those factors really pertain to behaviors or characteristics of individuals. But there was very little as far as the higher levels, you know, so policing, you know, policy, government, all of those higher levels of the system. There weren't a lot of factors relating to those higher levels, and the ones that did, that were sitting at those higher levels, really related more to responses. So I was, I was wanting to look at what enables the abuse, but what we found was that at the higher levels of the system it was really the lack of response, or the inadequate responses, that was enabling the abuse to continue. So that's why my PhD, kind of, after that literature review... that was when we realized we really need to look at how we are responding, from a systemic perspective. Because we already know a lot around, you know, attachment issues, you know, childhood history, family of origin, history of violence. We know quite a lot around that. But we don't know very much at all around how the, how the current responses are actually creating more opportunity for abuse to occur.

[00:35:03] Speaker 1: Is there a... is there a clear line between just someone who's really fucking annoying, and texts, and doesn't read the signs, and, you know, thirteen messages a day, it's not threatening particularly, it's not intimidating per se, but it's just really fucking unwanted and annoying... and what qualifies as coercive control? Like, is it, is it like there's a particular threat or demand, or is there some kind of line that is crossed?

[00:35:41] Speaker 5: So, coercive control is... and, like, I'm really grateful for the changes in legislation, and I know that it's difficult to police, and there's a lot of people that are saying, you know, what's the point. The point, for me, is people are talking about it, and there are a lot of people that now understand, or are at least asking the question, what is coercive control, you know, because they've never even heard the term before. So the definition of it... and the definition varies, as is often the case, but it's a pattern of behaviors that is designed to dominate or control another person. So one of the, or two of the, key elements to coercive control is understanding intent and harm.
So there are 620 00:36:37,239 --> 00:36:42,160 Speaker 5: behaviors that are a little bit controlling, you know, and 621 00:36:42,239 --> 00:36:44,640 Speaker 5: especially when you look at young people in relationships and 622 00:36:44,640 --> 00:36:46,920 Speaker 5: they've got these Find My Friends apps and, you know, 623 00:36:46,960 --> 00:36:50,960 Speaker 5: they're tracking each other. Those behaviors can be quite innocuous. 624 00:36:53,800 --> 00:36:59,239 Speaker 5: But it's intent, it's the intention of controlling, and it's 625 00:36:59,280 --> 00:37:04,440 Speaker 5: the harm that is being caused. So you can't talk 626 00:37:04,440 --> 00:37:07,640 Speaker 5: about coercive control without talking about intent and harm. 627 00:37:08,440 --> 00:37:15,520 Speaker 2: Mmmm. Wow. And do you... where, 628 00:37:16,200 --> 00:37:18,680 Speaker 1: who do you sit down with? How do you, like... 629 00:37:18,760 --> 00:37:21,880 Speaker 1: Are people forthcoming for your research? Are people putting up 630 00:37:21,920 --> 00:37:25,120 Speaker 1: their hand and saying...? Do people want to talk about it? 631 00:37:25,160 --> 00:37:28,919 Speaker 1: Are people comfortable? Uncomfortable? Is it all done... I guess 632 00:37:28,920 --> 00:37:32,920 Speaker 1: it's all done anonymously, for obvious reasons. But are you 633 00:37:33,000 --> 00:37:36,640 Speaker 1: finding it easy to get, you know, people for your research? 634 00:37:37,560 --> 00:37:37,920 Speaker 3: Yeah. 635 00:37:37,960 --> 00:37:42,160 Speaker 5: So, because we're looking at systemic responses, I'm not, like, 636 00:37:42,480 --> 00:37:45,600 Speaker 5: I'm not looking at the psychological side of coercive control, 637 00:37:45,680 --> 00:37:51,920 Speaker 5: and I'm not looking into, you know, individual behaviors and 638 00:37:52,040 --> 00:37:57,120 Speaker 5: traits and characteristics. We're really focusing on responses. So we're 639 00:37:57,280 --> 00:38:01,000 Speaker 5: really focusing on the systems perspective 640 00:38:01,800 --> 00:38:02,799 Speaker 3: of the responses. 641 00:38:02,920 --> 00:38:07,520 Speaker 5: So I have... I was really surprised at the 642 00:38:08,320 --> 00:38:09,400 Speaker 5: response that I got. 643 00:38:09,480 --> 00:38:14,560 Speaker 3: And what I have found is that the women that 644 00:38:14,600 --> 00:38:19,480 Speaker 3: I've spoken to have had... so, 645 00:38:19,800 --> 00:38:22,800 Speaker 5: one of the impacts of coercive control is really feeling 646 00:38:22,960 --> 00:38:30,799 Speaker 5: completely isolated, helpless, defeated. You know, that you have no 647 00:38:31,200 --> 00:38:34,280 Speaker 5: control of your life because someone else has taken control. 648 00:38:34,400 --> 00:38:39,840 Speaker 5: So the feedback that I've gotten from a huge percentage 649 00:38:39,840 --> 00:38:42,759 Speaker 5: of the women that I've spoken to is they're more 650 00:38:42,800 --> 00:38:46,560 Speaker 5: than happy to participate in the research because it gives 651 00:38:46,640 --> 00:38:47,000 Speaker 5: them a 652 00:38:47,000 --> 00:38:47,920 Speaker 3: sense of... 653 00:38:50,280 --> 00:38:52,120 Speaker 2: The validation, being heard. 654 00:38:52,160 --> 00:38:56,120 Speaker 3: And being heard. But also, you know, one of the questions
655 00:38:55,800 --> 00:38:58,000 Speaker 5: I ask is: what advice would you give women in 656 00:38:58,040 --> 00:39:01,240 Speaker 5: the future who are seeking help for tech-facilitated abuse? 657 00:39:02,680 --> 00:39:05,840 Speaker 5: And so they're having a voice and helping with a 658 00:39:05,880 --> 00:39:11,640 Speaker 5: problem that they felt helpless in. Because I was 659 00:39:11,719 --> 00:39:15,759 Speaker 5: really concerned, you know, as I should be as a researcher, 660 00:39:17,480 --> 00:39:20,040 Speaker 5: I wanted to make sure that the conversations I was 661 00:39:20,080 --> 00:39:24,239 Speaker 5: having were not going to retraumatize anybody. But what I 662 00:39:24,280 --> 00:39:28,759 Speaker 5: have found is that it is actually giving people a 663 00:39:28,840 --> 00:39:35,080 Speaker 5: sense of control over the whole situation, because they're contributing 664 00:39:35,760 --> 00:39:39,520 Speaker 5: to making a change, you know, which also 665 00:39:40,000 --> 00:39:42,000 Speaker 5: puts a little bit of pressure on the researcher. But 666 00:39:42,120 --> 00:39:44,480 Speaker 5: I think, not that I need to fix the problem 667 00:39:44,560 --> 00:39:47,480 Speaker 5: for them, but it means that what I'm doing matters 668 00:39:47,760 --> 00:39:50,759 Speaker 5: and it has meaning, and you know, if 669 00:39:50,920 --> 00:39:55,279 Speaker 5: we can make even a small difference to how 670 00:39:55,320 --> 00:39:58,239 Speaker 5: we respond, then that's good. 671 00:39:59,560 --> 00:40:04,239 Speaker 1: Would part of the... I'm just thinking now about, you know, 672 00:40:04,320 --> 00:40:06,440 Speaker 1: I was going to ask you, so what's the solution? 673 00:40:06,640 --> 00:40:09,879 Speaker 1: Obviously that's a dumb question, but moving towards, like, how 674 00:40:09,960 --> 00:40:14,759 Speaker 1: do we address this practically? Systems thinking, I get it, well, 675 00:40:14,880 --> 00:40:17,400 Speaker 1: kind of get it. Thanks for teaching me. But in 676 00:40:17,520 --> 00:40:22,400 Speaker 1: terms of moving towards something, which is where we're hopefully 677 00:40:22,440 --> 00:40:24,719 Speaker 1: shining a little bit of a light in the darkness 678 00:40:24,719 --> 00:40:27,239 Speaker 1: of all of this, that gives some people hope and 679 00:40:27,280 --> 00:40:31,800 Speaker 1: some people answers or protection... do you have anything 680 00:40:31,840 --> 00:40:33,480 Speaker 1: in the back of your head? And I know this 681 00:40:33,560 --> 00:40:35,680 Speaker 1: is probably getting ahead of where you are right now, 682 00:40:35,719 --> 00:40:39,160 Speaker 1: and I don't mean this in terms of your research, 683 00:40:39,239 --> 00:40:43,239 Speaker 1: but just you as a researcher thinking: what would you 684 00:40:43,440 --> 00:40:48,000 Speaker 1: like to see happen, potentially, in the future, that might 685 00:40:48,800 --> 00:40:52,880 Speaker 1: help the probably thousands, tens of thousands, hundreds of thousands maybe, of 686 00:40:53,000 --> 00:40:57,080 Speaker 1: people in Australia and beyond who are dealing with this? 687 00:40:57,400 --> 00:40:58,960 Speaker 2: Like, is there anything...
688 00:41:00,520 --> 00:41:04,640 Speaker 1: Like... surely you've thought about... well, maybe. 689 00:41:05,560 --> 00:41:09,000 Speaker 1: But it's like when I'm talking about stuff like, for example, 690 00:41:09,000 --> 00:41:13,279 Speaker 1: with metaperception, meta-accuracy, metacognition, it's realizing that most people, you know... 691 00:41:13,320 --> 00:41:15,399 Speaker 1: so in my stuff, I'm thinking, all right, well, most 692 00:41:15,440 --> 00:41:18,759 Speaker 1: people don't understand how other people think. Most people don't 693 00:41:18,840 --> 00:41:22,160 Speaker 1: understand what they're like to be around, most people, you know, 694 00:41:22,239 --> 00:41:24,360 Speaker 1: all that stuff. And then I identify that and 695 00:41:24,360 --> 00:41:27,560 Speaker 1: I go, well, not what's the answer, 696 00:41:27,560 --> 00:41:30,040 Speaker 1: but what's something that we could do to maybe move the 697 00:41:29,960 --> 00:41:30,640 Speaker 2: needle on this? 698 00:41:31,440 --> 00:41:34,880 Speaker 1: How do I approach this where it might create 699 00:41:35,000 --> 00:41:38,480 Speaker 1: some kind of positive impact regarding this issue, in this 700 00:41:38,640 --> 00:41:42,520 Speaker 1: place, with these people? And I know you're far away 701 00:41:42,520 --> 00:41:46,640 Speaker 1: from that, and I think this is a very complicated thing. 702 00:41:47,120 --> 00:41:50,120 Speaker 1: It needs to be dealt with, so I don't have 703 00:41:50,160 --> 00:41:52,160 Speaker 1: to have a question in there, but maybe you've got 704 00:41:52,160 --> 00:41:53,160 Speaker 1: some thoughts. 705 00:41:53,320 --> 00:41:56,960 Speaker 5: Well, from a systems perspective, we don't know yet, 706 00:41:57,000 --> 00:41:59,200 Speaker 5: but what we are going to do is take the... 707 00:42:00,600 --> 00:42:04,040 Speaker 5: We'll be developing, like, almost like an aggregated journey 708 00:42:04,040 --> 00:42:09,240 Speaker 5: map of the experiences of a victim-survivor while seeking help, 709 00:42:10,040 --> 00:42:12,279 Speaker 5: and then we'll be doing some... so the 710 00:42:12,320 --> 00:42:15,880 Speaker 5: next three studies after that will actually be interactive workshops 711 00:42:15,880 --> 00:42:22,919 Speaker 5: with stakeholders, so, you know, police, legal professionals, digital software developers, 712 00:42:23,280 --> 00:42:27,279 Speaker 5: even telecommunications companies, because, you know, they 713 00:42:27,320 --> 00:42:30,480 Speaker 5: play a big part in it as well. So we'll be 714 00:42:30,560 --> 00:42:32,520 Speaker 5: taking the journey map to them, and then we want 715 00:42:32,560 --> 00:42:34,600 Speaker 5: to have a look, and we'll be applying some systems 716 00:42:34,600 --> 00:42:37,960 Speaker 5: thinking methods and frameworks, and, you know, after two or 717 00:42:37,960 --> 00:42:44,400 Speaker 5: three more studies, we'll be developing some recommendations for interventions 718 00:42:44,719 --> 00:42:47,600 Speaker 3: across the system. Right, right. So there's that. 719 00:42:48,000 --> 00:42:50,719 Speaker 5: I don't know what they will be yet, because we 720 00:42:50,760 --> 00:42:55,160 Speaker 5: don't know where the leverage points are yet.
So that's... 721 00:42:55,200 --> 00:42:57,719 Speaker 5: I'm excited to kind of move through that, because it's 722 00:42:57,800 --> 00:42:59,920 Speaker 5: an approach that's not been applied 723 00:43:00,080 --> 00:43:03,440 Speaker 5: before, so I'm hoping that some novel kinds of 724 00:43:03,480 --> 00:43:06,759 Speaker 5: interventions and some new ideas come out of that. So 725 00:43:06,840 --> 00:43:10,600 Speaker 5: there's that as far as responses, and then 726 00:43:10,640 --> 00:43:18,759 Speaker 5: there's addressing the issue, you know, primary intervention aimed at 727 00:43:18,840 --> 00:43:23,319 Speaker 5: changing attitudes and, you know, teaching young people healthy relationships 728 00:43:23,320 --> 00:43:24,080 Speaker 5: and things like that. 729 00:43:25,560 --> 00:43:27,920 Speaker 3: But yeah, I think there's 730 00:43:27,760 --> 00:43:33,920 Speaker 5: a lot being done in that space already. It's a 731 00:43:33,920 --> 00:43:35,640 Speaker 5: public health campaign, really. 732 00:43:36,440 --> 00:43:36,840 Speaker 2: Yeah. 733 00:43:37,160 --> 00:43:41,160 Speaker 1: I feel like also, and remember, everyone, this is just 734 00:43:41,200 --> 00:43:46,560 Speaker 1: a chat, and I have thrown Nicole under the bus. 735 00:43:47,160 --> 00:43:48,840 Speaker 2: Yeah, what pressure? What pressure? 736 00:43:50,080 --> 00:43:50,160 Speaker 7: No? 737 00:43:50,239 --> 00:43:53,600 Speaker 1: But I think also, you know, like, obviously your research 738 00:43:53,640 --> 00:43:58,120 Speaker 1: has focused on intimate relationships, but we also see, you know, 739 00:43:58,320 --> 00:44:02,839 Speaker 1: tech-facilitated abuse in the workplace. 740 00:44:02,920 --> 00:44:03,839 Speaker 2: We see it in... 741 00:44:05,000 --> 00:44:08,960 Speaker 1: Like, I reckon, I don't know how many, 742 00:44:09,000 --> 00:44:11,760 Speaker 1: I'm guessing, but maybe twenty people who are no longer 743 00:44:11,880 --> 00:44:16,080 Speaker 1: on either any social media or one particular platform, because 744 00:44:16,080 --> 00:44:20,520 Speaker 1: they just got so much abuse, so much hate, and 745 00:44:20,560 --> 00:44:22,719 Speaker 1: they were so fearful, they were so anxious, there was 746 00:44:22,800 --> 00:44:26,080 Speaker 1: so much negative shit around, that they just got off. 747 00:44:26,239 --> 00:44:30,400 Speaker 1: And, you know, for some of them it actually 748 00:44:30,520 --> 00:44:33,160 Speaker 1: is in their interest to have a social media presence, 749 00:44:33,200 --> 00:44:35,960 Speaker 1: because of their business or because of their career or whatever, 750 00:44:37,440 --> 00:44:41,000 Speaker 1: but they're not doing it, because when you 751 00:44:41,600 --> 00:44:44,560 Speaker 1: do a cost-benefit analysis, it just ain't 752 00:44:44,560 --> 00:44:47,919 Speaker 1: worth it, like the emotional and psychological cost of being 753 00:44:47,960 --> 00:44:52,600 Speaker 1: on social media, for some people. You know, it's just... 754 00:44:52,800 --> 00:44:56,759 Speaker 1: and so I think it's like, yes, obviously specific to 755 00:44:56,800 --> 00:44:59,520 Speaker 1: your research, intimate couples, but it's something that 756 00:45:00,880 --> 00:45:01,520 Speaker 2: is, I think... 757 00:45:02,280 --> 00:45:05,040 Speaker 1: I mean, it's just going to expand, because technology is growing, 758 00:45:05,160 --> 00:45:07,719 Speaker 1: AI is growing.
I mean, I wonder, you know, 759 00:45:07,840 --> 00:45:10,400 Speaker 1: all the stuff, all the different forms of abuse that 760 00:45:10,440 --> 00:45:14,719 Speaker 1: are going to, you know, present themselves over the next 761 00:45:14,800 --> 00:45:19,239 Speaker 1: year or ten, that come by way of artificial intelligence, 762 00:45:19,400 --> 00:45:24,239 Speaker 1: like being able to manipulate and control and influence and 763 00:45:24,400 --> 00:45:28,800 Speaker 1: terrify people using greater and greater, in inverted commas, 764 00:45:29,239 --> 00:45:32,800 Speaker 2: you know, technology. Well, and one 765 00:45:32,640 --> 00:45:36,480 Speaker 3: of the things that really concerns me personally 766 00:45:37,480 --> 00:45:44,120 Speaker 5: is AI, because, like, we feed AI, right? So 767 00:45:44,200 --> 00:45:48,480 Speaker 5: AI is a virtual world based on 768 00:45:48,560 --> 00:45:54,080 Speaker 5: our physical world, and we are completely imperfect, 769 00:45:55,520 --> 00:45:59,120 Speaker 5: you know, we live in a patriarchal society, 770 00:45:59,239 --> 00:46:04,279 Speaker 5: there's gender biases, there's all of this 771 00:46:04,239 --> 00:46:06,319 Speaker 3: stuff being fed into AI. 772 00:46:07,920 --> 00:46:12,600 Speaker 5: And so AI outputs will have those biases built in. 773 00:46:14,320 --> 00:46:17,040 Speaker 3: So, you know, that's a concern. 774 00:46:17,840 --> 00:46:21,399 Speaker 5: So, you know, there's so many layers to this, 775 00:46:21,840 --> 00:46:24,480 Speaker 5: and, you know, you can't talk about 776 00:46:24,920 --> 00:46:27,839 Speaker 5: coercive control and intimate partner violence without talking about 777 00:46:28,000 --> 00:46:33,480 Speaker 5: gender-based violence, because it is a gendered issue. It 778 00:46:33,560 --> 00:46:37,120 Speaker 5: is a gendered issue, and, you know, I've had this 779 00:46:37,239 --> 00:46:40,760 Speaker 5: argument with so many people: is it really, though? Well, yes, 780 00:46:42,320 --> 00:46:46,080 Speaker 5: it actually is, you just have to look at the statistics. Yes. 781 00:46:48,040 --> 00:46:51,759 Speaker 5: So AI concerns me in that regard. So, 782 00:46:52,000 --> 00:46:53,520 Speaker 5: you know, we can be doing 783 00:46:53,560 --> 00:46:56,200 Speaker 5: all of this stuff over here as researchers and coming 784 00:46:56,239 --> 00:46:59,040 Speaker 5: up with interventions and trying to change attitudes, 785 00:46:59,239 --> 00:47:01,919 Speaker 3: but if the horse has already bolted 786 00:47:01,600 --> 00:47:05,160 Speaker 5: with AI, then no amount of changing attitudes here is 787 00:47:05,200 --> 00:47:08,960 Speaker 5: really going to... We're always going to be behind the eight ball. 788 00:47:09,280 --> 00:47:10,440 Speaker 3: So that concerns me. 789 00:47:11,200 --> 00:47:14,960 Speaker 2: Do you think part of... well, I don't know.
790 00:47:15,040 --> 00:47:19,280 Speaker 1: I feel like, and this is different to intimate relationships, 791 00:47:19,320 --> 00:47:21,560 Speaker 1: but I think a lot of the abuse... the abuse 792 00:47:21,640 --> 00:47:26,400 Speaker 1: is, you know, rampant, because people can do it, to 793 00:47:26,480 --> 00:47:30,520 Speaker 1: an extent, anonymously, right? They can do it, you know... 794 00:47:30,640 --> 00:47:35,480 Speaker 1: and some people do it, you know, openly. 795 00:47:35,800 --> 00:47:38,359 Speaker 1: But there are so many people that are hateful and 796 00:47:38,440 --> 00:47:43,040 Speaker 1: hurtful and fucking evil, who use fake names and all of 797 00:47:43,040 --> 00:47:47,200 Speaker 1: that stuff, because, you know, there's no... and then, like, 798 00:47:47,480 --> 00:47:50,000 Speaker 1: it's really not even worth the time and effort, for... 799 00:47:50,480 --> 00:47:54,040 Speaker 1: unless they're committing a crime, you know, who's going to 800 00:47:54,080 --> 00:47:57,640 Speaker 1: do anything about that? Because it's just such an overwhelm 801 00:47:57,880 --> 00:48:00,960 Speaker 1: of that. You know, it's like, that happens billions of 802 00:48:01,000 --> 00:48:03,600 Speaker 1: times a day around the world. 803 00:48:03,640 --> 00:48:06,160 Speaker 2: So where do you even begin to address that stuff? 804 00:48:06,200 --> 00:48:09,480 Speaker 5: I mean, well, this is where, you know, one of 805 00:48:09,480 --> 00:48:14,560 Speaker 5: the principles of systems thinking is safety, 806 00:48:15,280 --> 00:48:18,560 Speaker 5: because, you know, a lot of the systems thinking frameworks 807 00:48:18,560 --> 00:48:21,200 Speaker 5: were designed for safety. So we do a lot of 808 00:48:21,239 --> 00:48:25,319 Speaker 5: safety research around, you know, aviation safety, rail level 809 00:48:25,360 --> 00:48:27,840 Speaker 5: crossing safety and that sort of thing. So we've just 810 00:48:27,960 --> 00:48:32,120 Speaker 5: kind of moved it over to personal safety. But one 811 00:48:32,160 --> 00:48:36,680 Speaker 5: of the tenets is that safety is a shared responsibility. 812 00:48:37,239 --> 00:48:42,320 Speaker 5: So every single player in the system, from the tech 813 00:48:42,400 --> 00:48:48,800 Speaker 5: companies to the software to the social media companies, everyone 814 00:48:48,880 --> 00:48:51,040 Speaker 5: has a shared responsibility for safety. 815 00:48:53,960 --> 00:48:57,040 Speaker 1: So I think that's... theoretically, but how the fuck 816 00:48:57,080 --> 00:48:57,880 Speaker 1: do you enforce that? 817 00:48:58,000 --> 00:49:00,560 Speaker 2: How do you...? I mean, I'm with you, but good 818 00:49:00,640 --> 00:49:01,799 Speaker 2: luck in that. 819 00:49:01,760 --> 00:49:03,400 Speaker 3: Yeah, I mean, and that's the challenge. 820 00:49:03,440 --> 00:49:06,479 Speaker 5: That's the challenge, and that's where I'm 821 00:49:06,520 --> 00:49:11,960 Speaker 5: hoping to find some, you know, effective leverage points across 822 00:49:11,960 --> 00:49:14,480 Speaker 5: the system, where we can kind of pull in some 823 00:49:14,640 --> 00:49:20,640 Speaker 5: of those stakeholders to accept the shared responsibility, yeah, and 824 00:49:20,719 --> 00:49:24,840 Speaker 5: that they have responsibility, or policies or laws or, you know, 825 00:49:24,960 --> 00:49:29,240 Speaker 5: legislation put in place, so that the responsibility is shared.
826 00:49:31,239 --> 00:49:34,560 Speaker 1: I think, sorry to interrupt, I think, like, with 827 00:49:34,640 --> 00:49:37,960 Speaker 1: a lot of these things, even with a lot of 828 00:49:38,000 --> 00:49:40,120 Speaker 1: the... I don't even want to mention what it is, 829 00:49:40,160 --> 00:49:42,040 Speaker 1: but, like, a lot of the shit that we're seeing, 830 00:49:42,360 --> 00:49:44,160 Speaker 1: a lot of the crimes that we're seeing in and 831 00:49:44,160 --> 00:49:49,120 Speaker 1: around Melbourne, and I'm sure it happens in Queensland, like, 832 00:49:49,360 --> 00:49:53,120 Speaker 1: violence with all groups... like, there's no real consequences. 833 00:49:53,480 --> 00:49:56,520 Speaker 1: Like, people are committing crime after crime after crime after crime, 834 00:49:57,160 --> 00:50:00,480 Speaker 1: and they're not going to jail. And I think one 835 00:50:00,480 --> 00:50:03,200 Speaker 1: of the challenges is that, well, when people can do 836 00:50:03,320 --> 00:50:10,879 Speaker 1: heinous shit and they're not really penalized, you know, then 837 00:50:11,160 --> 00:50:15,480 Speaker 1: their life has not changed dramatically. They do something horrible, 838 00:50:15,560 --> 00:50:19,799 Speaker 1: something hurtful, hateful, evil, and then, depending on what it is, 839 00:50:20,560 --> 00:50:22,360 Speaker 1: you know, they're just given a slap on the wrist, 840 00:50:22,400 --> 00:50:25,640 Speaker 1: if that at all, and then there's no 841 00:50:25,800 --> 00:50:27,840 Speaker 1: real deterrence for some of this stuff. 842 00:50:27,840 --> 00:50:32,799 Speaker 2: Obviously this is different. And it's like, well, you know, 843 00:50:34,560 --> 00:50:36,800 Speaker 2: like, I've had... 844 00:50:36,239 --> 00:50:39,040 Speaker 1: you know, nothing horrendous, but, well, yeah, I've had people 845 00:50:39,120 --> 00:50:44,920 Speaker 1: send me emails and messages that are, like, really fucking horrendous, 846 00:50:45,480 --> 00:50:48,879 Speaker 1: like, you know, which I won't elaborate on here, 847 00:50:48,920 --> 00:50:52,919 Speaker 1: but it's like... oh. And I mean, firstly, it's quite 848 00:50:53,000 --> 00:50:58,520 Speaker 1: hateful and hurtful. Secondly, I think, what's going on in 849 00:50:58,520 --> 00:51:02,760 Speaker 1: this person's head, that they take the time and find 850 00:51:02,800 --> 00:51:05,399 Speaker 1: the energy to send me, a person they've never met, 851 00:51:05,440 --> 00:51:06,839 Speaker 1: this thing? Based 852 00:51:06,480 --> 00:51:11,440 Speaker 2: on... it's like, wow. I'm curious around that. But 853 00:51:11,719 --> 00:51:13,799 Speaker 2: then you go, oh, what do I do? And it's like, well, 854 00:51:14,560 --> 00:51:15,759 Speaker 2: what the fuck can you do? 855 00:51:15,920 --> 00:51:20,879 Speaker 5: Yeah, yeah. But then, I guess, it's 856 00:51:20,920 --> 00:51:28,320 Speaker 5: knowing or not knowing: if a threat comes 857 00:51:28,360 --> 00:51:31,200 Speaker 5: to you via some electronic means, like an email or 858 00:51:31,239 --> 00:51:35,280 Speaker 5: a Facebook message or something like that, where is this person? 859 00:51:35,800 --> 00:51:38,200 Speaker 5: Like, are they outside the front door? Or are they 860 00:51:38,239 --> 00:51:40,160 Speaker 5: in another state, or are they in another country? 861 00:51:40,600 --> 00:51:47,520 Speaker 3: Like, it can be frightening. Oh, for sure, yeah. And 862 00:51:48,400 --> 00:51:48,840 Speaker 3: I mean this
863 00:51:50,719 --> 00:51:55,680 Speaker 2: with complete compassion. And then I think about more vulnerable people, 864 00:51:55,760 --> 00:52:00,640 Speaker 2: women and children, I'm talking more physically vulnerable. Like, that's 865 00:52:00,640 --> 00:52:02,279 Speaker 2: got to be fucking terrifying. 866 00:52:03,239 --> 00:52:03,560 Speaker 3: Yeah. 867 00:52:03,560 --> 00:52:05,680 Speaker 1: Like, I'm a big alpha male, and there's shit 868 00:52:05,760 --> 00:52:09,160 Speaker 1: that still scares me. And I'm like, wow, I think 869 00:52:09,200 --> 00:52:13,880 Speaker 1: about my mum, who's forty-five kilos and five foot, 870 00:52:14,600 --> 00:52:14,840 Speaker 1: you know. 871 00:52:16,719 --> 00:52:18,680 Speaker 2: Yeah, it's so... 872 00:52:20,520 --> 00:52:24,840 Speaker 1: Well, it's an ongoing kind of unfolding lesson, 873 00:52:24,920 --> 00:52:27,279 Speaker 1: isn't it? It's like we're still figuring out what the 874 00:52:28,160 --> 00:52:32,040 Speaker 1: real problem is, what the potential, not solutions, but 875 00:52:32,080 --> 00:52:34,560 Speaker 1: perhaps strategies, moving towards some 876 00:52:34,520 --> 00:52:37,359 Speaker 2: kind of... I don't know that we'll ever solve it. 877 00:52:37,440 --> 00:52:38,280 Speaker 2: I don't know if that's 878 00:52:38,160 --> 00:52:42,759 Speaker 1: the right phrase, or the right kind of, you 879 00:52:42,800 --> 00:52:47,440 Speaker 1: know, word, but definitely moving towards dealing with it practically 880 00:52:47,480 --> 00:52:49,120 Speaker 1: and alleviating some of the pain. 881 00:52:49,400 --> 00:52:49,720 Speaker 2: Maybe. 882 00:52:49,840 --> 00:52:50,680 Speaker 3: Yeah. 883 00:52:50,719 --> 00:52:53,040 Speaker 5: Well, just to give you an example, and these 884 00:52:53,080 --> 00:52:55,720 Speaker 5: are the things that really 885 00:52:55,880 --> 00:53:06,200 Speaker 5: shocked me: things like robovacs. A robot vacuum cleaner, yeah, 886 00:53:06,400 --> 00:53:12,200 Speaker 5: can be used, has been used, as a spying 887 00:53:12,040 --> 00:53:14,200 Speaker 3: or listening device. 888 00:53:16,640 --> 00:53:18,680 Speaker 5: You know: "I knew who you were talking to on the 889 00:53:18,719 --> 00:53:20,279 Speaker 5: phone last night." "How did you know?" 890 00:53:20,680 --> 00:53:23,600 Speaker 3: "Doesn't matter." You don't know. Weeks later down the 891 00:53:23,520 --> 00:53:26,600 Speaker 5: track, you work out that it's the robovac, 892 00:53:26,600 --> 00:53:30,280 Speaker 5: it's following you around the house because the perpetrator's got 893 00:53:30,360 --> 00:53:34,200 Speaker 5: it on remote control. Cameras in the house being taken 894 00:53:34,239 --> 00:53:40,839 Speaker 5: over remotely, laptops being taken over remotely. It's like, it 895 00:53:40,960 --> 00:53:44,840 Speaker 5: is insidious, the things that people 896 00:53:44,840 --> 00:53:45,680 Speaker 3: have to deal with. 897 00:53:45,880 --> 00:53:49,520 Speaker 5: And, you know, if you're someone who's running 898 00:53:49,520 --> 00:53:50,640 Speaker 5: a business, 899 00:53:50,680 --> 00:53:52,600 Speaker 3: you can't afford to be without your devices. 900 00:53:52,680 --> 00:53:55,560 Speaker 5: You can't afford to be, you know, without your internet 901 00:53:55,640 --> 00:53:57,440 Speaker 5: access or whatever. 902 00:53:57,719 --> 00:53:59,319 Speaker 3: So some of
903 00:53:59,280 --> 00:54:03,239 Speaker 5: these women have just learned to live knowing that they're 904 00:54:03,280 --> 00:54:07,960 Speaker 5: constantly being monitored by an ex-partner, and just trying 905 00:54:07,960 --> 00:54:11,400 Speaker 5: to do the best that they can to minimize that, 906 00:54:13,760 --> 00:54:15,919 Speaker 5: knowing that he could turn up anywhere at any time. 907 00:54:17,680 --> 00:54:19,719 Speaker 1: So yeah, there are a couple of solutions that come 908 00:54:19,760 --> 00:54:22,280 Speaker 1: to mind for me, but they're not particularly legal. 909 00:54:22,400 --> 00:54:30,359 Speaker 1: So... that was interesting. Congratulations on 910 00:54:30,800 --> 00:54:33,680 Speaker 1: the research that you're doing. Thanks for shining a light 911 00:54:33,760 --> 00:54:34,040 Speaker 1: on that. 912 00:54:34,640 --> 00:54:35,240 Speaker 2: And if you're 913 00:54:35,120 --> 00:54:38,799 Speaker 1: listening, I know we didn't really fix or change or 914 00:54:38,840 --> 00:54:43,040 Speaker 1: solve anything, but I think we opened an important conversational door. 915 00:54:43,680 --> 00:54:44,879 Speaker 2: And I think the more that we 916 00:54:44,800 --> 00:54:48,040 Speaker 1: talk about this and think about this... and, you know, 917 00:54:48,160 --> 00:54:50,879 Speaker 1: while it's... I wish it wasn't happening, I wish 918 00:54:50,920 --> 00:54:52,960 Speaker 1: it wasn't a thing in our culture, I wish it 919 00:54:53,040 --> 00:54:56,719 Speaker 1: wasn't a thing in our society, but wishing doesn't make 920 00:54:56,760 --> 00:55:00,120 Speaker 1: it so. And so, you know, it's good that 921 00:55:00,120 --> 00:55:03,240 Speaker 1: we have these hard conversations about these hard topics, and 922 00:55:03,400 --> 00:55:06,000 Speaker 1: at the very least we increase a little bit of awareness. 923 00:55:06,120 --> 00:55:12,200 Speaker 1: And do you need any more participants for your research? 924 00:55:12,320 --> 00:55:15,160 Speaker 2: Can... do you want people to reach out? Yep? So 925 00:55:15,200 --> 00:55:16,320 Speaker 2: tell them how to do that. 926 00:55:16,880 --> 00:55:23,080 Speaker 5: So they can email me at Nicole dot Liddell, L 927 00:55:23,120 --> 00:55:30,000 Speaker 5: I double L, at research dot USC dot edu dot 928 00:55:30,040 --> 00:55:35,080 Speaker 5: au, and I'll send them a link. So basically, to 929 00:55:35,120 --> 00:55:37,920 Speaker 5: meet the criteria for the study, you need to be 930 00:55:39,200 --> 00:55:43,160 Speaker 5: living, or not necessarily living, in Queensland, but have experienced 931 00:55:44,360 --> 00:55:49,640 Speaker 5: tech-facilitated coercive control and sought help from Queensland services. 932 00:55:51,440 --> 00:55:57,320 Speaker 5: And for people's safety, we request that they consider themselves 933 00:55:57,360 --> 00:56:00,600 Speaker 5: to no longer be in that relationship. So, 934 00:56:01,680 --> 00:56:04,640 Speaker 5: you know, it might be that they're still 935 00:56:04,760 --> 00:56:09,439 Speaker 5: co-parenting or something like that, but, just to minimize 936 00:56:09,680 --> 00:56:13,720 Speaker 5: people's risks, we ask that they consider themselves 937 00:56:13,800 --> 00:56:15,759 Speaker 5: no longer in that relationship. 938 00:56:15,239 --> 00:56:17,319 Speaker 3: But they need to have sought help, so we need 939 00:56:17,360 --> 00:56:17,800 Speaker 3: to get
940 00:56:17,640 --> 00:56:22,879 Speaker 5: some information around their help-seeking. So yes, they can 941 00:56:22,960 --> 00:56:27,480 Speaker 5: email me there. The other thing I was going to 942 00:56:27,520 --> 00:56:30,759 Speaker 5: suggest to people is, if this has opened up, kind 943 00:56:30,760 --> 00:56:32,920 Speaker 5: of, like, wanting to know more about 944 00:56:33,360 --> 00:56:37,040 Speaker 5: this entire problem: I just listened to an audiobook. 945 00:56:37,440 --> 00:56:39,960 Speaker 5: Can I give someone else a plug? Just listen 946 00:56:40,040 --> 00:56:43,200 Speaker 5: to Jess Hill's audiobook, See What 947 00:56:43,120 --> 00:56:43,759 Speaker 3: You Made Me Do? 948 00:56:44,600 --> 00:56:48,320 Speaker 5: She's an investigative journalist, she's been in this space 949 00:56:48,360 --> 00:56:52,760 Speaker 5: for a long time, and she gives a really good, 950 00:56:52,960 --> 00:56:58,000 Speaker 5: in-depth understanding of the problem of domestic 951 00:56:57,520 --> 00:57:01,080 Speaker 3: abuse in Australia. 952 00:57:01,320 --> 00:57:04,879 Speaker 5: So if you're keen for that, that's a really 953 00:57:04,880 --> 00:57:07,000 Speaker 5: good listen, or read. Great. 954 00:57:07,440 --> 00:57:09,479 Speaker 2: What's that? Just give us the name of the book. 955 00:57:09,880 --> 00:57:10,879 Speaker 2: See What You Made Me... 956 00:57:10,800 --> 00:57:13,120 Speaker 3: Do. See What You Made Me Do?, by Jess Hill. 957 00:57:13,760 --> 00:57:15,480 Speaker 2: That's a very clever name, isn't it? 958 00:57:15,680 --> 00:57:19,920 Speaker 1: Like, as soon as you said that title, I knew exactly 959 00:57:19,400 --> 00:57:20,600 Speaker 2: what that was going to be about. 960 00:57:20,880 --> 00:57:26,400 Speaker 1: Yeah, yeah. We appreciate you. I think I'm going to 961 00:57:26,440 --> 00:57:27,240 Speaker 1: see you in a few 962 00:57:27,160 --> 00:57:27,840 Speaker 2: days on the Gold Coast. 963 00:57:28,520 --> 00:57:29,280 Speaker 3: I'll be there. 964 00:57:29,920 --> 00:57:30,480 Speaker 2: That'd be good. 965 00:57:30,600 --> 00:57:33,480 Speaker 1: I'll see you in a few... well, it's two... it's 966 00:57:33,560 --> 00:57:34,640 Speaker 1: forty-eight hours. 967 00:57:34,880 --> 00:57:38,000 Speaker 3: Yeah, too right. Oh my God. 968 00:57:38,680 --> 00:57:40,680 Speaker 2: We'll see you in a couple of days with a 969 00:57:40,720 --> 00:57:41,400 Speaker 2: few others. 970 00:57:41,600 --> 00:57:45,400 Speaker 1: And we appreciate you as always, and congrats with what 971 00:57:45,440 --> 00:57:48,240 Speaker 1: you're doing, and good luck with your research and all 972 00:57:48,280 --> 00:57:49,440 Speaker 1: the stuff moving forward. 973 00:57:49,560 --> 00:57:50,200 Speaker 2: Thanks, Nic. 974 00:57:51,320 --> 00:57:51,680 Speaker 3: So yeah,