1 00:00:00,160 --> 00:00:02,599 Speaker 1: Daily bespoke content that you won't find on the radio 2 00:00:02,640 --> 00:00:04,960 Speaker 1: show, The Hauraki Breakfast podcast. 3 00:00:05,280 --> 00:00:08,879 Speaker 2: Ruben Miller joins us on the podcast this morning. Good morning Ruben, 4 00:00:08,960 --> 00:00:09,440 Speaker 2: how are you? 5 00:00:10,280 --> 00:00:13,760 Speaker 3: Good. Hey, Jerry, yeah, I'm good. I'm feeling good. 6 00:00:14,040 --> 00:00:17,960 Speaker 2: Your book The Blood Says Otherwise: Murder, Forensics and Hidden 7 00:00:17,960 --> 00:00:22,880 Speaker 2: Truths is in stores today. You spent twenty two years 8 00:00:22,960 --> 00:00:26,720 Speaker 2: on the front line of forensics here in New Zealand. 9 00:00:27,720 --> 00:00:29,160 Speaker 2: You must have seen some stuff. 10 00:00:29,840 --> 00:00:33,080 Speaker 3: Yeah, I saw a bit. Not everything I was prepared for, 11 00:00:33,800 --> 00:00:36,880 Speaker 3: but it was, it was an adventure. It was certainly exciting. 12 00:00:37,720 --> 00:00:40,880 Speaker 3: I got to fulfill lots of those things I'd sort 13 00:00:40,920 --> 00:00:46,160 Speaker 3: of laid out for myself and, you know, done something 14 00:00:46,159 --> 00:00:52,600 Speaker 3: cool with science, certainly something meaningful. And yeah, it really 15 00:00:52,720 --> 00:00:56,520 Speaker 3: hit me as well. It was, it was an eye 16 00:00:56,520 --> 00:00:58,760 Speaker 3: opening, maturing experience. 17 00:00:58,920 --> 00:01:00,680 Speaker 4: I bet it was. When you came into the studio, 18 00:01:00,720 --> 00:01:02,320 Speaker 4: what's the first thing you noticed? Like, you know, when 19 00:01:02,360 --> 00:01:04,280 Speaker 4: you, when you've done a job for so long, you 20 00:01:04,360 --> 00:01:06,560 Speaker 4: must see the same sorts of things everywhere you go. 21 00:01:06,720 --> 00:01:09,160 Speaker 4: What's, what's... Is there anything in the studio that jumps 22 00:01:09,200 --> 00:01:09,520 Speaker 4: out to you? 
23 00:01:09,640 --> 00:01:11,360 Speaker 2: Did you spot that blood that's just doing that? 24 00:01:14,400 --> 00:01:14,600 Speaker 4: Well? 25 00:01:16,800 --> 00:01:21,280 Speaker 3: Yeah, you notice the details of everything, right, in the job. So yeah, 26 00:01:21,319 --> 00:01:23,640 Speaker 3: I mean I'm looking at that packet. I'm seeing some 27 00:01:23,680 --> 00:01:26,000 Speaker 3: of those oily stains on the inside, wondering what those 28 00:01:26,080 --> 00:01:29,000 Speaker 3: might be. You know, is that yesterday's lunch? Is 29 00:01:29,000 --> 00:01:31,160 Speaker 3: it like a week ago? I'm not sure. You've got 30 00:01:31,200 --> 00:01:34,640 Speaker 3: an eclectic mix of things right here. Maybe trying to 31 00:01:34,640 --> 00:01:38,080 Speaker 3: cover up something else unsavory that's going on. 32 00:01:38,200 --> 00:01:42,199 Speaker 4: It was savory. That was a cheese scone, and that's 33 00:01:42,200 --> 00:01:44,759 Speaker 4: a butter, I got to the bottom of that one. 34 00:01:44,800 --> 00:01:49,600 Speaker 4: So not a body fluid then. Look, I was salivating 35 00:01:49,600 --> 00:01:51,960 Speaker 4: over the top of that bloody good cheese scone from 36 00:01:51,960 --> 00:01:52,560 Speaker 4: across the road. 37 00:01:53,120 --> 00:01:53,320 Speaker 3: Yeah. 38 00:01:53,360 --> 00:01:55,960 Speaker 4: What are some of the things that you take away 39 00:01:56,160 --> 00:01:58,120 Speaker 4: from a twenty two year career in doing this kind 40 00:01:58,160 --> 00:02:00,000 Speaker 4: of thing? You know, what's, what, what stays with you? 41 00:02:00,560 --> 00:02:02,640 Speaker 4: Like for example, Jerry, if he talks for longer than 42 00:02:02,680 --> 00:02:05,240 Speaker 4: three minutes, he has to stop and throw to a song. 43 00:02:05,400 --> 00:02:08,280 Speaker 4: Just in everyday conversation. Okay, that's, that's 44 00:02:08,320 --> 00:02:08,800 Speaker 4: his hangover. 45 00:02:08,960 --> 00:02:13,839 Speaker 3: Okay. Yeah. 
I think primarily the biggest thing is we're 46 00:02:13,880 --> 00:02:17,920 Speaker 3: all people and we all have stories, and we're not 47 00:02:18,880 --> 00:02:21,920 Speaker 3: just the worst thing that we've done. So the 48 00:02:22,040 --> 00:02:25,519 Speaker 3: job, weirdly, maybe not weirdly, has given me a whole 49 00:02:25,560 --> 00:02:30,200 Speaker 3: ton of compassion for the way that people live and 50 00:02:30,360 --> 00:02:33,560 Speaker 3: what they're exposed to and exactly why they might have 51 00:02:33,639 --> 00:02:34,520 Speaker 3: done the things they've done. 52 00:02:34,720 --> 00:02:36,640 Speaker 4: Yeah. I guess because when you see this kind of 53 00:02:36,639 --> 00:02:38,600 Speaker 4: stuff that you would have been looking at on the news, 54 00:02:38,639 --> 00:02:42,399 Speaker 4: you just automatically go, evil person did an evil thing. 55 00:02:43,040 --> 00:02:46,160 Speaker 4: But I guess, are we actually, every one of us, 56 00:02:46,160 --> 00:02:47,959 Speaker 4: a little bit closer to that than you'd think? 57 00:02:49,320 --> 00:02:52,280 Speaker 3: Yeah, I have to say I think so. You know, 58 00:02:53,040 --> 00:02:56,639 Speaker 3: to brand someone as evil is pretty... it's easy, right, 59 00:02:56,639 --> 00:03:00,160 Speaker 3: it others them, and it just makes their world not 60 00:03:00,200 --> 00:03:00,680 Speaker 3: our world. 61 00:03:01,000 --> 00:03:01,240 Speaker 4: Yeah. 
62 00:03:01,320 --> 00:03:04,200 Speaker 3: But you know, like I've, I've been into scenes that 63 00:03:04,280 --> 00:03:08,040 Speaker 3: are absolutely horrific, and especially when I started having kids, 64 00:03:08,160 --> 00:03:12,760 Speaker 3: and you know that little bib or nappy that 65 00:03:12,800 --> 00:03:15,320 Speaker 3: you got from the Baby Factory, and then you see 66 00:03:15,320 --> 00:03:18,520 Speaker 3: it at a scene and you go, shit, this 67 00:03:18,600 --> 00:03:19,880 Speaker 3: is, this is not that different. 68 00:03:19,880 --> 00:03:20,720 Speaker 4: I've got the same one. 69 00:03:20,800 --> 00:03:22,639 Speaker 3: Yeah, I've got the same one. I see that, 70 00:03:22,880 --> 00:03:25,640 Speaker 3: that ornament there. Or like, you know, one of the 71 00:03:25,639 --> 00:03:31,000 Speaker 3: worst things I've ever seen is a certificate for 72 00:03:31,600 --> 00:03:34,560 Speaker 3: diligence in maths from a primary school. 73 00:03:34,800 --> 00:03:35,040 Speaker 4: Wow. 74 00:03:35,400 --> 00:03:38,080 Speaker 3: And it's up on the wall amongst the worst 75 00:03:38,120 --> 00:03:40,480 Speaker 3: thing you can imagine, and it's got blood spatter on it, 76 00:03:40,960 --> 00:03:46,600 Speaker 3: you know. So these little oases of pride, 77 00:03:46,760 --> 00:03:50,040 Speaker 3: you know. And just like, everyone, everyone wants the best 78 00:03:50,080 --> 00:03:52,560 Speaker 3: for those around them, but we're just so limited. 79 00:03:52,960 --> 00:03:55,880 Speaker 2: When you worked on over one hundred and sixty forensic 80 00:03:56,240 --> 00:04:00,880 Speaker 2: homicide investigations, one hundred and sixty, do you remember details 81 00:04:00,880 --> 00:04:03,480 Speaker 2: from, do you think, all of those one hundred and sixty? 82 00:04:03,520 --> 00:04:04,880 Speaker 2: Can you carry that with you? 83 00:04:06,120 --> 00:04:11,720 Speaker 3: Well, I could to some degree. 
I remember intricate details 84 00:04:13,440 --> 00:04:15,800 Speaker 3: about, you know, things that are more than twenty years old. 85 00:04:16,320 --> 00:04:18,840 Speaker 3: And I also have total blanks. So you know, there 86 00:04:18,839 --> 00:04:20,920 Speaker 3: are things that, you know, over the last five or 87 00:04:20,960 --> 00:04:23,720 Speaker 3: ten years, people have actually said to me, oh, do 88 00:04:23,760 --> 00:04:25,680 Speaker 3: you remember that case you worked on? And I said, well, 89 00:04:25,680 --> 00:04:27,479 Speaker 3: how did you know that? I didn't disclose anything. And 90 00:04:27,480 --> 00:04:29,360 Speaker 3: they said, no, no, I saw you in the paper. 91 00:04:29,680 --> 00:04:31,680 Speaker 3: Well, you know, and like, that's gone for me. 92 00:04:31,839 --> 00:04:33,720 Speaker 3: And then I start researching and it's, oh yeah, I did 93 00:04:33,760 --> 00:04:37,760 Speaker 3: do that, you know. So there's just very, very limited 94 00:04:37,760 --> 00:04:39,960 Speaker 3: ways of making sense of this stuff. 95 00:04:39,880 --> 00:04:43,080 Speaker 4: Is part of that just because it's your job? It's like 96 00:04:43,120 --> 00:04:45,320 Speaker 4: asking someone what they did at work on Tuesday last 97 00:04:45,320 --> 00:04:47,919 Speaker 4: week. Is part of it also that maybe some of 98 00:04:47,920 --> 00:04:50,040 Speaker 4: that stuff gets blocked out, do you think, or no? 99 00:04:50,440 --> 00:04:54,360 Speaker 3: Ah, yeah, absolutely right. Yeah, I mean it's a survival mechanism, right? Yeah, 100 00:04:54,320 --> 00:04:56,000 Speaker 3: you know, I mean it is a job, as you say, 101 00:04:57,000 --> 00:04:59,479 Speaker 3: you know, whether you're sort of, whether you're in accounts 102 00:04:59,600 --> 00:05:03,600 Speaker 3: or, you know, radio DJs. 
I mean, things become, you know, 103 00:05:04,240 --> 00:05:07,520 Speaker 3: everyday and you don't necessarily want to talk about 104 00:05:07,520 --> 00:05:10,160 Speaker 3: them all the time. But I actively have to 105 00:05:10,200 --> 00:05:12,560 Speaker 3: try to keep from bringing that home. 106 00:05:13,080 --> 00:05:17,040 Speaker 2: So yeah, absolutely. I have always wanted to know, people 107 00:05:17,040 --> 00:05:19,840 Speaker 2: at work in your line of work, you who worked 108 00:05:19,880 --> 00:05:21,480 Speaker 2: in forensics... 109 00:05:21,360 --> 00:05:27,320 Speaker 5: Dreams, dreams, because obviously our dreams are the 110 00:05:27,360 --> 00:05:29,600 Speaker 5: input that we put into our brains during the day, 111 00:05:29,839 --> 00:05:32,400 Speaker 5: and that's what's working away there. 112 00:05:33,520 --> 00:05:34,440 Speaker 4: What are your dreams like? 113 00:05:34,560 --> 00:05:40,360 Speaker 3: My dreams are a lot better now, and they 114 00:05:40,400 --> 00:05:43,680 Speaker 3: weren't pretty for quite a long time. One of 115 00:05:43,720 --> 00:05:45,760 Speaker 3: the, you know, I mean you can probably imagine what 116 00:05:45,839 --> 00:05:49,520 Speaker 3: I dream when I bring my work home, but one 117 00:05:49,600 --> 00:05:53,119 Speaker 3: of my recurring dreams is actually getting into the mind 118 00:05:53,240 --> 00:05:56,720 Speaker 3: of a perpetrator trying to get away with what they've done. 119 00:05:57,480 --> 00:05:59,560 Speaker 3: Like, as I reconstruct crime scenes, I have to 120 00:05:59,560 --> 00:06:02,320 Speaker 3: get into the mind of the killer. So it's very 121 00:06:02,360 --> 00:06:03,880 Speaker 3: hard to get out of the mind of the killer, 122 00:06:04,320 --> 00:06:07,720 Speaker 3: and that mind is with me at night as well. 
123 00:06:08,000 --> 00:06:10,240 Speaker 3: And it's bloody stressful trying to get away with a 124 00:06:10,279 --> 00:06:11,600 Speaker 3: crime, I've found. 125 00:06:11,800 --> 00:06:13,920 Speaker 2: I mean, I have dreams where I'm trying to, where 126 00:06:13,960 --> 00:06:16,719 Speaker 2: I've woken... I'm not waking up, but I'm in the dream. 127 00:06:17,240 --> 00:06:20,159 Speaker 2: But I haven't actually murdered the person in the dream. 128 00:06:20,200 --> 00:06:20,479 Speaker 5: I don't. 129 00:06:20,640 --> 00:06:23,479 Speaker 2: I don't murder anyone. But when I've had something, I've already 130 00:06:23,520 --> 00:06:25,560 Speaker 2: done it, and then I've got exactly the same thing: 131 00:06:25,560 --> 00:06:27,360 Speaker 2: I've got to try and get away with it. Sometimes 132 00:06:27,400 --> 00:06:31,960 Speaker 2: I've buried a body, and oftentimes it's outside and it's 133 00:06:32,000 --> 00:06:38,200 Speaker 2: in clay and there's terrible floods or rain, and it's washing away. 134 00:06:38,680 --> 00:06:41,120 Speaker 2: It's a horrible dream, and when you wake 135 00:06:40,960 --> 00:06:43,400 Speaker 4: up, you're like, oh my god, this is bad. 136 00:06:43,480 --> 00:06:44,839 Speaker 2: I'm going to be in prison for the rest of 137 00:06:44,839 --> 00:06:47,200 Speaker 2: my life. How am I going to deal with it? No, 138 00:06:47,279 --> 00:06:47,800 Speaker 2: I didn't do that. 139 00:06:48,240 --> 00:06:51,680 Speaker 3: I imagine it takes me a little longer than you 140 00:06:51,720 --> 00:06:53,480 Speaker 3: to get out of that mode. I didn't do it, 141 00:06:53,560 --> 00:06:56,240 Speaker 3: you know, because, you know, especially when I'm on call, 142 00:06:56,279 --> 00:06:57,800 Speaker 3: what's the next one that's going to come along? 143 00:06:57,800 --> 00:06:59,880 Speaker 3: Whose, whose head am I going to have to inhabit? 
144 00:07:00,800 --> 00:07:02,919 Speaker 4: What are some of the commonalities you've noticed, having to 145 00:07:02,960 --> 00:07:05,480 Speaker 4: try and recreate these and get into the mindset of 146 00:07:05,520 --> 00:07:08,760 Speaker 4: that, that might surprise people about how people think in 147 00:07:08,800 --> 00:07:09,920 Speaker 4: these situations? 148 00:07:10,440 --> 00:07:13,800 Speaker 3: Yeah, that we're pretty ordinary. I mean, New Zealand, you know, 149 00:07:13,880 --> 00:07:15,760 Speaker 3: it's a tough thing with this job. You know, you 150 00:07:15,840 --> 00:07:19,520 Speaker 3: wait for that really elaborate, complex, premeditated crime, and that 151 00:07:19,560 --> 00:07:21,960 Speaker 3: comes up, yeah, that comes along very occasionally, and 152 00:07:21,960 --> 00:07:23,560 Speaker 3: then it's like, yes, that's why I'm doing the job. 153 00:07:24,080 --> 00:07:29,440 Speaker 3: And the rest are ordinary people reacting to weird circumstances. 154 00:07:29,480 --> 00:07:33,280 Speaker 3: So it's emotional, there's addiction, there's, you know, gang extortion 155 00:07:33,360 --> 00:07:36,880 Speaker 3: stuff going on, and it's a quick hit over the 156 00:07:36,880 --> 00:07:39,800 Speaker 3: head of a drunken mate and then it's, oh shit, 157 00:07:39,800 --> 00:07:42,280 Speaker 3: I have to try and cover this up. And you 158 00:07:42,320 --> 00:07:44,800 Speaker 3: don't realize how hard it is to cover up a crime. Yeah. 159 00:07:44,840 --> 00:07:46,320 Speaker 3: So then there's a quick call to one one one 160 00:07:46,400 --> 00:07:48,920 Speaker 3: to say, oh, look, this is what I did, right? 161 00:07:48,920 --> 00:07:50,960 Speaker 3: And then we come in as the just in case, 162 00:07:51,680 --> 00:07:54,520 Speaker 3: and that's, that's your eighty percent of crime, right? But 163 00:07:54,560 --> 00:07:57,840 Speaker 3: it's the twenty percent of legitimate whodunnits... 
Yeah, 164 00:07:57,920 --> 00:07:59,000 Speaker 3: that really tests you. 165 00:07:59,120 --> 00:08:02,880 Speaker 2: Okay. Out of those one hundred 166 00:08:02,920 --> 00:08:09,000 Speaker 2: and sixty, how many got away with something that you 167 00:08:09,200 --> 00:08:13,760 Speaker 2: know happened? Obviously you can't be specific about what that was. 168 00:08:13,840 --> 00:08:17,000 Speaker 2: I don't expect that. But what's the percentage, you reckon, 169 00:08:17,320 --> 00:08:19,240 Speaker 2: of people that got away with something? 170 00:08:19,680 --> 00:08:21,720 Speaker 3: I can tell you beyond a percentage, I can tell 171 00:08:21,720 --> 00:08:25,160 Speaker 3: you an exact number. Okay. And that's one, one out 172 00:08:25,160 --> 00:08:28,040 Speaker 3: of the one hundred and sixty, right, okay. And that's 173 00:08:28,040 --> 00:08:30,920 Speaker 3: a bona fide unsolved homicide. 174 00:08:31,680 --> 00:08:34,120 Speaker 4: Oh right, there was no... you couldn't reach, there was 175 00:08:34,160 --> 00:08:35,600 Speaker 4: no, like, conclusive... 176 00:08:35,640 --> 00:08:37,800 Speaker 3: Well, we did our best. There were many strands, 177 00:08:37,840 --> 00:08:42,840 Speaker 3: many leads, many dead ends. But yeah, that's the only 178 00:08:42,960 --> 00:08:45,560 Speaker 3: homicide case I've worked on that remains unsolved. 179 00:08:45,800 --> 00:08:48,200 Speaker 2: Wow. Because I guess nowadays as well, it's getting even 180 00:08:48,240 --> 00:08:50,520 Speaker 2: harder and harder to get away with something because of 181 00:08:50,679 --> 00:08:53,760 Speaker 2: phone records. You've got something tracking you around the whole time 182 00:08:54,200 --> 00:08:55,760 Speaker 2: that knows where you are at all times. 183 00:08:55,840 --> 00:09:01,040 Speaker 3: Yeah, oh absolutely, CCTV and cell phone towers, so a 184 00:09:01,080 --> 00:09:04,920 Speaker 3: whole ton of cases. 
Yeah, and you know, 185 00:09:04,960 --> 00:09:09,320 Speaker 3: forensic science does definitely play a role, but it's part 186 00:09:09,320 --> 00:09:12,199 Speaker 3: of a bigger investigation, right? Like, I mean, the rhetoric 187 00:09:12,320 --> 00:09:14,559 Speaker 3: is that, you know, DNA will solve it, and 188 00:09:14,640 --> 00:09:16,920 Speaker 3: you know, you've got that one DNA hit and that's the 189 00:09:16,920 --> 00:09:20,760 Speaker 3: case closed. It's way more complex than that, you know. Yeah. 190 00:09:20,520 --> 00:09:22,559 Speaker 4: Because all that might prove is that, you know, someone 191 00:09:22,600 --> 00:09:23,840 Speaker 4: may have been there, but they might live in that 192 00:09:23,880 --> 00:09:24,880 Speaker 4: house, or you know what I mean. 193 00:09:24,920 --> 00:09:29,679 Speaker 3: Absolutely. And the more sensitive DNA technology gets, you've 194 00:09:29,720 --> 00:09:32,280 Speaker 3: got that double-edged sword, right? So yeah, it's great 195 00:09:32,280 --> 00:09:35,559 Speaker 3: we can find DNA profiles that we previously wouldn't have 196 00:09:35,559 --> 00:09:38,560 Speaker 3: been able to. But then the explanations for that and 197 00:09:38,559 --> 00:09:41,439 Speaker 3: the care that you've got to take around contamination, yeah, 198 00:09:41,640 --> 00:09:42,679 Speaker 3: go up exponentially. 199 00:09:43,280 --> 00:09:46,600 Speaker 4: Well, off the back of Jerry's question, how often do 200 00:09:46,679 --> 00:09:51,800 Speaker 4: you, you know... like, you've done the forensic analysis of 201 00:09:51,840 --> 00:09:55,319 Speaker 4: the scene and everything, you more or less know what's happened here, 202 00:09:55,720 --> 00:09:58,520 Speaker 4: but they can't be convicted for that. 203 00:09:58,600 --> 00:10:00,720 Speaker 4: Does that ever happen? 
You know, whereas in the back 204 00:10:00,720 --> 00:10:03,160 Speaker 4: of your mind you know what's happened, but you 205 00:10:03,240 --> 00:10:05,840 Speaker 4: can't, by the letter of the law, you can't get them 206 00:10:05,920 --> 00:10:06,920 Speaker 4: for it. Does that ever happen? 207 00:10:07,040 --> 00:10:11,480 Speaker 3: Yeah, one hundred percent. It happens quite often actually, and 208 00:10:11,520 --> 00:10:13,640 Speaker 3: you just have to go into a mindset that, actually... 209 00:10:14,040 --> 00:10:16,000 Speaker 3: and it's difficult at times when you see the results 210 00:10:16,000 --> 00:10:17,720 Speaker 3: of what people do to each other, but you have 211 00:10:17,800 --> 00:10:22,400 Speaker 3: to get into that mindset of: it's not important to 212 00:10:22,440 --> 00:10:25,840 Speaker 3: me whether they're guilty or innocent. It's whether the 213 00:10:25,880 --> 00:10:30,200 Speaker 3: evidence stacks up and is represented in an honest way, 214 00:10:30,280 --> 00:10:32,520 Speaker 3: and that's critical, you know. So that's what I'm 215 00:10:32,600 --> 00:10:33,640 Speaker 3: far more interested in. 216 00:10:33,760 --> 00:10:35,640 Speaker 4: Right, and then at that point you've done your job, 217 00:10:35,720 --> 00:10:39,120 Speaker 4: you've presented all the evidence, however that plays out past there. 218 00:10:39,400 --> 00:10:43,040 Speaker 3: Yeah, it gets sticky as well, because the endpoint of 219 00:10:43,120 --> 00:10:47,240 Speaker 3: my process is presenting in court. So then it's a 220 00:10:47,240 --> 00:10:52,560 Speaker 3: whole other level of trying to interpret and translate scientific 221 00:10:53,640 --> 00:10:56,480 Speaker 3: findings to a lay audience, a jury, you know. And 222 00:10:56,760 --> 00:10:58,400 Speaker 3: it's, how do I come across exactly? 223 00:10:58,400 --> 00:10:58,560 Speaker 4: You know. 
224 00:10:58,559 --> 00:11:01,280 Speaker 3: There's something called the CSI effect, and that's big. 225 00:11:02,120 --> 00:11:02,320 Speaker 4: Yeah. 226 00:11:02,400 --> 00:11:04,600 Speaker 2: The other part about it is, I suppose, sometimes you'll 227 00:11:04,640 --> 00:11:08,520 Speaker 2: be grilled by defense lawyers. Yeah, yeah. And your integrity 228 00:11:08,520 --> 00:11:09,679 Speaker 2: gets called into question. 229 00:11:09,920 --> 00:11:10,319 Speaker 3: Yeah, yeah. 230 00:11:10,400 --> 00:11:14,200 Speaker 2: How do you overcome the feeling that maybe you're 231 00:11:14,240 --> 00:11:18,080 Speaker 2: being wronged? Yeah, yeah, yeah, in that sense. 232 00:11:18,080 --> 00:11:20,640 Speaker 3: I mean, it's hard to prep for, you know. 233 00:11:21,880 --> 00:11:24,560 Speaker 3: I mean, I try to get round it and say, like, 234 00:11:24,600 --> 00:11:28,280 Speaker 3: everyone needs a good defense, you know, and my integrity 235 00:11:28,280 --> 00:11:30,600 Speaker 3: should be called into question. Sometimes it gets a little 236 00:11:30,640 --> 00:11:33,840 Speaker 3: petty from both sides. But we have this adversarial system, 237 00:11:33,960 --> 00:11:37,360 Speaker 3: and it's not always the healthiest, you know, because it really 238 00:11:37,360 --> 00:11:39,400 Speaker 3: plays people off against each other. You've got an expert 239 00:11:39,440 --> 00:11:42,959 Speaker 3: for the prosecution against an expert for the defense, and they 240 00:11:43,000 --> 00:11:44,439 Speaker 3: just go at it, and it's like, well, who do 241 00:11:44,480 --> 00:11:47,760 Speaker 3: you believe? But it can be nerve-wracking, 242 00:11:47,800 --> 00:11:49,760 Speaker 3: and it's like, you know, I've put all this work 243 00:11:49,800 --> 00:11:52,000 Speaker 3: in and then the whole case is 244 00:11:52,000 --> 00:11:56,439 Speaker 3: trying to be dismantled by one little technicality. 
It's pretty 245 00:11:56,440 --> 00:11:58,160 Speaker 3: hard and you have to stay pretty professional. 246 00:11:59,080 --> 00:12:01,040 Speaker 4: You mentioned the CSI effect. What do you mean by that? What's 247 00:12:01,040 --> 00:12:02,080 Speaker 4: the CSI effect? 248 00:12:02,520 --> 00:12:05,360 Speaker 3: Well, the CSI effect: like, if we're not careful in this environment, 249 00:12:05,480 --> 00:12:07,599 Speaker 3: like the way we're talking about things, that can perpetuate 250 00:12:07,679 --> 00:12:10,880 Speaker 3: some myths around, you know, what forensic science can do, 251 00:12:11,280 --> 00:12:15,439 Speaker 3: and then the public gets an idea of what 252 00:12:15,520 --> 00:12:16,480 Speaker 3: forensic science can 253 00:12:16,400 --> 00:12:18,040 Speaker 4: do, and they think they know how it works. 254 00:12:18,080 --> 00:12:19,480 Speaker 3: Yeah, and they think they know how it works, and 255 00:12:19,480 --> 00:12:23,040 Speaker 3: then they watch CSI and it reinforces that, and suddenly, 256 00:12:23,120 --> 00:12:25,000 Speaker 3: you know, before you know it, everyone expects that you 257 00:12:25,000 --> 00:12:27,760 Speaker 3: can get results in a day and it's conclusive, and 258 00:12:28,120 --> 00:12:30,839 Speaker 3: everyone goes home and high fives because it's a 259 00:12:30,920 --> 00:12:33,880 Speaker 3: nice resolution, and the reality is pretty far from that. 260 00:12:34,280 --> 00:12:36,719 Speaker 2: Do you ever solve crimes walking down hallways? 261 00:12:37,520 --> 00:12:38,440 Speaker 3: Like, you know? 262 00:12:38,520 --> 00:12:41,160 Speaker 2: I find on CSI they just talk a lot about 263 00:12:41,240 --> 00:12:45,360 Speaker 2: what happened walking down hallways, walking, talking about very, 264 00:12:45,520 --> 00:12:50,160 Speaker 2: very, I would say, confidential information. And a lot of 265 00:12:50,200 --> 00:12:51,560 Speaker 2: people are walking past you. 
266 00:12:51,640 --> 00:12:53,920 Speaker 3: Well, these guys are really compromised, right? They're sort of, 267 00:12:54,040 --> 00:12:56,960 Speaker 3: they're doing sort of twelve people's jobs, you know. They're, 268 00:12:57,000 --> 00:13:00,200 Speaker 3: they're collecting evidence, they're interviewing suspects and victims. They're, they're 269 00:13:00,200 --> 00:13:04,320 Speaker 3: completely biasing themselves. So that's one issue. And those hallways, 270 00:13:04,360 --> 00:13:07,280 Speaker 3: those hallways always tend to be quite dimly lit, with 271 00:13:07,640 --> 00:13:10,800 Speaker 3: like a little pencil torch. And funnily enough, in New Zealand, 272 00:13:10,800 --> 00:13:12,960 Speaker 3: maybe it's a Kiwi thing, we go to crime scenes 273 00:13:12,960 --> 00:13:16,079 Speaker 3: when it's daytime, because our biggest tool is being able 274 00:13:16,120 --> 00:13:19,280 Speaker 3: to see. So often they'll close down the scene, unless 275 00:13:19,280 --> 00:13:21,160 Speaker 3: it's an emergency, you know, we'll see you guys at eight, 276 00:13:22,040 --> 00:13:25,280 Speaker 3: and then we've got a whole day of light. 277 00:13:25,000 --> 00:13:28,760 Speaker 1: Jeremy Wells and Manaia Stewart. Find them on Instagram at Hauraki Breakfast. 278 00:13:29,760 --> 00:13:33,200 Speaker 1: Join Jerry and Manaia in the Hauraki Breakfast discussion 279 00:13:33,240 --> 00:13:34,720 Speaker 1: group on Facebook for more. 280 00:13:35,240 --> 00:13:39,880 Speaker 2: Is it impossible to depersonalize a crime scene? So to 281 00:13:39,920 --> 00:13:44,439 Speaker 2: walk into it and not feel that someone's life has 282 00:13:44,520 --> 00:13:48,160 Speaker 2: been extinguished in this situation, and think about all the 283 00:13:48,160 --> 00:13:50,319 Speaker 2: things that happened in their life? 284 00:13:50,559 --> 00:13:55,680 Speaker 3: Oh, I would say, well, it's a yes and no answer. 
285 00:13:55,720 --> 00:13:57,520 Speaker 3: I mean, yes, I was able to do that for 286 00:13:57,600 --> 00:13:59,480 Speaker 3: a big chunk of my career, but it came at 287 00:13:59,480 --> 00:14:03,600 Speaker 3: a bit of a cost, because that's about suppression and 288 00:14:03,600 --> 00:14:06,520 Speaker 3: burying and compartmentalization. Everyone's different, right, the way they 289 00:14:06,559 --> 00:14:10,800 Speaker 3: react to this stuff. But the honest answer for me? No, 290 00:14:10,840 --> 00:14:13,680 Speaker 3: not entirely. And it got harder through my career, where 291 00:14:14,160 --> 00:14:18,160 Speaker 3: I was paid to reconstruct, right? But I was reconstructing 292 00:14:18,200 --> 00:14:20,520 Speaker 3: those last ten minutes of someone's life, and it was 293 00:14:20,640 --> 00:14:22,840 Speaker 3: really hard to get away from what happened to them 294 00:14:23,400 --> 00:14:25,720 Speaker 3: and how that might have been for them, you know. 295 00:14:25,920 --> 00:14:30,520 Speaker 3: And I mean, you walk into these houses, generally in impoverished 296 00:14:30,560 --> 00:14:35,160 Speaker 3: areas, where the living conditions are so horrible that, 297 00:14:35,320 --> 00:14:38,800 Speaker 3: you know, I've stepped over a body several times, almost 298 00:14:38,840 --> 00:14:42,000 Speaker 3: before noticing it, just looking at the environment, thinking, man, 299 00:14:42,200 --> 00:14:44,040 Speaker 3: if anyone makes it out of here, they 300 00:14:44,080 --> 00:14:46,480 Speaker 3: deserve a medal, you know. And so again we go 301 00:14:46,520 --> 00:14:49,600 Speaker 3: back to that compassion thing. But compassion and emotions at 302 00:14:49,600 --> 00:14:53,480 Speaker 3: a crime scene, and coping, sort of do clash. 303 00:14:53,760 --> 00:14:57,560 Speaker 2: Yeah. Like, this, this next question of mine, 304 00:14:58,120 --> 00:15:02,280 Speaker 2: feel free not to answer it. 
So you worked 305 00:15:02,280 --> 00:15:07,440 Speaker 2: on one hundred and sixty homicides, highly successful at what 306 00:15:07,480 --> 00:15:11,960 Speaker 2: you did, you're at the top of your field. Knowing 307 00:15:12,000 --> 00:15:15,520 Speaker 2: what you know now, would you go back and do 308 00:15:15,600 --> 00:15:17,800 Speaker 2: what you did again? Do you think you're a better 309 00:15:17,840 --> 00:15:20,520 Speaker 2: person for all of the experience that you've had, or 310 00:15:20,560 --> 00:15:26,040 Speaker 2: do you think that it's, on balance, damaged you more? 311 00:15:30,320 --> 00:15:32,680 Speaker 3: I would say no, I wouldn't go back and do 312 00:15:32,720 --> 00:15:37,080 Speaker 3: it again, because I did my, you know, twenty 313 00:15:37,160 --> 00:15:39,760 Speaker 3: two years at the front line, twenty three years in 314 00:15:39,800 --> 00:15:42,400 Speaker 3: that industry, and twenty five years now still as a 315 00:15:42,440 --> 00:15:47,200 Speaker 3: forensic scientist. I feel I can bring all of those 316 00:15:47,240 --> 00:15:53,200 Speaker 3: skills to another realm, which is the interpretation and analytical 317 00:15:53,200 --> 00:15:55,000 Speaker 3: work that I now do for the New Zealand courts. 318 00:15:55,400 --> 00:15:58,640 Speaker 3: So that served me well. And you know, the front 319 00:15:58,640 --> 00:16:04,000 Speaker 3: line is intense, and for me, I needed a limited 320 00:16:04,000 --> 00:16:08,120 Speaker 3: lifespan there. And again, everyone's different, but I would say 321 00:16:08,480 --> 00:16:12,600 Speaker 3: this is not a long, long career for me. 
It 322 00:16:12,640 --> 00:16:15,440 Speaker 3: has to change. So I don't think I would go 323 00:16:15,480 --> 00:16:17,440 Speaker 3: back and do it again, although I'm in a better 324 00:16:17,520 --> 00:16:20,120 Speaker 3: place now than I was when I started, because I've 325 00:16:20,160 --> 00:16:23,880 Speaker 3: actually been able to reflect, process, and have totally different tools. 326 00:16:24,080 --> 00:16:26,040 Speaker 4: So you're grateful you did it, but probably wouldn't want 327 00:16:26,040 --> 00:16:26,480 Speaker 4: to do it again. 328 00:16:27,320 --> 00:16:31,760 Speaker 3: Yeah. I think younger, more enthusiastic, other people who 329 00:16:31,840 --> 00:16:35,440 Speaker 3: can have their own experience of it should definitely take 330 00:16:35,480 --> 00:16:36,920 Speaker 3: up that mantle. And it's a good thing for a 331 00:16:36,960 --> 00:16:39,720 Speaker 3: scientist to say, you know. It's like, you know, I've 332 00:16:39,800 --> 00:16:43,160 Speaker 3: done this, I did it well, and now I don't 333 00:16:43,160 --> 00:16:44,080 Speaker 3: need to do it forever. 334 00:16:44,440 --> 00:16:47,000 Speaker 4: How do you go when you... like, so, for example, 335 00:16:47,320 --> 00:16:49,440 Speaker 4: my partner works in a completely different industry than me, 336 00:16:49,560 --> 00:16:51,080 Speaker 4: and I love that, because when I go home, she 337 00:16:51,120 --> 00:16:54,040 Speaker 4: doesn't care at all what happened at work. How do 338 00:16:54,080 --> 00:16:55,800 Speaker 4: you go when you, you know, when you go home 339 00:16:55,840 --> 00:16:58,440 Speaker 4: from a harrowing day on the work site? Do you 340 00:16:59,000 --> 00:17:00,760 Speaker 4: talk about it when you get 341 00:17:00,760 --> 00:17:02,320 Speaker 4: home, do you not talk about it, or how does 342 00:17:02,360 --> 00:17:02,720 Speaker 4: that work? 
343 00:17:03,760 --> 00:17:07,080 Speaker 3: Yeah, I definitely talked about it. But, I mean, just 344 00:17:07,119 --> 00:17:10,640 Speaker 3: to put it in context, my wife is a clinical psychologist, okay, 345 00:17:10,760 --> 00:17:16,320 Speaker 3: so, you know, she's worked for Corrections, 346 00:17:17,119 --> 00:17:21,080 Speaker 3: you know, and her work stories... you know, I 347 00:17:21,080 --> 00:17:23,560 Speaker 3: thought mine were bad. But, you know, I mean, 348 00:17:23,600 --> 00:17:26,000 Speaker 3: we could only go so far, because we 349 00:17:26,119 --> 00:17:29,199 Speaker 3: both hold confidential information, you know, and there's, you know, 350 00:17:29,240 --> 00:17:31,879 Speaker 3: all sorts of sensitivity there. So yes and no, 351 00:17:32,720 --> 00:17:35,040 Speaker 3: we could talk around it. But, you know, I mean, 352 00:17:35,520 --> 00:17:37,440 Speaker 3: part of my lack of skill back in the day 353 00:17:37,640 --> 00:17:40,399 Speaker 3: was actually not talking enough, you know, and just going, nah, 354 00:17:40,440 --> 00:17:42,520 Speaker 3: I'm not going to bring this home. Yeah, I'm stoic, 355 00:17:42,600 --> 00:17:45,320 Speaker 3: I'm resilient, I've got this. My kids don't need to 356 00:17:45,400 --> 00:17:47,439 Speaker 3: know anything, my wife doesn't need to know anything. But, 357 00:17:47,520 --> 00:17:48,639 Speaker 3: you know, that stuff builds up. 358 00:17:49,000 --> 00:17:51,320 Speaker 2: Well, in the end you ended up with PTSD, didn't 359 00:17:51,359 --> 00:17:55,280 Speaker 2: you? You describe it as a slow erosion into PTSD. 360 00:17:55,720 --> 00:17:57,159 Speaker 3: Yeah, it was like watching a bit of a 361 00:17:57,160 --> 00:17:59,160 Speaker 3: train wreck, you know, once I could actually be more 362 00:17:59,200 --> 00:18:01,840 Speaker 3: conscious of it. It's like, where is this all going?
363 00:18:02,760 --> 00:18:04,800 Speaker 3: And what am I doing to the people around me? 364 00:18:04,880 --> 00:18:08,000 Speaker 3: You know, I became very controlling, irritable. You know, it wasn't 365 00:18:08,040 --> 00:18:10,399 Speaker 3: like I was on... you know, other people may have 366 00:18:10,480 --> 00:18:13,159 Speaker 3: abused substances or whatever it was, but, you know, my 367 00:18:13,280 --> 00:18:16,440 Speaker 3: avoidance skills and my suppression just felt like the same thing. 368 00:18:16,480 --> 00:18:19,480 Speaker 3: You know, it was easier to stay on that than 369 00:18:19,600 --> 00:18:22,119 Speaker 3: try and step out of it, because it 370 00:18:22,240 --> 00:18:24,119 Speaker 3: was such a deep coping mechanism. 371 00:18:24,320 --> 00:18:26,960 Speaker 2: It's a full-on job. And I would have thought 372 00:18:27,040 --> 00:18:30,440 Speaker 2: that anybody doing what you were doing would have huge 373 00:18:30,520 --> 00:18:35,040 Speaker 2: levels of support around them. Do they offer 374 00:18:35,240 --> 00:18:37,800 Speaker 2: a huge amount of support to people doing your kind of work? 375 00:18:37,920 --> 00:18:42,639 Speaker 3: Yeah, I mean, there are definitely services there; there's mandatory 376 00:18:42,680 --> 00:18:49,320 Speaker 3: and voluntary sort of therapeutic options. It's just that, you know, 377 00:18:49,359 --> 00:18:53,800 Speaker 3: one aspect is the material you're dealing with, 378 00:18:53,840 --> 00:18:57,600 Speaker 3: the other aspect is the constant grind of the case workload, 379 00:18:57,880 --> 00:19:01,200 Speaker 3: so you can barely take a break just to breathe, 380 00:19:01,320 --> 00:19:05,080 Speaker 3: you know. And so, you know, my coping mechanism 381 00:19:05,200 --> 00:19:08,520 Speaker 3: was performance and resilience and just punching through.
So I 382 00:19:08,560 --> 00:19:11,720 Speaker 3: didn't give myself that time to actually look at those 383 00:19:12,000 --> 00:19:16,560 Speaker 3: options properly, you know. And again, everyone's different, 384 00:19:16,600 --> 00:19:20,840 Speaker 3: but that's challenging to manage, you know. 385 00:19:20,920 --> 00:19:24,760 Speaker 3: And most industries are just doing their best, but, 386 00:19:24,960 --> 00:19:26,560 Speaker 3: you know, it's kind of like, how do you train 387 00:19:26,640 --> 00:19:29,280 Speaker 3: for that one? And someone's got to do it too, 388 00:19:29,600 --> 00:19:31,920 Speaker 3: you know. And, you know, how do you support that? 389 00:19:32,040 --> 00:19:35,520 Speaker 4: Yeah. How many people do that kind of thing in 390 00:19:35,560 --> 00:19:37,760 Speaker 4: New Zealand? Because, you know, we hear so much about 391 00:19:37,760 --> 00:19:41,520 Speaker 4: how under-resourced all the various different government functions are. Is 392 00:19:41,560 --> 00:19:43,159 Speaker 4: that one of them as well? How many people do 393 00:19:43,240 --> 00:19:43,840 Speaker 4: what you did? 394 00:19:44,359 --> 00:19:47,640 Speaker 3: So frontline forensic services across New Zealand, that would 395 00:19:47,680 --> 00:19:51,719 Speaker 3: be two different levels, technicians and scientists: approximately 396 00:19:51,800 --> 00:19:56,040 Speaker 3: fifty people. So that's, you know, dedicated crime scene labs, 397 00:19:56,080 --> 00:19:59,680 Speaker 3: you've got firearms scenes, and you've got clandestine laboratories, and 398 00:19:59,720 --> 00:20:01,840 Speaker 3: you've got a whole bunch of support staff, you know, 399 00:20:01,880 --> 00:20:04,160 Speaker 3: that vicariously take that stuff on as well. 400 00:20:04,520 --> 00:20:08,399 Speaker 4: Is it enough? Nah?
401 00:20:08,640 --> 00:20:11,480 Speaker 3: You know, I would say it's, you know... 402 00:20:11,520 --> 00:20:16,000 Speaker 3: it's almost arguably serviceable. But it's like, people need a 403 00:20:16,000 --> 00:20:18,080 Speaker 3: break, you know, and that's true for so 404 00:20:18,200 --> 00:20:20,320 Speaker 3: many industries. You know, yeah, it's just getting by, and 405 00:20:20,359 --> 00:20:22,919 Speaker 3: it's just like, okay, can I breathe? No, I've just 406 00:20:22,920 --> 00:20:25,320 Speaker 3: got to carry on. Yeah. So I would say it's 407 00:20:25,359 --> 00:20:27,680 Speaker 3: not enough. I mean, and it's probably true for all 408 00:20:27,680 --> 00:20:30,760 Speaker 3: frontline services: police, ambulance, fire service. 409 00:20:31,520 --> 00:20:35,280 Speaker 2: Were you traveling around? So a particular crime occurs 410 00:20:35,280 --> 00:20:37,159 Speaker 2: and then you were... 411 00:20:37,000 --> 00:20:39,240 Speaker 3: Based in... I was based here in Auckland, and... 412 00:20:40,080 --> 00:20:43,280 Speaker 2: Then everyone that's part of that gets flown to the place 413 00:20:43,320 --> 00:20:45,640 Speaker 2: where it is? Because, I mean, there's not that many 414 00:20:45,680 --> 00:20:47,959 Speaker 2: homicides in New Zealand. What is there, fifty a year or 415 00:20:48,000 --> 00:20:49,280 Speaker 2: maybe... yeah, there's... 416 00:20:49,119 --> 00:20:52,960 Speaker 3: There aren't that many. Yeah, between fifty and seventy. And yeah, 417 00:20:53,000 --> 00:20:58,480 Speaker 3: so we covered from basically the Kaimanawa National Park-ish 418 00:20:58,520 --> 00:21:02,680 Speaker 3: area north, so sixty to seventy percent of the population, yep, 419 00:21:03,080 --> 00:21:05,919 Speaker 3: and there were six or seven scientists servicing that. 420 00:21:05,960 --> 00:21:07,879 Speaker 2: Does that equate to sixty or seventy percent of 421 00:21:07,920 --> 00:21:09,399 Speaker 2: the homicides?
422 00:21:08,880 --> 00:21:11,719 Speaker 3: Of crime as well, maybe a little more. Yeah. 423 00:21:11,760 --> 00:21:16,840 Speaker 3: So, you know... so yeah, we'd never fly. 424 00:21:17,119 --> 00:21:20,240 Speaker 3: We'd just bung whatever we could in the ute and 425 00:21:20,240 --> 00:21:22,440 Speaker 3: drive. So you'd have four or five or six hour 426 00:21:22,520 --> 00:21:23,240 Speaker 3: road trips 427 00:21:22,960 --> 00:21:26,200 Speaker 4: sometimes. And then you're staying in like a motel somewhere? 428 00:21:26,000 --> 00:21:28,200 Speaker 3: Yeah, or, you know, if you're lucky, you would 429 00:21:28,240 --> 00:21:30,200 Speaker 3: make your own choice. Otherwise the police might put you 430 00:21:30,280 --> 00:21:31,720 Speaker 3: up in some lodge or something. 431 00:21:31,760 --> 00:21:33,880 Speaker 2: And get billeted. 432 00:21:34,080 --> 00:21:37,919 Speaker 3: Yeah, yeah, yeah. So, you know, I've been on police 433 00:21:37,960 --> 00:21:41,240 Speaker 3: launches and stayed in shacks. I've done all sorts of 434 00:21:41,240 --> 00:21:41,880 Speaker 3: weird stuff. 435 00:21:42,560 --> 00:21:44,040 Speaker 2: Maybe that contributed to the PTSD. 436 00:21:44,080 --> 00:21:47,160 Speaker 4: Yeah, yeah, absolutely. So you have this harrowing day at work, 437 00:21:47,240 --> 00:21:49,640 Speaker 4: and you've got to go home and lie in a bunk next 438 00:21:49,640 --> 00:21:51,800 Speaker 4: to... yeah, Jared is snoring next to you. 439 00:21:51,960 --> 00:21:54,000 Speaker 3: Well, not only that, but you've been sitting 440 00:21:54,000 --> 00:21:57,720 Speaker 3: next to Jared for five hours. Yeah, you know, in 441 00:21:57,840 --> 00:22:00,280 Speaker 3: your ute, trying to make conversation. And that's 442 00:22:00,280 --> 00:22:02,760 Speaker 3: a whole different level of stress. You know, it's like, 443 00:22:03,080 --> 00:22:03,840 Speaker 3: what are you talking about?
444 00:22:03,880 --> 00:22:04,160 Speaker 4: Guys? 445 00:22:04,960 --> 00:22:07,359 Speaker 2: Yeah, we got sent a text earlier on, because we 446 00:22:07,400 --> 00:22:09,520 Speaker 2: do a segment on our show called Lame Claims to Fame. 447 00:22:09,520 --> 00:22:13,960 Speaker 2: We got sent a text from someone who claimed their lame 448 00:22:14,000 --> 00:22:19,120 Speaker 2: claim to fame was that they sold the pie, 449 00:22:19,400 --> 00:22:21,520 Speaker 2: the pie wrapper that was found in Mark 450 00:22:21,600 --> 00:22:25,159 Speaker 2: Lundy's vehicle. They sold the pie to Mark Lundy. 451 00:22:26,600 --> 00:22:30,000 Speaker 2: Lame claim to fame. That may have ended up as 452 00:22:30,080 --> 00:22:34,840 Speaker 2: the evidence, and potentially his shirt or whatever, his jumper 453 00:22:34,960 --> 00:22:36,080 Speaker 2: or whatever that... 454 00:22:36,119 --> 00:22:37,520 Speaker 3: Was it brain matter? Was it a pie? 455 00:22:37,720 --> 00:22:38,640 Speaker 4: Yeah? 456 00:22:38,880 --> 00:22:41,040 Speaker 2: Do you know much about that case? 457 00:22:41,680 --> 00:22:45,240 Speaker 3: A little, yeah, yeah. I mean, what I know is that, 458 00:22:45,480 --> 00:22:48,600 Speaker 3: you know, the big controversy there was novel science. You know, 459 00:22:48,680 --> 00:22:52,080 Speaker 3: it's like, you know, just because you can perform an 460 00:22:52,080 --> 00:22:56,960 Speaker 3: experiment and establish that something is consistent with something else, 461 00:22:57,440 --> 00:22:59,880 Speaker 3: there's no backing to that. You know, you haven't validated 462 00:22:59,920 --> 00:23:03,240 Speaker 3: the process, it hasn't been peer reviewed, it's not internationally accepted. 463 00:23:03,480 --> 00:23:05,879 Speaker 3: All that sort of stuff comes into play. And 464 00:23:05,920 --> 00:23:07,560 Speaker 3: that's very similar to a lot of the things that 465 00:23:07,560 --> 00:23:09,080 Speaker 3: we do at work.
It's kind of like, you know, 466 00:23:09,119 --> 00:23:11,280 Speaker 3: oh no, my gut says, I reckon... you know, 467 00:23:11,280 --> 00:23:13,280 Speaker 3: there's that sort of stuff that creeps in. But 468 00:23:13,359 --> 00:23:15,040 Speaker 3: it's like, how do you bring the science back in? 469 00:23:15,200 --> 00:23:17,320 Speaker 3: Especially when, like, for me, a lot of the 470 00:23:17,359 --> 00:23:21,200 Speaker 3: experimental work I did was smacking, you know, mannequins filled 471 00:23:21,200 --> 00:23:23,119 Speaker 3: with blood around the head, you know, and seeing what 472 00:23:23,200 --> 00:23:26,320 Speaker 3: sort of blood stains emanated from that. And it's like, yeah, 473 00:23:26,359 --> 00:23:28,399 Speaker 3: it looks the same as what I saw previously, so 474 00:23:28,480 --> 00:23:31,320 Speaker 3: it probably is. Yeah, yeah, yeah. And you can build 475 00:23:31,400 --> 00:23:36,240 Speaker 3: up this so-called backstory of expertise. Yeah, but is 476 00:23:36,240 --> 00:23:39,520 Speaker 3: it, unless you do some real grounded science and experimentation? 477 00:23:39,960 --> 00:23:42,159 Speaker 4: Yeah. And I suppose they'd always be able to 478 00:23:42,200 --> 00:23:44,719 Speaker 4: pick some sort of hole in anything you do. And 479 00:23:45,359 --> 00:23:47,439 Speaker 4: are you aware of that while you're doing it? 480 00:23:47,720 --> 00:23:49,119 Speaker 4: Are you thinking in the back of your mind, well, 481 00:23:49,119 --> 00:23:50,439 Speaker 4: they're going to try and pick a hole in this, 482 00:23:50,560 --> 00:23:51,880 Speaker 4: or I need to make sure I do this because 483 00:23:51,880 --> 00:23:52,080 Speaker 4: of the... 484 00:23:52,600 --> 00:23:55,080 Speaker 3: Yeah, it's impossible to get away from: what's going to 485 00:23:55,119 --> 00:23:57,320 Speaker 3: trip me up here?
Yeah, you know, what little 486 00:23:57,680 --> 00:24:00,239 Speaker 3: niggly thing are they going to focus on? And how 487 00:24:00,320 --> 00:24:02,719 Speaker 3: is it all going to come undone? And, you know, 488 00:24:02,760 --> 00:24:06,679 Speaker 3: we also go quite conservative with our conclusions, so it's 489 00:24:06,760 --> 00:24:10,920 Speaker 3: kind of like, could have, in my opinion, I can't 490 00:24:10,960 --> 00:24:14,840 Speaker 3: exclude the possibility... all that fence-sitting scientist talk, you 491 00:24:14,880 --> 00:24:17,720 Speaker 3: know. That drives lawyers nuts. Yeah, but we're 492 00:24:17,720 --> 00:24:19,600 Speaker 3: trying to protect our integrity, and, you know... 493 00:24:19,640 --> 00:24:21,080 Speaker 4: Yeah, for sure, you don't know. 494 00:24:21,240 --> 00:24:22,280 Speaker 2: I mean, what do we know? 495 00:24:22,440 --> 00:24:25,919 Speaker 3: No, no. Even DNA analysis, you know, it 496 00:24:26,080 --> 00:24:30,680 Speaker 3: feels like it's conclusive. But unless you know the DNA 497 00:24:30,760 --> 00:24:34,119 Speaker 3: profile of every single person in the world, you can't 498 00:24:34,200 --> 00:24:37,239 Speaker 3: ever say that that DNA profile could not have been 499 00:24:37,240 --> 00:24:40,199 Speaker 3: shared by somebody else, you know. So we 500 00:24:40,280 --> 00:24:43,720 Speaker 3: talk about, you know, factors like one hundred thousand million million 501 00:24:43,760 --> 00:24:46,240 Speaker 3: million times more likely to be X than Y, you know, 502 00:24:47,200 --> 00:24:49,560 Speaker 3: and that sounds ludicrous. Like, you know, 503 00:24:49,800 --> 00:24:51,560 Speaker 3: you're just saying he's the dude, he's the guy. 504 00:24:51,720 --> 00:24:55,480 Speaker 4: Yeah, yeah, but you can't, because you can't exclude the... Yeah.
505 00:24:55,480 --> 00:24:57,639 Speaker 4: The other one I know they talked about in recent 506 00:24:57,680 --> 00:25:00,720 Speaker 4: trials is the 'such and such is consistent with'. 507 00:25:00,520 --> 00:25:03,399 Speaker 3: It's really unhelpful. What does that actually mean? 508 00:25:03,640 --> 00:25:06,159 Speaker 2: I think for the juries as well. It's a bit like 509 00:25:06,280 --> 00:25:09,520 Speaker 2: beyond reasonable doubt. Yeah, a very hard thing for the jury 510 00:25:09,560 --> 00:25:09,920 Speaker 2: to get there. 511 00:25:10,240 --> 00:25:13,880 Speaker 3: It's largely meaningless, and it just leaves the sort 512 00:25:13,920 --> 00:25:16,440 Speaker 3: of... the conclusion up to the jury, you know. And 513 00:25:16,440 --> 00:25:19,040 Speaker 3: the other thing is, it's very different if 514 00:25:19,080 --> 00:25:22,240 Speaker 3: you have a... like, research has shown that the mere 515 00:25:22,280 --> 00:25:25,800 Speaker 3: presence of a forensic scientist in court delivering evidence amps 516 00:25:25,840 --> 00:25:29,119 Speaker 3: that up, even if that evidence is not material to 517 00:25:29,160 --> 00:25:32,000 Speaker 3: the question being asked in trial, you know. So it's 518 00:25:32,040 --> 00:25:36,600 Speaker 3: really important that, you know, if something's not critical, 519 00:25:36,640 --> 00:25:39,560 Speaker 3: a forensic scientist isn't demonstrating it, because people wake up 520 00:25:39,560 --> 00:25:41,719 Speaker 3: when they see the CSI coming. Yeah, you know, I've 521 00:25:41,760 --> 00:25:43,720 Speaker 3: seen juries, like, suddenly wake up because I'm talking about 522 00:25:43,720 --> 00:25:46,680 Speaker 3: blood or something else. You know, it's like, oh... 523 00:25:47,000 --> 00:25:49,240 Speaker 4: They'll believe anything just because, you know, 524 00:25:49,880 --> 00:25:51,040 Speaker 4: it brings a weight with them.
525 00:25:52,359 --> 00:25:57,719 Speaker 2: Ruben Miller. The Blood Says Otherwise: Murder, Forensics and Hidden Truths. 526 00:25:57,800 --> 00:26:00,160 Speaker 2: It's a book that's in stores today. Thank 527 00:26:00,119 --> 00:26:02,840 Speaker 2: you so much for your time. It's been absolutely fascinating. 528 00:26:02,880 --> 00:26:04,639 Speaker 2: Best of luck, I hope the book goes well for you. 529 00:26:04,880 --> 00:26:06,760 Speaker 3: Thank you so much, guys, I really appreciate it. 530 00:26:06,960 --> 00:26:09,600 Speaker 1: Jerry and Manaia, catch the radio show from six to 531 00:26:09,760 --> 00:26:11,920 Speaker 1: ten weekdays, The Hauraki Breakfast