Speaker 1: Can you measure pedophilia in a brain scan? Can you measure a lie from somebody's blood pressure? And how should a judge in court, who's not an expert in science, decide these things? What does any of this have to do with President Ronald Reagan, or antisocial personality disorder, or how the television show CSI has impacted courtrooms? Welcome to Inner Cosmos with me, David Eagleman. I'm a neuroscientist and an author at Stanford, and I've spent my career at the intersection of our brains and our lives. Today's episode is about an aspect of the intersection between brains and the legal system, and it's a tricky one. The question is when neuroscience techniques are allowed in courts. When should they be allowed? What bars need to be passed for a technology to be accepted?
Speaker 1: So let's start on March thirtieth, nineteen eighty one, when the President of the United States, Ronald Reagan, has just delivered a speech, and afterwards he and his team are returning to his limousine. He gives a big two-armed wave to the crowd, and suddenly there are gunshots ringing out and everyone's diving. President Reagan is hit with a ricochet off his limousine, and Press Secretary James Brady falls, and Secret Service agent Tim McCarthy falls, and a DC police officer named Thomas Delahanty is also wounded. The President arrives at the emergency room in critical condition and almost dies. And for those of you who weren't alive in nineteen eighty one, or for whom this has receded in memory, just try to picture the horror that this entailed. Now you may remember that the gunman, John Hinckley, had a deep psychosis. He was divorced from reality, and he believed that if he shot the president, he would win the love of the actress Jodie Foster.
Speaker 1: There's a lot to say about this case, and in episodes thirty six and thirty seven I talked about the insanity defense, but here I want to zoom in on a very particular aspect. The thing most salient to us today was the fact that this was the first high-profile case to use a form of brain imaging. Hinckley's lawyers pled not guilty by reason of insanity, and to support their defense, they introduced brain imaging evidence. His defense counsel argued that he was schizophrenic, and they argued they could prove this by showing CAT scans, or CT scans. CT stands for computed tomography, and CAT for computerized axial tomography. Now, the lawyers on both sides agreed that CAT scans had never before been admitted as evidence in a courtroom. Neuroimaging was brand new at this time. So should the judge allow this newfangled technology to be accepted or not? Well, it's not obvious. Can you really tell if someone suffers from schizophrenia just by looking at an anatomical picture of the brain? It's not obvious.
Speaker 1: So the judge decided to dismiss the jury so that he could hear the arguments about whether or not the technology was relevant and should be admitted. An expert witness, a physician, pointed out that Hinckley's sulci, which are the valleys running along the outside of the brain, were wider than average, and this physician cited a paper suggesting a connection between schizophrenia and wider sulci. So the assertion was: if you have schizophrenia, you can see that just by looking at a CAT scan of the brain. This doctor said, quote, "The fact that one third of schizophrenic participants in the study had these widened sulci, whereas in normals probably less than one out of fifty have them, that is a very powerful fact." But the prosecution rebutted this. They said, no way, it has not been proven that a CAT scan can aid in the diagnosis of schizophrenia, and therefore this evidence should not be presented to the jury. In other words, they argued the technology should be excluded from the courtroom because it was not yet ready for prime time.
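For the quantitatively minded, it's worth seeing how much probative value numbers like these actually carry. Here is a quick back-of-the-envelope Bayes' rule sketch: the one-third and one-in-fifty figures come from the testimony above, while the one-percent population prevalence of schizophrenia is an assumed round number for illustration, not something from the trial.

```python
# Back-of-the-envelope Bayes' rule check on the testimony's numbers.
# P(widened sulci | schizophrenia) = 1/3
# P(widened sulci | no schizophrenia) = 1/50
p_sulci_given_scz = 1 / 3
p_sulci_given_not = 1 / 50
prevalence = 0.01  # assumed base rate of schizophrenia, roughly 1%

# Posterior probability of schizophrenia given a scan with widened sulci:
numerator = p_sulci_given_scz * prevalence
denominator = numerator + p_sulci_given_not * (1 - prevalence)
posterior = numerator / denominator

print(f"P(schizophrenia | widened sulci) = {posterior:.2f}")
# Even a likelihood ratio of roughly 17 to 1 leaves the posterior well
# under fifty percent at this base rate: the "very powerful fact" is
# weaker than it sounds once base rates enter the picture.
```

Try other prevalence values and the posterior swings dramatically, which is exactly why a judge cannot read probative value off a single impressive-sounding ratio.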
Speaker 1: The judge listened to the arguments, and he finally decided that he would not admit the CAT scan. Then, nine days later, he heard more expert testimony and he confirmed that he would not admit the CAT scan. And then he changed his mind and announced he would admit the CAT scan. Okay, so what does this back and forth illustrate? It illustrates the difficulty for a judge of deciding what makes meaningful evidence and what does not. In the end, Hinckley was found not guilty by reason of insanity, although that had little or nothing to do with the CAT scan. But this high-profile case is just one of hundreds where this question comes up: should neuroimaging be allowed in the courtroom? There's no single answer to this question, and in part that's because there are many different guises in which it comes up. And so that's what we're going to talk about today. We're going to talk about how any technology gets into courtrooms. So to motivate this, imagine that we start seeing advertisements for a new Silicon Valley company that has developed a new mind-reading technology.
Speaker 1: They call this the Palo Alto three thousand, and they strap it to your head and they measure some brain waves and pass that through a large language model, and they print out in words to a screen what you are thinking. So you might be thinking about wanting a hot dog with pickles, and this machine will print to the screen, "I want a hot dog with pickles." Now, this is totally made up, but pretend it's true: a few years from now, said company launches, and let's say the technology looks pretty good. It captures the gist of what you are thinking about. Now, the question is: should this be admissible in a court of law? Let's imagine that someone puts it on and states under oath that on April twenty fifth he was getting dinner with his family, but suddenly the screen prints out, "I committed the crime." Now, how do you know whether to believe that or not? The company was started by a handful of young people who dropped out of college, and they claim to be experts in neuroscience. But how do you know whether it really works?
Speaker 1: And especially in a high-stakes situation, should you accept this in a court of law or not? Well, some people, in order to judge the quality of the technology, ask: are they charging for this technology? But that's not a meaningful measure. Of course they're charging. They can't develop new technologies for free, any more than you would expect Apple not to charge for their laptops. But the fact that they're charging certainly doesn't rule in or out anything about its efficacy. So how do you know whether the technology is efficacious? Can it be used in a court of law? How do you know whether it works and provides what the legal system calls probative value, which means: can it do what it's supposed to do? Can it provide sufficiently useful evidence to prove something in a trial? So this is what we're going to talk about today. Most of the time we don't realize that new technologies always have to be assessed by courtrooms to know whether they should be accepted or rejected. And some get in, and then we take them as background furniture, and others never make it.
Speaker 1: And what we're going to see today is how and why. So fast forward some decades from the Hinckley trial. Where are we now? What is allowed in the courtroom? Well, we have more sophisticated technologies to image the brain now. For example, we can get a picture of the brain in an MRI scan. Magnetic resonance imaging, MRI, gives you a snapshot of what the brain of a person looks like. You're not seeing activity there; you're just seeing the anatomy. Think of this as analogous to the way you would look at someone's skeleton with an X-ray. You can't see anything moving around; what you see is a snapshot. So with MRI you can hope to see abnormalities like a tumor, or evidence of a stroke, or the consequences of a traumatic brain injury. Now, I've been called by many defense lawyers over the years who say, I have a client who's going up for trial. Can you take a brain scan and see if you can find something wrong with their brain, so this can serve as a mitigating factor? But I always tell them the same thing.
Speaker 1: If you find something wrong with your client's brain, that can serve as a double-edged sword. The jury might think, okay, I'm convinced there's something different about this man's brain. But this presumably means he'll be predisposed to committing this kind of crime again, so we should probably lock him up for a longer time. So a defense lawyer has to utilize this argument with care. In any case, what MRI gives you is an anatomical snapshot. And now I want to tell you about the next level of technology, called fMRI, where the F stands for fancy MRI. Okay, I'm kidding. It stands for functional magnetic resonance imaging, fMRI. And this is because it's telling you about the function of the brain. It's measuring blood flow to show you where the activity in the brain just was. This works because when brain cells are active, they consume energy, and the blood flow to that specific region needs to increase so that fresh oxygenated blood can be brought to the area to restore the used energy.
Speaker 1: So in fMRI, we see where the new oxygenated blood is going, and we say, aha, there must have just been some activity there a few seconds ago. So that's the difference between an anatomical snapshot and a functional picture of what's going on. Now, part of the reason that you can use the static snapshot, the MRI, in court is because it's generally seen as hard science: this is the guy's brain. But when we're talking about fMRI, what we're looking at is the activity in the brain, and we're generally asking something about the person's mental state. And can that be the same kind of hard science? On the one hand, it's a clear question with a clear answer if someone has a stroke or a brain tumor. But this isn't the case if you want to pose a question like: did this defendant intend to kill the victim? fMRI doesn't and can't give clear answers to questions like that, the questions that are useful for the legal system. So we're going to dig into this now. First, let's start with the question of whether fMRI has been used in courts. The answer is yes. But the technology can be used in different ways.
Speaker 1: It doesn't always have to involve an individual's brain, but can sometimes be about brains in general. So let me give you an example. There was a murder case in Missouri where a young man named Christopher Simmons broke into the home of a woman named Shirley Crook. He covered her eyes and mouth with duct tape, he bound her hands together, and then he drove her to a state park and threw her off a bridge to her death. Now, this was a premeditated crime, and the evidence was overwhelming, and he admitted to the murder. So the judge and jury handed down a death sentence. But there was a complication: Christopher Simmons was only seventeen years old at the time he committed the crime. And so this case went all the way up to the United States Supreme Court, and the question was: can you execute someone who was under the age of eighteen when they committed the crime? After all, the argument goes, adolescence is characterized by poor decision making, and young people should have the chance to grow up into a different life.
Speaker 1: Well, one of the things that happened in his case is that the Supreme Court considered fMRI evidence. Now, this wasn't from Simmons's brain in particular, but from adolescents in general. The study compared young people and adults performing the same cognitive tasks, and what the researchers found, not surprisingly, is that young brains are not doing precisely the same thing as older brains. There are measurable differences. A juvenile's brain just isn't the same thing as an adult's. So the Supreme Court justices saw this evidence, considered it, and presumably this is part of what led the court to conclude that it is unconstitutional to execute someone for a crime committed as a minor. Now, that's an example of fMRI making it into the court. It's been used in this way to compare groups of people, juveniles versus adults in this case. But things get a little trickier when you're trying to say something about an individual's brain, the brain of the one guy standing in front of the bench. So what can we and can we not say with the technology? Let's zoom in on a few examples.
Speaker 1: Many researchers and legal minds have been asking whether one can use brain imaging to diagnose whether someone has antisocial personality disorder, which is a condition in which a person has a long-term pattern of manipulating, exploiting, and violating other people. People with antisocial personality disorder, or ASPD, they'll commit crimes, they'll flout rules, they'll act impulsively and aggressively, they'll lie and cheat and steal. Now, this is a condition that is massively overrepresented in the prison population. But biologically it's not obvious what it's about. There's no single gene here, and there's not a single environmental factor. It's a complicated combination. And the legal system often cares to know whether someone has ASPD or not. And so researchers started to wonder a long time ago: could you use brain imaging to determine, in some clear categorical way, does this person have ASPD or not? So in one study, researchers highlighted the brain regions that had a high probability of being anatomically different between people with ASPD and those without.
Speaker 1: And you can look in the cortex, what's called the gray matter, or below the cortex, what's called the white matter, and you can measure these small anatomical differences between those with and without. So the question arose: can you use this technology in court as a diagnostic tool to say that this person has ASPD or not? Now, do you see any problems, off the top of your head, with whether this technology can be used? The problem is that all the scientific results come about from examining groups of people, like fifty people in each group, and the question is whether these group differences are strong enough to tell you about individual differences. So this is known as the group-to-individual problem. In other words, you have data from groups of people that can be distinguished on average, but you're trying to say something about this individual. It would be like making an accurate statement that men on average are taller than women, and then asking whether some individual, like a tall woman, could be categorized as a man because her height clocks in at the average male's.
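The height analogy can be made concrete with a small simulation. The means and standard deviation below are rough illustrative values I'm assuming, not data from any of the brain imaging studies; the point is only that two groups can differ cleanly on average while any single measurement classifies individuals poorly.

```python
import random

random.seed(0)

# Rough illustrative values (assumed): adult heights in centimeters.
MALE_MEAN, FEMALE_MEAN, SD = 175.0, 162.0, 7.0
N = 100_000

males = [random.gauss(MALE_MEAN, SD) for _ in range(N)]
females = [random.gauss(FEMALE_MEAN, SD) for _ in range(N)]

# The group averages separate cleanly...
avg_male = sum(males) / N
avg_female = sum(females) / N
print(f"average male {avg_male:.1f} cm, average female {avg_female:.1f} cm")

# ...but classifying any one individual by the midpoint threshold
# still errs on a large fraction of people, because the two
# distributions overlap heavily.
threshold = (MALE_MEAN + FEMALE_MEAN) / 2
errors = sum(h < threshold for h in males) + sum(h >= threshold for h in females)
print(f"misclassified individuals: {100 * errors / (2 * N):.1f}%")
```

With these assumed numbers, the averages differ by thirteen centimeters, yet roughly one person in six lands on the wrong side of the best single cutoff. That gap between a clean group difference and a noisy individual call is the group-to-individual problem in miniature.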
Speaker 1: The legal system is well aware of this group-to-individual problem, and so as technologies are introduced, the justice system always needs to ask: how specific is this technology, and how sensitive is it? Is it good enough for individual diagnosis? Brain imaging studies generally just give us group-average results, and the question is whether that tells us enough, or anything at all, about the person who's standing in front of the bench right now. Now, the idea of bringing functional brain imaging to bear on questions of criminal behavior is an old one, and this group-to-individual problem is just as old. For example, there was a study in nineteen ninety seven where researchers imaged the brains of normal participants and murderers, and they found that, on average, there was less activity in the frontal lobes in murderers. So you look at the activity in the front of the brain, behind the forehead, and you say, hey, on average there's less going on here in the murderer group. But you can't use this on an individual. You can't say, oh, this person has less activity, so he must have been the murderer.
Speaker 1: In other words, it has no power in a court of law. You still face the problem of trying to say anything about an individual from a group average. And so it's for reasons like this that brain imaging on individuals has not gotten very far in courtrooms. Let me give one more example. Another research group used brain imaging, fMRI, to see if they could identify pedophiles. They found twenty four pedophiles and thirty four controls, and they showed them images of naked men and women and boys and girls. And what they found is that they could, on average, separate the participants who were pedophiles from the participants who were not. In other words, the pedophilic brain shows a subtly different signature of brain activity than the non-pedophilic brain when shown these pictures. It turns out that heterosexual versus homosexual orientation seems to be distinguishable as well. So you might think that sounds quite useful for the legal system, but when scientists and legal scholars take a closer look, it's not as clear. The first question is: what are these brain signals actually measuring?
Speaker 1: The assumption is that it's measuring a state of arousal, like sexual attraction. But what else might be going on? Well, the difference in brain signals could be driven by a stress response or an anxiety response by the pedophilic participants, who know they're being measured. Or perhaps what you're seeing is a measure of disgust by the non-pedophilic group, who know the purpose of the study and don't like gazing at pictures of children in this context. Or what if the pedophilic participants were just slightly more likely to avert their eyes, because of shame or not wanting to get measured? That would cause a statistical difference in the brain signals and could, in theory, explain the results. So there are lots of things that could yield this brain imaging result of a difference between the two groups, beyond the hypothesis that it's just measuring arousal. Stress, anxiety, disgust, shame: all these things might be what's getting measured here. And part of why this matters is because there are many brain imaging measures where it turns out it's easy to manipulate the results.
Speaker 1: So let's say you are a pedophile who doesn't want to be labeled as such. Can you purposely move your eyes whenever you see a picture of children, so that it messes up the ability of the scanner to measure something? If something can be faked or messed up, then the technology is useless. But let's say, for argument's sake, that you have a technology that can't be faked or manipulated. That allows us to move on to the second point. Let's say you don't even care what's getting measured, whether stress or anxiety or whatever. All you care to know is whether there is a neural signature that can distinguish the pedophiles from the non-pedophiles, irrespective of what is causing that signal. Well, there's also a legal problem here, which is that it's not illegal for a person to be attracted to children. It is only illegal if they act on that. All that matters legally is whether you have committed a crime or not, not whether you are attracted to children. So you can think about whatever attracts you, ostriches or jello or whatever, as long as you don't commit an illegal act.
So whether 331 00:21:57,200 --> 00:22:00,760 Speaker 1: you're talking about ASPD or murderers or pedophiles, you'll 332 00:22:00,800 --> 00:22:04,159 Speaker 1: see that measuring something that matters for a court of 333 00:22:04,240 --> 00:22:08,959 Speaker 1: law isn't as straightforward as it might have originally seemed. 334 00:22:09,760 --> 00:22:12,800 Speaker 1: So now let's return to the Palo Alto three thousand. 335 00:22:13,480 --> 00:22:18,080 Speaker 1: The question is just because the company claims that it functions, well, 336 00:22:18,560 --> 00:22:20,679 Speaker 1: how do you know whether or not to admit it 337 00:22:20,720 --> 00:22:23,720 Speaker 1: into the courtroom? After all, remember what I said about 338 00:22:23,760 --> 00:22:27,320 Speaker 1: the John Hinckley case, how CAT scans were admitted into 339 00:22:27,320 --> 00:22:31,000 Speaker 1: the court to argue that he had schizophrenia. Well, it's 340 00:22:31,080 --> 00:22:34,480 Speaker 1: now known that widened sulci in the brain have 341 00:22:34,600 --> 00:22:39,639 Speaker 1: no relationship to schizophrenia. There are other better anatomical signatures 342 00:22:39,640 --> 00:22:42,560 Speaker 1: that we have now, like thinner cortices in the frontal 343 00:22:42,600 --> 00:22:46,960 Speaker 1: and temporal lobes and shrunken thalami. But it turned out 344 00:22:47,400 --> 00:22:49,880 Speaker 1: that the idea of widened sulci just didn't 345 00:22:49,920 --> 00:22:52,720 Speaker 1: hold up. Now, there was nothing fraudulent going on with 346 00:22:52,800 --> 00:22:55,600 Speaker 1: the claim. It was just a new technology at the 347 00:22:55,680 --> 00:22:58,000 Speaker 1: time and they were doing the best they could with 348 00:22:58,080 --> 00:23:01,960 Speaker 1: small sample sizes. But it turned out the theory of 349 00:23:02,080 --> 00:23:05,840 Speaker 1: widened sulci was scientifically unsound.
Remember how I 350 00:23:06,000 --> 00:23:09,040 Speaker 1: mentioned that the judge went back and forth several times 351 00:23:09,440 --> 00:23:12,720 Speaker 1: about the issue of whether to accept Hinckley's CAT scan 352 00:23:12,840 --> 00:23:16,000 Speaker 1: into the courtroom? That's exactly the right thing that should 353 00:23:16,040 --> 00:23:19,240 Speaker 1: have happened. Not all claims are going to be correct 354 00:23:19,320 --> 00:23:24,000 Speaker 1: just because a scientist says so. Despite best efforts, science 355 00:23:24,080 --> 00:23:28,359 Speaker 1: can often be incorrect, and that is the importance of 356 00:23:28,440 --> 00:23:32,360 Speaker 1: the scientific method. It's always knocking down its own walls. 357 00:23:33,440 --> 00:23:38,320 Speaker 1: So what is a court to do about all this? Well, 358 00:23:38,800 --> 00:23:41,639 Speaker 1: let's say that someone wants to introduce the Palo Alto 359 00:23:41,720 --> 00:23:46,400 Speaker 1: three thousand into a court case, and you are the judge. 360 00:23:47,040 --> 00:23:49,240 Speaker 1: You have expertise in the legal system, but you don't 361 00:23:49,240 --> 00:23:53,840 Speaker 1: know the details of what's possible in neuroscience and large 362 00:23:53,920 --> 00:23:58,000 Speaker 1: language models, and you have questions about whether this technology 363 00:23:58,080 --> 00:24:02,040 Speaker 1: should be admitted, questions about whether it can accurately read 364 00:24:02,119 --> 00:24:05,760 Speaker 1: people's thoughts. So how do you decide whether it should 365 00:24:05,920 --> 00:24:22,040 Speaker 1: or should not be admitted? So let's step back to 366 00:24:22,119 --> 00:24:25,560 Speaker 1: nineteen twenty three.
There was a man named mister Frye 367 00:24:25,760 --> 00:24:29,320 Speaker 1: who said that he had developed a lie detection technology 368 00:24:29,800 --> 00:24:32,800 Speaker 1: and it relied on a measure of your blood pressure, 369 00:24:32,920 --> 00:24:36,480 Speaker 1: and he wanted to introduce this into a court case 370 00:24:36,960 --> 00:24:38,960 Speaker 1: the way you might want to get the Palo Alto 371 00:24:38,960 --> 00:24:41,600 Speaker 1: three thousand into a case. But it turned out that 372 00:24:41,640 --> 00:24:46,240 Speaker 1: mister Frye's claims were not widely accepted by anyone else 373 00:24:46,320 --> 00:24:50,800 Speaker 1: in the scientific community, and so on those grounds, the 374 00:24:50,920 --> 00:24:55,000 Speaker 1: court decided not to admit it into the courtroom. What 375 00:24:55,080 --> 00:24:59,720 Speaker 1: they said was, look, we'll accept expert testimony that comes 376 00:24:59,760 --> 00:25:04,119 Speaker 1: from well recognized science, but if there's some new technology, 377 00:25:04,200 --> 00:25:07,680 Speaker 1: it has to be sufficiently established so that it's gained 378 00:25:08,400 --> 00:25:12,560 Speaker 1: general acceptance in the field in which it belongs. In 379 00:25:12,600 --> 00:25:16,720 Speaker 1: other words, if other experts in the field don't believe 380 00:25:16,800 --> 00:25:20,520 Speaker 1: that mister Frye's systolic blood pressure measurement is actually good at 381 00:25:20,520 --> 00:25:24,720 Speaker 1: detecting lies, then you can't admit it as evidence in 382 00:25:24,720 --> 00:25:27,560 Speaker 1: the court. And that case set the bar for what 383 00:25:27,720 --> 00:25:31,320 Speaker 1: came to be known as the Frye standard, which is 384 00:25:31,320 --> 00:25:35,360 Speaker 1: that technologies need to be generally accepted by other experts 385 00:25:35,400 --> 00:25:38,879 Speaker 1: in the field before they can be admitted into the courtroom.
386 00:25:39,119 --> 00:25:41,919 Speaker 1: So under the Frye standard, the court would work to 387 00:25:42,000 --> 00:25:46,320 Speaker 1: determine whether the Palo Alto three thousand has met the 388 00:25:46,520 --> 00:25:51,520 Speaker 1: general acceptance of the scientific community. If science experts around 389 00:25:51,520 --> 00:25:54,359 Speaker 1: the world say, I've never heard of this Palo Alto 390 00:25:54,400 --> 00:25:56,960 Speaker 1: three thousand, I don't think that it can actually work, 391 00:25:57,359 --> 00:26:01,639 Speaker 1: then you, as the judge, can preclude it from admissibility. 392 00:26:02,040 --> 00:26:05,800 Speaker 1: So the court solves the problem by deferring to the 393 00:26:05,840 --> 00:26:09,680 Speaker 1: expertise of other people in the field. But this isn't 394 00:26:09,720 --> 00:26:12,920 Speaker 1: the only way to make that decision. The Frye standard 395 00:26:13,160 --> 00:26:16,320 Speaker 1: still is the rule in about half the states in America, 396 00:26:16,600 --> 00:26:20,439 Speaker 1: but the rest use a different rule to decide whether 397 00:26:20,600 --> 00:26:25,119 Speaker 1: evidence should be admitted, and this is called the Daubert standard. 398 00:26:25,520 --> 00:26:27,880 Speaker 1: So in nineteen ninety three there was a lawsuit from 399 00:26:27,880 --> 00:26:31,960 Speaker 1: a man named Jason Daubert. He was born with severe 400 00:26:32,359 --> 00:26:36,600 Speaker 1: birth defects and his parents brought suit against Merrell Dow, 401 00:26:36,760 --> 00:26:41,200 Speaker 1: the pharmaceutical company, and they said these severe birth defects 402 00:26:41,200 --> 00:26:43,919 Speaker 1: were caused by the medication that the mother was on, 403 00:26:44,400 --> 00:26:49,520 Speaker 1: called Bendectin.
So the pharmaceutical company said the birth defects 404 00:26:49,560 --> 00:26:51,960 Speaker 1: were not caused by this medication, and it went to 405 00:26:52,080 --> 00:26:57,320 Speaker 1: federal court and Daubert said, look, here are animal studies 406 00:26:57,480 --> 00:27:00,760 Speaker 1: showing that this drug is related to birth defects. And 407 00:27:01,000 --> 00:27:04,720 Speaker 1: the pharmaceutical company's expert witnesses got up and said, look, 408 00:27:04,800 --> 00:27:07,600 Speaker 1: this is not generally accepted in the field because these 409 00:27:07,640 --> 00:27:11,600 Speaker 1: are just animal studies and there's no conclusive evidence that 410 00:27:11,760 --> 00:27:15,320 Speaker 1: shows the link between these and humans. So if you're 411 00:27:15,400 --> 00:27:18,480 Speaker 1: the judge, how do you know how to arbitrate this? 412 00:27:18,920 --> 00:27:22,399 Speaker 1: It's difficult, right? Here's some science from the laboratories and 413 00:27:22,480 --> 00:27:25,560 Speaker 1: here's the pharmaceutical company saying it's not generally accepted in 414 00:27:25,560 --> 00:27:28,520 Speaker 1: the field that this causes birth defects. So what do 415 00:27:28,600 --> 00:27:32,400 Speaker 1: you do? Well, what happened is the case was decided 416 00:27:32,400 --> 00:27:35,359 Speaker 1: in favor of the pharmaceutical company. So Daubert took it 417 00:27:35,400 --> 00:27:38,119 Speaker 1: on appeal to the Ninth Circuit and the Ninth Circuit 418 00:27:38,200 --> 00:27:41,920 Speaker 1: judges also awarded this to the pharmaceutical company.
So Daubert 419 00:27:41,960 --> 00:27:44,760 Speaker 1: brought this case to the Supreme Court, and the Supreme 420 00:27:44,840 --> 00:27:47,440 Speaker 1: Court analyzed this carefully, and what came out of this 421 00:27:48,080 --> 00:27:52,360 Speaker 1: was a new standard for when evidence should be admissible, 422 00:27:52,400 --> 00:27:56,040 Speaker 1: and that's known as the Daubert standard, and the Daubert 423 00:27:56,080 --> 00:28:01,080 Speaker 1: standard says, look, you accept expert testimony about, for example, 424 00:28:01,119 --> 00:28:04,040 Speaker 1: these lab rat studies if it will help the jury to 425 00:28:04,119 --> 00:28:08,359 Speaker 1: understand the evidence better or determine a fact in issue. 426 00:28:08,600 --> 00:28:12,720 Speaker 1: In other words, it doesn't demand general acceptance in the community. 427 00:28:13,080 --> 00:28:15,960 Speaker 1: Under the Daubert standard, the key is just whether some 428 00:28:16,119 --> 00:28:20,320 Speaker 1: piece of evidence is relevant and reliable. Now, the key 429 00:28:20,880 --> 00:28:25,400 Speaker 1: is that the Frye standard made the scientific community the gatekeeper, 430 00:28:25,960 --> 00:28:29,439 Speaker 1: but the Daubert standard makes the judge the gatekeeper. The 431 00:28:29,600 --> 00:28:33,159 Speaker 1: judge gets to say from the beginning that they'll evaluate 432 00:28:33,240 --> 00:28:37,160 Speaker 1: this and ask is this evidence relevant and reliable? Does 433 00:28:37,160 --> 00:28:40,960 Speaker 1: it pass my bar for that? So, regarding this hypothetical 434 00:28:41,120 --> 00:28:45,280 Speaker 1: Palo Alto three thousand, the judge might ask has the 435 00:28:45,320 --> 00:28:48,720 Speaker 1: technique been tested in actual field conditions as opposed to 436 00:28:48,840 --> 00:28:51,800 Speaker 1: just in a laboratory.
Have there been any papers on 437 00:28:51,840 --> 00:28:54,640 Speaker 1: the Palo Alto three thousand that were published in peer 438 00:28:54,680 --> 00:28:59,160 Speaker 1: reviewed journals? What is the rate of error? Do standards 439 00:28:59,200 --> 00:29:02,200 Speaker 1: exist for controlling the operation of the machine, and so on? 440 00:29:02,480 --> 00:29:05,440 Speaker 1: These are often difficult questions. It's not always easy for 441 00:29:05,560 --> 00:29:08,600 Speaker 1: a judge to make a decision about whether or not 442 00:29:08,680 --> 00:29:12,000 Speaker 1: to accept a new technology. But this gives a pathway 443 00:29:12,240 --> 00:29:16,440 Speaker 1: where the judge is the gatekeeper. So let's imagine for 444 00:29:16,480 --> 00:29:20,320 Speaker 1: a moment that the Palo Alto three thousand passes the 445 00:29:20,360 --> 00:29:24,560 Speaker 1: standard for admissibility. Is there any reason why the technology 446 00:29:25,000 --> 00:29:29,120 Speaker 1: might still be excluded from the courtroom? There is one reason. 447 00:29:29,560 --> 00:29:33,320 Speaker 1: Let's say that you're the defense lawyer and you say, Gosh, 448 00:29:33,360 --> 00:29:37,400 Speaker 1: this thing is so stunning that it's going to prejudice 449 00:29:37,560 --> 00:29:40,360 Speaker 1: the jury because they're going to look at this fancy technology, 450 00:29:40,960 --> 00:29:45,040 Speaker 1: and even in the absence of really good evidence, they'll say, Wow, 451 00:29:45,320 --> 00:29:48,040 Speaker 1: this guy seems guilty. Let's send him to the electric 452 00:29:48,120 --> 00:29:52,800 Speaker 1: chair without considering the other points.
So to prevent that 453 00:29:52,880 --> 00:29:56,480 Speaker 1: from happening, there's a special rule called Federal Rules of 454 00:29:56,520 --> 00:30:00,040 Speaker 1: Evidence four oh three, and this just says you 455 00:30:00,040 --> 00:30:03,880 Speaker 1: should exclude evidence if what you can learn from it 456 00:30:03,920 --> 00:30:08,920 Speaker 1: is substantially outweighed by the risk of undue prejudice. In 457 00:30:08,960 --> 00:30:13,160 Speaker 1: other words, does it sway the jurors more than it should? 458 00:30:13,640 --> 00:30:15,760 Speaker 1: So what you'll see in courtrooms all the time is 459 00:30:15,800 --> 00:30:18,520 Speaker 1: that if a lawyer tries to exclude a piece of 460 00:30:18,560 --> 00:30:22,240 Speaker 1: evidence from being admitted based on let's say a Daubert objection, 461 00:30:22,600 --> 00:30:25,680 Speaker 1: but the evidence gets past that, then the lawyer is 462 00:30:25,680 --> 00:30:28,440 Speaker 1: going to take a second bite at the apple by 463 00:30:28,560 --> 00:30:33,040 Speaker 1: calling on Federal Rules of Evidence four oh three, saying, look, 464 00:30:33,160 --> 00:30:36,560 Speaker 1: even if this is relevant and reliable, it's going to 465 00:30:36,640 --> 00:30:39,960 Speaker 1: have too much sway on the jury. So why is 466 00:30:40,000 --> 00:30:44,160 Speaker 1: this an issue? Are there technologies that have undue sway 467 00:30:44,480 --> 00:30:48,280 Speaker 1: on jurors? Is that a concern? It is. And this 468 00:30:48,360 --> 00:30:52,520 Speaker 1: brings us back to fMRI. In a court of law 469 00:30:52,960 --> 00:30:57,320 Speaker 1: where jurors are your neighbors and your community and probably 470 00:30:57,320 --> 00:31:00,400 Speaker 1: not experts in neuroscience, a lot of people will be 471 00:31:00,600 --> 00:31:05,000 Speaker 1: swayed by a colorful brain image.
They're going to put 472 00:31:05,040 --> 00:31:07,960 Speaker 1: a higher weight on this than maybe they should, and 473 00:31:08,160 --> 00:31:11,760 Speaker 1: possibly at the cost of not weighing this evidence appropriately 474 00:31:12,160 --> 00:31:14,840 Speaker 1: in the context of the whole case. And this is 475 00:31:14,920 --> 00:31:18,200 Speaker 1: part of the concern that some legal scholars have, and 476 00:31:18,240 --> 00:31:21,920 Speaker 1: this has come to be known as the CSI effect. 477 00:31:22,040 --> 00:31:25,320 Speaker 1: So you remember the television show CSI. This stood for 478 00:31:25,680 --> 00:31:29,280 Speaker 1: Crime Scene Investigation, and it's a television drama about a 479 00:31:29,320 --> 00:31:33,560 Speaker 1: team of forensic scientists and detectives in Las Vegas who 480 00:31:33,680 --> 00:31:38,600 Speaker 1: use cutting edge scientific techniques to solve murders. So they 481 00:31:38,880 --> 00:31:42,640 Speaker 1: go around each week and meticulously gather and analyze evidence 482 00:31:42,640 --> 00:31:47,200 Speaker 1: from crime scenes and each episode features a complex case 483 00:31:47,520 --> 00:31:50,440 Speaker 1: with an intricate puzzle and the CSI team has to 484 00:31:50,520 --> 00:31:54,360 Speaker 1: solve this to bring the criminals to justice. 
Well, the 485 00:31:54,520 --> 00:31:59,480 Speaker 1: idea with the real life CSI effect is that jurors 486 00:31:59,520 --> 00:32:03,040 Speaker 1: come to expect what they've seen on TV in terms 487 00:32:03,080 --> 00:32:07,320 Speaker 1: of magical machinery that does something like you hit a 488 00:32:07,320 --> 00:32:10,800 Speaker 1: button to enhance the picture and then the computer enhances it, 489 00:32:10,840 --> 00:32:13,840 Speaker 1: and they see everything with clarity, where the plot twist 490 00:32:14,160 --> 00:32:18,240 Speaker 1: requires that the investigator pull out some magical technology that 491 00:32:18,280 --> 00:32:21,640 Speaker 1: suddenly solves the crime, or looking at the pedophile's brain 492 00:32:21,760 --> 00:32:24,680 Speaker 1: with neuroimaging and knowing whether he did the crime or not. 493 00:32:25,400 --> 00:32:28,080 Speaker 1: So jurors have come to expect this sort of thing 494 00:32:28,560 --> 00:32:31,200 Speaker 1: because you don't spend all your time in a courtroom 495 00:32:31,240 --> 00:32:34,280 Speaker 1: if you're not a lawyer, and something like the television 496 00:32:34,280 --> 00:32:37,840 Speaker 1: show CSI is their only window into that world. The 497 00:32:38,040 --> 00:32:40,640 Speaker 1: problem is that it often turns out to be a 498 00:32:40,840 --> 00:32:44,480 Speaker 1: false window, and when researchers do studies on this, they 499 00:32:44,520 --> 00:32:50,040 Speaker 1: generally find that jurors see neuroimaging as the truth of 500 00:32:50,120 --> 00:32:53,840 Speaker 1: the matter asserted. 
So we just spent a minute 501 00:32:53,960 --> 00:32:56,959 Speaker 1: looking at the claim that you can measure pedophilia and 502 00:32:57,040 --> 00:33:00,280 Speaker 1: we noted that the brain signals might represent that you're 503 00:33:00,280 --> 00:33:03,840 Speaker 1: a pedophile, or it might represent stress or anxiety, or 504 00:33:03,880 --> 00:33:06,640 Speaker 1: disgust or shame or averting the eyes or all kinds 505 00:33:06,640 --> 00:33:10,360 Speaker 1: of things. But that kind of nuanced analysis doesn't usually 506 00:33:10,400 --> 00:33:14,560 Speaker 1: get done, and so neuroimaging often comes to be interpreted 507 00:33:14,600 --> 00:33:17,960 Speaker 1: by the jury as the truth of the matter asserted. 508 00:33:18,360 --> 00:33:22,720 Speaker 1: This is what scholars sometimes call the fallacy of neurorealism, 509 00:33:22,760 --> 00:33:25,840 Speaker 1: and the fallacy is just that what you see in 510 00:33:25,880 --> 00:33:30,160 Speaker 1: these pretty false color images is the truth. In other words, 511 00:33:30,160 --> 00:33:34,400 Speaker 1: somebody thinks, oh, you're capturing the moment of pedophilia in 512 00:33:34,440 --> 00:33:37,600 Speaker 1: its raw form there, whereas, of course, the truth is 513 00:33:37,640 --> 00:33:43,360 Speaker 1: that fMRI signals are not direct proof of the experience itself. 514 00:33:43,760 --> 00:33:47,640 Speaker 1: As a side note, these questions of bringing visual evidence 515 00:33:47,680 --> 00:33:50,280 Speaker 1: into the courtroom, they're not unique to brain imaging. They've 516 00:33:50,320 --> 00:33:52,920 Speaker 1: been around for a long time. It goes back at 517 00:33:53,000 --> 00:33:57,080 Speaker 1: least to X rays.
So when X rays got introduced 518 00:33:57,120 --> 00:34:00,520 Speaker 1: in the eighteen nineties, they immediately started showing up in 519 00:34:00,640 --> 00:34:04,160 Speaker 1: court and everybody was absolutely blown away by the idea 520 00:34:04,240 --> 00:34:07,840 Speaker 1: of being able to see inside of a body. It's 521 00:34:07,880 --> 00:34:11,239 Speaker 1: like magic. So what happened over a century ago is 522 00:34:11,239 --> 00:34:13,719 Speaker 1: people asked this question of can we use this as 523 00:34:13,880 --> 00:34:17,320 Speaker 1: evidence in court? And the judge said at the time, 524 00:34:17,800 --> 00:34:21,560 Speaker 1: as long as it was scientifically reliable, it could be introduced. 525 00:34:21,800 --> 00:34:25,520 Speaker 1: But the same questions about influence on the jury came up, 526 00:34:25,640 --> 00:34:30,239 Speaker 1: because there's a real power to seeing something. And of 527 00:34:30,280 --> 00:34:33,439 Speaker 1: course what we have currently with brain imaging is even 528 00:34:33,480 --> 00:34:36,920 Speaker 1: a deeper issue because it touches on all our notions 529 00:34:37,239 --> 00:34:41,160 Speaker 1: of being human. For example, I saw a cover of 530 00:34:41,360 --> 00:34:44,680 Speaker 1: Time magazine a while ago and the title read 531 00:34:45,360 --> 00:34:49,240 Speaker 1: What Makes Us Good or Evil? And the cover image 532 00:34:49,280 --> 00:34:52,560 Speaker 1: was a huge picture of a brain scan, and there 533 00:34:52,600 --> 00:34:56,120 Speaker 1: was a little picture of Mahatma Gandhi with a pointer 534 00:34:56,600 --> 00:34:58,239 Speaker 1: to a part of the brain. And there was a 535 00:34:58,239 --> 00:35:01,600 Speaker 1: little picture of Adolf Hitler with a pointer to a 536 00:35:01,680 --> 00:35:04,359 Speaker 1: different part of the brain.
And in case you haven't 537 00:35:04,360 --> 00:35:06,359 Speaker 1: heard my other episodes on this, I want to make 538 00:35:06,360 --> 00:35:09,040 Speaker 1: it clear there is no such thing. You can't measure 539 00:35:09,200 --> 00:35:12,920 Speaker 1: some spot in the brain to determine whether someone is 540 00:35:13,400 --> 00:35:17,120 Speaker 1: good or evil. And by the way, Friedrich Nietzsche wrote 541 00:35:17,120 --> 00:35:20,719 Speaker 1: about this over a century ago: the words good and 542 00:35:20,920 --> 00:35:26,120 Speaker 1: evil don't even represent something fundamental, but instead these words 543 00:35:26,200 --> 00:35:29,600 Speaker 1: end up getting defined by your moment in time. What 544 00:35:29,800 --> 00:35:33,160 Speaker 1: is good right now may be seen as evil in 545 00:35:33,239 --> 00:35:37,839 Speaker 1: a century. These terms are defined by your culture. What 546 00:35:37,880 --> 00:35:40,400 Speaker 1: you think is good might be seen as sacrilege by 547 00:35:40,440 --> 00:35:44,160 Speaker 1: another group. So the idea that you could just measure 548 00:35:44,239 --> 00:35:46,520 Speaker 1: something in the brain and say whether the person is 549 00:35:46,600 --> 00:35:51,239 Speaker 1: good or evil really makes no sense. However, millions of 550 00:35:51,280 --> 00:35:55,000 Speaker 1: people see this kind of Time magazine cover, and this 551 00:35:55,200 --> 00:35:59,560 Speaker 1: is why legal scholars worry that brain images could be 552 00:35:59,600 --> 00:36:03,200 Speaker 1: persuasive past the point that they should be. In 553 00:36:03,239 --> 00:36:08,440 Speaker 1: the legal argot, this is known as something having undue influence. 554 00:36:08,600 --> 00:36:13,680 Speaker 1: Brain images are influential because they take some abstract issue 555 00:36:13,760 --> 00:36:17,640 Speaker 1: like evil intent and seem to nail it down to 556 00:36:17,680 --> 00:36:22,160 Speaker 1: the physical.
So this is why something like Federal Rules 557 00:36:22,160 --> 00:36:25,960 Speaker 1: of Evidence four oh three plays an important role in 558 00:36:26,120 --> 00:36:31,680 Speaker 1: asking whether something has undue influence, whether it sways people 559 00:36:31,920 --> 00:36:35,240 Speaker 1: more than it should. Now, at the extreme, some people 560 00:36:35,280 --> 00:36:39,160 Speaker 1: say functional brain images should never be allowed in the 561 00:36:39,200 --> 00:36:42,840 Speaker 1: courtroom because of their influence. One solution that a colleague 562 00:36:42,840 --> 00:36:46,920 Speaker 1: of mine suggested is that you ban the visual aspects 563 00:36:46,960 --> 00:36:49,600 Speaker 1: of brain images from the courtroom, so you just have 564 00:36:49,719 --> 00:36:52,239 Speaker 1: expert witnesses come on to the stand and tell you 565 00:36:52,680 --> 00:36:54,919 Speaker 1: what they think is going on as best they can. 566 00:36:55,120 --> 00:36:58,680 Speaker 1: But they're verbally presenting the results, not showing them. But 567 00:36:58,719 --> 00:37:01,600 Speaker 1: these are tough issues, right, because you can show a 568 00:37:01,640 --> 00:37:05,560 Speaker 1: gory photograph from a crime scene, which can also prejudice 569 00:37:05,560 --> 00:37:09,640 Speaker 1: an entire courtroom. Or you can show a reenactment of 570 00:37:09,680 --> 00:37:12,520 Speaker 1: a murder, but if you can't show a brain scan, 571 00:37:13,160 --> 00:37:15,799 Speaker 1: that seems like maybe a double standard. So should you 572 00:37:15,880 --> 00:37:20,680 Speaker 1: rule out all visual images or allow everything? And if 573 00:37:20,680 --> 00:37:25,360 Speaker 1: you heard episode nineteen, I talked about eyewitness testimony and 574 00:37:25,400 --> 00:37:29,520 Speaker 1: how massively swaying that is to jurors.
You can have 575 00:37:29,640 --> 00:37:33,120 Speaker 1: all sorts of expert scientific testimony, but then you have 576 00:37:33,239 --> 00:37:36,000 Speaker 1: the person get up on the stand with tears and 577 00:37:36,040 --> 00:37:38,879 Speaker 1: a cracking voice and say, I don't care what they say. 578 00:37:38,960 --> 00:37:41,719 Speaker 1: I know that's the guy. And we're all moved and 579 00:37:41,800 --> 00:37:46,560 Speaker 1: influenced by that, even though eyewitness testimony is so deeply fallible. 580 00:37:47,239 --> 00:37:49,160 Speaker 1: So this is all just to say that the question 581 00:37:49,280 --> 00:37:53,880 Speaker 1: of undue influence always has to be asked: compared to 582 00:37:54,000 --> 00:37:59,000 Speaker 1: what? Compared to other technologies, compared to gory photographs of 583 00:37:59,040 --> 00:38:02,640 Speaker 1: the crime scene, compared to acting out a rape scene 584 00:38:02,680 --> 00:38:07,000 Speaker 1: or a murder scene, do those unduly sway a jury? 585 00:38:07,440 --> 00:38:08,960 Speaker 1: So I hope what you see is that these are 586 00:38:09,080 --> 00:38:13,239 Speaker 1: tough issues, perhaps tougher than you had intuited at the 587 00:38:13,280 --> 00:38:17,640 Speaker 1: beginning of the episode. So let's wrap up. We often 588 00:38:17,960 --> 00:38:20,799 Speaker 1: think that when a new technology comes along, like a 589 00:38:20,840 --> 00:38:24,759 Speaker 1: new brain technology, it always gives useful information, and we 590 00:38:24,880 --> 00:38:28,400 Speaker 1: might assume that courts start leveraging it right away. But 591 00:38:28,440 --> 00:38:32,440 Speaker 1: there are complexities around this. For example, in an earlier episode, 592 00:38:32,440 --> 00:38:34,439 Speaker 1: I talked about lie detection. How do you know when 593 00:38:34,480 --> 00:38:38,040 Speaker 1: somebody is actually lying?
There are lots of technologies that 594 00:38:38,120 --> 00:38:41,600 Speaker 1: try to measure some version of this, but nothing can 595 00:38:41,640 --> 00:38:44,719 Speaker 1: simply tell you the answer because the whole concept of 596 00:38:44,760 --> 00:38:48,840 Speaker 1: a lie is complex. Sometimes you might be telling the 597 00:38:48,920 --> 00:38:53,040 Speaker 1: truth but you're factually incorrect, for example, because you're honestly 598 00:38:53,440 --> 00:38:57,319 Speaker 1: misremembering how something went, but you believe your memory. Or 599 00:38:57,360 --> 00:39:01,840 Speaker 1: for someone else, they might have no associated stress response 600 00:39:01,920 --> 00:39:04,279 Speaker 1: because they just don't care that they're lying. So when 601 00:39:04,320 --> 00:39:06,920 Speaker 1: somebody comes to the courts and says, hey, I have 602 00:39:06,960 --> 00:39:11,800 Speaker 1: a new lie detection technology, the judge can't just say great, 603 00:39:12,000 --> 00:39:14,239 Speaker 1: bring it to the case, because the judge first has 604 00:39:14,320 --> 00:39:18,920 Speaker 1: to decide whether it should be admitted or instead, whether 605 00:39:19,040 --> 00:39:23,880 Speaker 1: its promise will sway the jurors more than its value. 606 00:39:24,160 --> 00:39:27,920 Speaker 1: We're all enthusiastic about the next stages of technology and 607 00:39:28,000 --> 00:39:31,279 Speaker 1: being able to make important measures about what's happening in 608 00:39:31,320 --> 00:39:34,560 Speaker 1: the brain. But the legal system has to be very 609 00:39:34,600 --> 00:39:39,080 Speaker 1: careful about this, whether by standards of general acceptance in 610 00:39:39,120 --> 00:39:43,400 Speaker 1: the scientific community or by the choice of the judge as gatekeeper.
611 00:39:44,080 --> 00:39:48,400 Speaker 1: Each new technology has to be weighed carefully for admissibility 612 00:39:48,440 --> 00:39:59,440 Speaker 1: every time before it can enter the esteemed halls of justice. 613 00:40:00,080 --> 00:40:03,600 Speaker 1: Visit eagleman dot com slash podcast for more information and to 614 00:40:03,680 --> 00:40:07,719 Speaker 1: find further reading. Send me an email at podcast at 615 00:40:07,800 --> 00:40:11,799 Speaker 1: eagleman dot com with questions or discussion, and check out 616 00:40:11,840 --> 00:40:15,520 Speaker 1: and subscribe to Inner Cosmos on YouTube for videos of 617 00:40:15,560 --> 00:40:19,120 Speaker 1: each episode and to leave comments. Until next time, I'm 618 00:40:19,200 --> 00:40:22,080 Speaker 1: David Eagleman, and this is Inner Cosmos.