At least in science fiction, we talk a lot about the idea of pre-crime, mind reading, and compelled testimony. Can we put someone in jail because a brain scan suggests they have a higher potential for committing a crime? Can we force witnesses to tell the truth, the whole truth, and nothing but the truth by jacking into their brains and reading their memories?
The answers to these questions are still very firm "maybes," which might make you think that neurology has played a very limited role in the courtroom. And while it's true that we haven't seen much that feels like science fiction hit real-life trials, it's also true that neuroscience and brain imaging have been used, quite literally, to decide whether people should live or die.
Brain scans were used as evidence in 800 cases between 1992 and 2012, according to a review published by Fordham University School of Law Professor Deborah Denno. Between 2005 and 2012, there were 1,800 cases in which judges referenced evidence from neuroscience or behavioral genetics (a similarly emerging field) in their opinions, according to Duke Law's Nita Farahany, who is also a member of President Obama's Commission for the Study of Bioethical Issues.
In 2002, Peter Chiesa caught two of his neighbors stealing firewood from his California farm. Chiesa calmly called the police and told them that he was going to approach the neighbors and shoot them. He killed them both, but was found guilty of the lesser crime of second-degree murder (rather than capital murder), primarily because three different brain scans showed that he had brain damage.
Attorneys have attempted to use brain scans to defend their clients 514 times
"Computer-assisted tomography (CAT), positron emission tomography (PET), and single-photon emission computerized tomography (SPECT) scans revealed damage in his prefrontal cortex, temporal lobes, and cerebellum—damage that experts claimed would affect his impulse control and temper," Obama's commission wrote in a report published in March. "His doctors opined that, although Mr. Chiesa was aware of what he was doing when he shot the neighbors, his conduct was driven primarily by impulse, not choice. Despite evident planning—notifying the police of his plan, driving his truck without incident, and aiming the gun at two separate individuals—the jury convicted Mr. Chiesa of the lesser offense of second-degree murder instead of first-degree, premeditated murder."
There have been hundreds of cases just like Chiesa's. According to Denno's paper, attorneys have attempted to use brain scans to defend their clients 514 times, most commonly in first-degree murder cases. The defense has become so common that, on appeal, defendants regularly claim they received "ineffective counsel" at their original trial if their attorney didn't put brain scans into evidence.
"Some defense attorneys decided to omit potentially mitigating evidence because they thought it may bolster the perception of a client's future dangerousness"
So far, it's most common to use brain scans as a defense—but can't the same brain scans that show a person has brain damage be used to show that they're a danger to society? That they should be locked away forever or are likely to continue committing crimes?
Interestingly, that's been a somewhat rare occurrence. Prosecutors have shied away from using brain scans to convince a judge or jury that defendants are dangerous, perhaps because there are various protections for those shown to be not capable of standing trial or otherwise mentally impaired. Denno said that prosecutors attempted to prove future dangerousness in 14 death penalty cases (it has not been tried in any other type of case) and were successful in 10.
"In those rare instances when prosecutors did utilize neuroscience evidence to suggest a defendant's propensity to commit crimes, they typically did so only by building upon the evidence first introduced by a defense expert," she wrote. "In contrast, some defense attorneys decided to omit potentially mitigating evidence because they thought it may bolster the perception of a client's future dangerousness."
But her data suggests that, really, brain scans are just another piece of evidence. Judges don't seem to weigh them any more heavily than other facts in the case, and any fears about using brain scans by either the defense or the prosecution are somewhat unfounded.
"This finding controverts the popular image of neuroscience evidence as a double-edged sword—one that will either get defendants off the hook altogether or unfairly brand them as posing a future danger to society," she wrote. "To the contrary, my study indicates that neuroscience evidence is typically introduced for a well-established legal purpose—to provide fact-finders with more complete, reliable, and precise information when determining a defendant's fate."
"Do individuals have a right to mental privacy?"
A major reason judges seem to "get" neuroscience and its limitations is that there has been a concerted effort to inform them over the years. New York District Court Judge Jed Rakoff published "A Judge's Guide to Neuroscience," in which he got neurologists to explain both the future of brain scans in the courtroom and their limitations today. Rakoff and others have organized various seminars and classes to present some of this information.
Even though some of the more advanced stuff we imagine—the brain interrogations, for instance—hasn't been seen in courtrooms yet, we are certainly heading in that direction.
Neuroscientists can already do some pretty useful things, such as measure pain and perhaps detect lies using an fMRI, which measures blood flow to certain parts of the brain. The problem is that there is still some noise in these tests—they're already more effective than a polygraph, but they're more involved tests that have generally only been used in clinical settings. While we can tell broadly whether someone is lying or experiencing pain, fMRI patterns don't work for every person, every time.
"The legal profession would be more interested in an objective measure of a pain that occurred following an injury or that results from a disease process. There the goal would be to determine degree of harm to the plaintiff and the liability of a defendant," Howard Fields, a neurologist at the University of California San Francisco wrote in Rakoff's guide. "It turns out that this is, hypothetically, possible. Because we know where to look in the brain for activity induced by painful stimuli, we can question the subject about their current ongoing pain level."
"Empathy, guilt, and remorse are, after all, mental processes that are instantiated in the neural systems of the brain"
The use of antidepressants, anxiety meds, and a person's mood during the scan can all influence its results. Still, this is an area of study that is rapidly maturing, and Hank Greely, director of Stanford Law School's Program in Neuroscience and Society, told me that it's only a matter of time before fMRI scans are used to detect lies and determine pain levels at trial.
"The first thing we'll see is the pain detection, and then, I think there's a good chance—not 90 percent, but better than 50-50, that sometime in the next two decades we'll see it used for other forms of lie detection," he said. "We're in a bit of a lull right now with regards to neuroscience, but I suspect that will change—there hasn't really been the breakthrough science in legally relevant areas, but it's kind of early."
It's easy to see where these breakthroughs might be. As we learn more about how the brain works and about criminality, we may be able to answer questions about criminal responsibility, identify "psychopaths," and even tell whether someone feels bad about what they did.
"Empathy, guilt, and remorse are, after all, mental processes that are instantiated in the neural systems of the brain," Kent Kiehl, a neurologist at the University of New Mexico, wrote in Rakoff's guide. "To understand the symptoms of psychopathy, scientists need to develop methods to accurately quantify these latter neural systems that engender the symptoms under study (e.g., lack of guilt or remorse)."
And once we can do that, what happens? The Obama administration agrees that much more powerful and legally relevant neuroscience is coming, and that we must answer some important questions sooner rather than later.
"Do individuals have a right to mental privacy that safeguards them from being compelled to submit to EEG, fMRI, or other brain-based interrogations? Should eyewitnesses have their memories validated by neuroscience techniques?" it wrote in its March report. "Does the Fourth Amendment of the U.S. Constitution, which protects individuals from unreasonable searches and seizures by the government, safeguard individuals against such uses?"
Right now, our brains go on trial if defense attorneys think they can help us. But what happens when we don't really have a choice?
Jacked In is a series about brains and technology. Follow along here.