Wednesday, February 24, 2010

Shedding a little light on the reality of DNA evidence

This article is a must-read for all criminal defense attorneys and anyone interested in learning the truth about the limitations of DNA evidence.  Please, please read it.  There may be a quiz.

We have come to think of DNA as if it's the perfect evidence.  Infallible.  Incapable of manipulation.  Not subject to interpretation.  And so simple.  It's either a match or it isn't a match. 

All of these notions are wrong.  DNA evidence involves far more variation and nuance than the popular understanding allows.  This article does a great job of highlighting some of those shades of gray.  It also should provide defense attorneys (who have often been far too complacent in simply accepting what the state's experts tell us about DNA evidence) with some useful ideas for what we should be fighting.  There are lots of rich areas for litigation in the unexplored minefield of DNA evidence.  Sadly, though, to some extent the defense bar has also fallen under the spell of DNA, so we aren't taking full advantage of the opportunities for challenge that exist.  Combine that with the reluctance district courts commonly feel when we ask for expert funding or want to present some newfangled scientific theory of defense, and we've let some myths about DNA evidence persist, even as the realities are proving to be very different.

First, we need to distinguish between DNA results that include a suspect as a possible contributor and those that exclude a suspect.  Test results that exclude a suspect as a contributor are much more reliable.  False negatives are highly unlikely in DNA testing.  In an ideal DNA comparison, the analyst will look at 13 points of the crime scene evidence and compare them to 13 points on the suspect's DNA sample.  Those points are called loci.  If any one of those loci is different, then it's pretty clear the suspect is not the source of the crime scene evidence.  (Usually, in a non-match, most, if not all, of the loci won't match.)
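
For those who want the logic spelled out, here is a minimal sketch of that compare-and-exclude step.  The 13 locus names are the real CODIS core loci, but the profiles, the allele values, and the little comparison function are purely illustrative assumptions of mine, not how any actual lab software works.

```python
# A minimal illustrative sketch of the exclusion logic described above -- not lab software.
# The 13 locus names are the real CODIS core loci; the allele values are invented.

CODIS_LOCI = [
    "CSF1PO", "FGA", "TH01", "TPOX", "vWA", "D3S1358", "D5S818",
    "D7S820", "D8S1179", "D13S317", "D16S539", "D18S51", "D21S11",
]

def compare_profiles(evidence, suspect):
    """Exclude as soon as any locus differs; otherwise report an inclusion."""
    for locus in CODIS_LOCI:
        if evidence[locus] != suspect[locus]:
            return f"excluded (mismatch at {locus})"
    return "included: consistent at all 13 loci"

# Hypothetical profiles: identical everywhere except TH01, so the suspect is excluded.
evidence = {locus: frozenset({14, 17}) for locus in CODIS_LOCI}
suspect = dict(evidence)
suspect["TH01"] = frozenset({6, 9.3})

print(compare_profiles(evidence, suspect))   # -> excluded (mismatch at TH01)
```

The point of the sketch is simply that one mismatched locus is enough to exclude.  An inclusion, by contrast, only says the profiles are consistent at the loci that could actually be compared, which is where the trouble starts.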

Think of it like the tire marks in "My Cousin Vinny."  The tire marks could not have been made by a '64 Buick Skylark because it had a solid rear axle and did not have positraction, the car equivalents of loci.  That car can be categorically excluded as a possible contributor to the tire marks.  Heck, it could have been excluded on the basis of the positraction alone.  The rear axle was just gravy.

A result that includes a suspect as a possible contributor (otherwise referred to by the state as a "match") is not as black and white.  Like the '63 Pontiac Tempest that Sheriff Farley tracked down.  It did have all the necessary physical attributes to have left the tire marks.  Plus, there is other evidence that makes it even more likely that the particular car Farley testified about was, in fact, the car involved in the shooting.  Of course, it is theoretically possible that some other identical '63 Tempest was the car actually involved in the crime, but it's pretty darn unlikely.

That's a very simplified analogy, but I think it helps to highlight the preliminary point about the difference between exclusionary results and "match" results.

In a perfect world, a "match" would involve the comparison of the full 13 loci of the crime scene evidence against the full 13 loci of a known suspect.  In that ideal world, a match is pretty strong and powerful evidence.  

The real problem with DNA matches is that most crime scene evidence isn't that perfect world evidence of 13 loci.  Most samples yield only partial profiles, with 9 or 10 loci, or even fewer.  In the case highlighted in the Washington Monthly article, only 5 1/2 loci were available.  In addition, a lot of evidence is a mixture, meaning it contains DNA from two or more people, which makes it harder to separate the profiles and decide which markers belong to which contributor.  It is with these cases that DNA evidence begins to lose its power.  When the profiles are partial and when they include mixtures, declaring a "match" becomes a much more subjective process.  From the article:

In 2005, Peter Gill, then a researcher at the Forensic Science Service, which administers the national DNA database for the British police, told a conference of forensic scientists, "If you show ten colleagues a mixture, you will probably end up with ten different answers." Dan Krane, a molecular biologist at Wright State University and a leading critic of the government’s stance on DNA evidence, agrees. "There is a public perception that DNA profiles are black and white," he told me. "The reality is that easily in half of all cases—namely, those where the samples are mixed or degraded—there is the potential for subjectivity."

DNA matches aren't declared by computers just spitting out clear and uncontested results.

My biggest quibble with DNA evidence has always been the statistical analysis presented to the jury to show just how (un)likely it is that some random individual could match the evidence.  It's always some astronomical probability, like 1 in 1 quadrillion.  There aren't anywhere near a quadrillion people on the planet, so you can see how powerful that evidence is.  Sometimes, for a mixture or a partial profile, you might get the lab analyst to reduce the probability to something like 1 in a million, which still sounds pretty big to most people.  But the lab analysts testifying to these outlandish probabilities aren't statisticians or mathematicians at all.  They're biologists.  They usually don't know anything beyond the most rudimentary probability and statistics, if they even know that.  They just get these numbers from some big database created by the FBI; the analysts who actually testify in criminal cases can't defend how those numbers were derived.  But they're always really sure of them.
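
To see where numbers like that come from, here is the basic arithmetic, often called the product rule, in a few lines of Python.  The 1-in-10 per-locus figure is a made-up round number of mine; real calculations use population allele-frequency tables and adjustments this sketch ignores.

```python
from fractions import Fraction

# Hypothetical assumption: 1 person in 10 would match the evidence at any given locus.
per_locus = [Fraction(1, 10)] * 13

# Multiply the per-locus frequencies across all 13 loci (the "product rule").
full_profile_rmp = Fraction(1)
for f in per_locus:
    full_profile_rmp *= f
print(f"13-locus profile: 1 in {full_profile_rmp.denominator:,}")    # 1 in 10,000,000,000,000

# Do the same with only 9 loci, as in a partial profile.
partial_profile_rmp = Fraction(1)
for f in per_locus[:9]:
    partial_profile_rmp *= f
print(f"9-locus partial profile: 1 in {partial_profile_rmp.denominator:,}")   # 1 in 1,000,000,000
```

Multiply a handful of modest per-locus frequencies together and you get a number that sounds unanswerable; drop a few loci and it shrinks by orders of magnitude.  The analyst on the stand is reciting the output of that multiplication, not defending its inputs.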

We're starting to realize that those numbers probably can't be defended.  Really, read the article.  Over the past few years, researchers have stumbled upon a stunning secret about DNA: matches between partial profiles are a lot more common than we had realized.  In Arizona and Illinois, researchers found hundreds of random matches.  A detailed search of the Arizona DNA database found 122 pairs of profiles that matched at 9 loci.  It's far more common for two random strangers to share 9 DNA loci than the FBI statisticians ever dreamed.  Imagine how much more likely it is to find two people in the same database who share 5 or 6 or 7 loci.  So it's become a downright lie for DNA experts to stick to their "1 in a million" probabilities.  We know that the actual occurrence of random matches is much, much likelier than that.
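
The reason a big database turns up coincidences isn't mysterious; it's the birthday problem.  Even taking the official probabilities at face value, comparing every profile in a database against every other profile creates an enormous number of chances for a coincidental match.  The database size and per-pair probability below are hypothetical round numbers of my own, not the actual Arizona figures:

```python
from math import comb

database_size = 65_000                  # hypothetical number of offender profiles
p_pair_match_9_loci = 1 / 750_000_000   # hypothetical chance two strangers match at 9 loci

# Every profile gets compared against every other profile.
pairwise_comparisons = comb(database_size, 2)
expected_coincidences = pairwise_comparisons * p_pair_match_9_loci

print(f"{pairwise_comparisons:,} pairwise comparisons")                    # ~2.1 billion pairs
print(f"expected 9-locus coincidences: about {expected_coincidences:.0f}")
```

Even that naive arithmetic predicts some coincidental 9-locus matches in a modest database, and the Arizona search found far more than arithmetic like this would predict, which is exactly the problem.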

This is going to become a huge problem (if we choose to acknowledge it) as every legislative body in this nation appears hell-bent on building huge databases of DNA from convicted felons.  The idea behind these databases is that law enforcement can then plug in DNA from crime scenes, including old cold cases, and see if any hits come up.  But since a lot of crime scene evidence involves partial profiles, and since the purpose is to search for cold hits, it seems inevitable that random, incorrect matches will be made.  The probability of finding a false match is nowhere near 1 in a million.  In the case discussed in the article, the chance that the database search would turn up a coincidental match was about 1 in 3.  Makes finding guilt beyond a reasonable doubt a whole lot harder, doesn't it?
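
Here is the trawl arithmetic that turns a figure like 1 in a million into a figure like 1 in 3.  The numbers below are hypothetical stand-ins of mine, chosen only to show the order of magnitude; the actual figures from the case are in the article:

```python
# Hypothetical assumptions: a partial-profile random match probability and a database to trawl.
random_match_probability = 1 / 1_100_000
database_size = 340_000

# Chance that a database of unrelated, innocent people produces no coincidental hit,
# and therefore the chance that at least one of them does match by coincidence.
p_no_coincidental_hit = (1 - random_match_probability) ** database_size
p_at_least_one_hit = 1 - p_no_coincidental_hit

print(f"chance of at least one coincidental hit: {p_at_least_one_hit:.2f}")
# prints roughly 0.27 -- on the order of 1 in 3 or 4, nowhere near 1 in a million
```

The 1-in-a-million figure answers "what is the chance this particular innocent person would match?"  The trawl question is "what is the chance the search hits some innocent person?"  Those are very different numbers, and only the first one ever seems to reach the jury.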

As you'll learn when you read the article, though, the jury in that case never heard the 1 in 3 figure.  All they heard was 1 in a million.  And they didn't hear evidence about the random matches discovered in the Arizona DNA database.  Why?  Because the judge said all that evidence was irrelevant.  Which is code for "really, really bad for the state's case."  The judge had precedent he could cite to back up his refusal to let the defense challenge the DNA statistics.  In Kansas, we've had some difficulty challenging the statistical evidence, too.  Our Supreme Court has held that the statistical evidence does not implicate the Confrontation Clause, so we are not entitled to cross-examine the statisticians who actually programmed the probability software that every DNA biologist uses.  Courts seem reluctant to look too closely behind the curtain of DNA evidence.  Indeed, courts have historically been hesitant to let defense attorneys present experts who challenge what we all think we know about things like false confessions and eyewitness identifications.  There's no reason to think courts will be any less reluctant to let us challenge the assumptions we have made about DNA evidence.

Finally, as this article highlights, the potential problems with DNA evidence are further complicated by the fact that most crime labs in the country are under the direct control of law enforcement.  The FBI has started playing tough with states who allow defense counsel too much access to their DNA databases.  They don't want us poking around too much and investigating the truth of their scientific claims.  They don't want to do the hard work to get at the right results; they just want the quick, dirty, and easy answers, evidently without concern for whether those answers are the right ones.  Once again, when science and law enforcement come together, scientific integrity inevitably takes the back seat.

DNA can be an excellent tool in the criminal justice system but we have got to learn how to use it correctly.  We have got to acknowledge its pitfalls and limitations.  We, the defense bar, have to stop accepting it as smoking gun evidence.  We have got to develop the statistical analysis so we can truly challenge it in court.  We have got to find experts in probability that we can call to the stand to explain these things to juries and courts.  We have got to do more to expose the problems with partial profiles and mixtures.  We have got to fight against the criminal justice system's reluctance to get at the best evidence.  We have got to break the hold that DNA evidence has over us.  Otherwise, we'll never get to the truth of it.

3 comments:

Randall Hodgkinson said...

I argued (unsuccessfully) a long time ago that the statistics should be the subject of separate testimony and analysis in these cases. The chemists aren't statisticians, they are going by what somebody else told them regarding the import of the numbers.

S said...

I think math is something a lot of lawyers avoid (most of us aren't math majors), so it's not that surprising to me that arguments over the statistics, when they are made, don't get any traction. Perhaps it's time to start asking courts to hold Daubert hearings on the statistics. The more noise we make on this front, the more likely we are to get some court to finally hear what we're saying.

Anonymous said...

Great piece. Referenced it here:

http://www.bigbrotherwatch.org.uk/home/2010/02/dnas-dirty-little-secret-a-forensic-tool-renowned-for-exonerating-the-innocent-may-actually-be-putti.html.html
