
U.S. Government Lab Withheld Groundbreaking Study for 5 Years That Can Help Defendants Question the Reliability of Certain DNA Evidence

by Steve Horn

A study calling into question the reliability of DNA as smoking-gun evidence, given how easily DNA is transferred and how readily modern DNA-detection technology picks it up, should have been big news. Yet a U.S. government lab sat on the laboratory-based study results for nearly five years without publishing them in a peer-reviewed journal.

The paper, “NIST interlaboratory studies involving DNA mixtures (MIX05 and MIX13): Variation observed and lessons learned” was eventually published on August 1, 2018, in the journal Forensic Science International: Genetics. 

As the article’s title indicates, the study was conducted by the Applied Genetics Group of the National Institute of Standards and Technology (“NIST”), a unit of the U.S. Department of Commerce. Critics have complained that, had the article come out sooner, criminal defendants could have cited it in their cases, and it might have helped prevent wrongful convictions.

Today, this journal paper could send shockwaves through the U.S. legal system because it can now potentially be introduced as evidence under the landmark Daubert v. Merrell Dow Pharmaceuticals, Inc. precedent. Daubert sets standards for what scientific evidence is admissible during pretrial motions and trials, and one key factor, perhaps the most central one, is whether the research has been peer-reviewed.

It gives defendants perhaps another layer of armor with which to bolster their defense and to call into question the appearance of DNA at or near a crime scene as a seemingly smoking-gun piece of evidence, one treated as the gold standard within the criminal justice system and, especially, by jurors.

“In the 2005 NIST MIX05 study, 69 laboratories interpreted data in the form of electropherograms of two-person DNA mixtures representing four different mock sexual assault cases with different contributor ratios,” explained the authors of the study’s design. “In the 2013 NIST MIX13 study, 108 laboratories interpreted electropherogram data for five different case scenarios involving two, three, or four contributors, with some of the contributors potentially related.”

Essentially, the study revealed that the interlaboratory exercise with the most laboratories involved, MIX13, saw the greatest variation in DNA results. Much of that variation involved labs attributing DNA on an item left behind at a mock crime scene, a ski mask, to a person who had never touched it.

And variation is not something desired when DNA evidence is presented in criminal cases, because to date it has been treated as an essentially foolproof form of evidence standing above all other types of evidence. The crown jewel, if you will.

Case Five

The case study that piqued the most interest within the forensic science community for MIX13 was Case Five, which will likely become known as the ski-mask scenario both in the scientific world and among lay people.

“Several gang-related robberies have targeted multiple banks in the city. The robberies have typically involved two or three perpetrators. A ski mask was recovered in a trash can one block away from the latest bank robbery and is submitted for DNA testing,” the study explains of the case study set-up. “Evidence is a DNA profile developed from a ski mask recovered near a bank robbery scene. A confidential informant has implicated two suspects (References 5A and 5B) in at least three of the armed robberies. Police have obtained buccal swab references from the two suspects identified from the informant, and another known accomplice of the suspects.”

The result? It put someone at the scene of this hypothetical crime who was not even among the four people known to have been there in the first place. It is the nightmare scenario previously highlighted here in a cover story published by Criminal Legal News (See: CLN, Sept. 2018, p. 1).

“The fabric showed a mixture of touch DNA including four people, but due to its complexity, it initially appeared as a mixture of only two people,” explained the publication Forensic Magazine of the study. “The labs were given two of the four likely contributors, along with a fifth person. But that fifth person was not in the mixture, and had never touched the ski mask ... Seventy-four laboratories out of 108 got it wrong by including the fifth person in their interpretation.”

Indeed, only seven of the 108 laboratories got things “right” using the FBI’s methodology for interpreting DNA mixtures, known as CPI, or combined probability of inclusion. And even among those seven, Forensic Magazine explained, the labs gave differing reasons for reaching that result.

“Four of the laboratories cited a missing allele at a key location,” wrote the publication. “Two more, using data from the Identifiler Plus (a ThermoFisher PCR amplification kit), showed that the fifth person could not fit.”  

Or, in the lingo of a criminal case, MIX13 has shown that “reasonable doubt” can arise from a multitude of vantage points as to how DNA got to a particular place and why.

The study’s lead author, John Butler, spoke directly to the study’s demonstration that modern DNA technology can, indeed, place someone at the scene of a crime that he or she did not commit. The study, he says, was designed specifically to see whether that could in fact happen.

“The mixture itself was designed to not show too many alleles,” Butler told Forensic Magazine. “People would be tricked into thinking there are only two or three people there, instead of the four people that were really there. The way that it was designed was on purpose, to kind of help people realize that CPI can falsely include people—that was its purpose. And it demonstrated that really nicely.”
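
For readers unfamiliar with the statistic, the sketch below illustrates in simplified form how CPI can “include” someone who never touched an item. The loci, allele labels, and population frequencies are hypothetical, chosen only for illustration and not taken from the NIST study: under CPI, anyone whose alleles all appear somewhere in the observed mixture is counted as included, so when alleles drop out and the mixture looks simpler than it really is, a non-contributor can clear that bar purely by chance.

```python
# Simplified, illustrative sketch of the combined probability of inclusion (CPI).
# All loci, allele labels, and population frequencies here are hypothetical and
# are not taken from the NIST MIX05/MIX13 study.

# Alleles observed in a mock mixture at three loci. After drop-out, a mixture
# can show fewer alleles than its true number of contributors would suggest.
mixture = {
    "D8S1179": {"12", "13", "14"},
    "D21S11":  {"28", "30", "31"},
    "TH01":    {"6", "9.3"},
}

# Hypothetical population frequencies for the alleles observed in the mixture.
freqs = {
    "D8S1179": {"12": 0.15, "13": 0.30, "14": 0.20},
    "D21S11":  {"28": 0.16, "30": 0.25, "31": 0.10},
    "TH01":    {"6": 0.23, "9.3": 0.30},
}

# A hypothetical person who never touched the item, but whose alleles all
# happen to be present somewhere in the observed mixture.
non_contributor = {
    "D8S1179": ("13", "14"),
    "D21S11":  ("28", "30"),
    "TH01":    ("6", "6"),
}

def is_included(genotype, mixture):
    # CPI-style inclusion: a person is "included" if every allele he or she
    # carries appears among the mixture's observed alleles at every locus.
    return all(set(genotype[locus]) <= mixture[locus] for locus in mixture)

def combined_probability_of_inclusion(mixture, freqs):
    # CPI is the product, across loci, of the squared sum of the frequencies
    # of the observed alleles: the share of the population included by chance.
    cpi = 1.0
    for locus, alleles in mixture.items():
        cpi *= sum(freqs[locus][a] for a in alleles) ** 2
    return cpi

print("Non-contributor included?", is_included(non_contributor, mixture))   # True
print("CPI (chance-inclusion rate):",
      round(combined_probability_of_inclusion(mixture, freqs), 4))
```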

Carve Out Language?

Boise State University forensic biology and forensic science professor Greg Hampikian wrote in an opinion piece published by The New York Times on September 21, 2018, that he had first heard about the study years earlier in a PowerPoint presentation, leaving the forensic science community asking when, or indeed if, the results would see the light of day in the form of a published, peer-reviewed study.

“I first learned about the results of this study in 2014, at a talk by one of its authors. It was clear that crime labs were making mistakes, and I expected the results to be published quickly,” wrote Hampikian in The New York Times. “Peer-reviewed publication is important, because most judges won’t let you cite someone’s PowerPoint slide in your testimony. But years went by before the study was published, preventing lawyers from using the findings in court, and academics from citing the results in journal articles. If some of us had not complained publicly, it may not ever have been published.”

Worse, says Hampikian, the study’s authors buried the lede on the magnitude of the findings and also included language in the study that appears to be an attempt to block its findings from use by criminal defendants.

“The results described in this article provide only a brief snapshot of DNA mixture interpretation as practiced by participating laboratories in 2005 and 2013. Any overall performance assessment is limited to participating laboratories addressing specific questions with provided data based on their knowledge at the time,” reads the study’s language cited by Hampikian as potential Daubert carve-out language. “Given the adversarial nature of the legal system, and the possibility that some might attempt to misuse this article in legal arguments, we wish to emphasize that variation observed in DNA mixture interpretation cannot support any broad claims about ‘poor performance’ across all laboratories involving all DNA mixtures examined in the past.”

Real World Impacts

Before the findings saw the light of day in a peer-reviewed publication, Hampikian pointed to the real-world implications of not having the study in the public domain for Daubert purposes. In one case in which he was involved as an expert consultant, he tried to have the PowerPoint slides introduced. But because the results had not yet been published in a peer-reviewed journal, the judge denied the motion under the Daubert standard.

“This is about five years of people being convicted by bad interpretations,” Hampikian told Forensic Magazine back in May 2018 in response to the dynamics at play at the time. “This is a huge story. It’s the problem with DNA—the one everyone trusts. If it was contaminated peanut butter, or faulty airplanes, or airbags that failed, wouldn’t NIST have felt compelled to do something more than just a couple PowerPoint shows? This is not just a case of salmonella—this is 20 years in prison. Some of these people died in prison.”

Only time will tell whether the findings in this new peer-reviewed study will make a dent in preventing, or help in overturning, wrongful convictions. Or perhaps the carve-out language pointed out by Hampikian will maintain the status quo, resulting in more people being wrongfully convicted based, at least in part, on faulty DNA evidence.

---

Sources: biology.boisestate.edu, strbase.nist.gov, forensicmag.com, nytimes.com, fsigenetics.com, criminallegalnews.org 
