Study Finds Lack of Uniformity in New DNA Technology
by Jayson Hawkins
Forensic DNA evidence has been used since the 1980s. Public confidence in and familiarity with the method grew in the wake of the O.J. Simpson trial and the popularity of television police procedurals. But the traditional techniques for gathering and analyzing blood and semen samples have given way to far murkier methods that rely on trace evidence: “touch DNA” (skin cells and other material left behind on an object someone handles) and “DNA mixtures” (small amounts of DNA from several people found in a single sample).
Analyzing samples in these new, complex fields presents multiple challenges: distinguishing one person’s DNA from another’s in a mixture, estimating how many individuals contributed DNA, determining whether the DNA is relevant to the case or the product of contamination, and detecting whether a trace amount of suspect or victim DNA is present in the sample. Because the technology is new, and because laboratories are reluctant to share data due to intellectual property and genetic privacy concerns, there is not only a lack of uniform analytical methods but also no clear or accepted way to compare the reliability of results from multiple labs.
These concerns, as well as the significance DNA analysis can have in the lives of people involved in criminal cases, inspired Congress in 2018 to direct the National Institute of Standards and Technology (“NIST”) to identify priorities for future research, help laboratories identify appropriate limitations on the use of forensic methods, and suggest steps for moving the field forward. In June 2021, NIST published “DNA Mixture Interpretation: A Scientific Foundation Review,” which catalogs both the problems and possibilities of this evolving field.
DNA mixtures are inherently more difficult to process than single-source samples, and the profiles generated from mixtures can contain random variation and artifacts. Given the variety of factors that contribute to this complexity, the NIST report found that probabilistic genotyping software (“PGS”) methods utilize more information and generate more consistent results than traditional binary approaches. PGS methods produce likelihood ratios (“LRs”) rather than binary “match/no match” results. LRs are not clear-cut answers; they represent probabilities, and the study found that they vary greatly depending on personnel, protocols, and computational algorithms. Standardizing PGS methods and the resulting LRs is essential for the future analysis of mixed DNA samples.
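For readers unfamiliar with the term, a likelihood ratio in forensic genetics is conventionally defined as the probability of the observed evidence under one hypothesis divided by its probability under a competing hypothesis (this is the standard formulation in forensic statistics, not a formula quoted from the NIST report):

```latex
LR = \frac{\Pr(E \mid H_p)}{\Pr(E \mid H_d)}
```

Here $E$ is the DNA profile recovered from the sample, $H_p$ is the hypothesis that the suspect contributed to the mixture, and $H_d$ is the hypothesis that an unknown, unrelated person did instead. An LR of 1,000, for example, means the evidence is 1,000 times more probable if the suspect contributed than if he did not; it is a weight of evidence, not a statement of guilt or a “match.”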
The study also found that there is little empirical data to allow different labs to meaningfully compare reports on the same DNA mixture. Even when comparable reliability can be assessed, “there is no threshold or criteria established to determine what is an acceptable level of reliability.” As a practical matter, this means that there is no accepted method for comparing results between DNA mixture tests run by labs working for prosecutors and defendants in a particular case.
The study also raised questions about highly sensitive DNA methods detecting irrelevant material or trace evidence when examining a mixture. Across the 13 laboratories NIST examined, it found no uniform methods for negating the effects of irrelevant material and contamination.
The overall conclusion of the NIST study is that while the technologies associated with DNA mixture analysis have grown increasingly sophisticated, the methods and procedures used to collect, examine, and interpret the resulting data have not kept up. As these new technologies become more widely available, uniform practices and a system of comparative analysis must be developed to ensure accurate results.
Source: forensicmag.com, NISTIR 8351 - DRAFT DNA Mixture Interpretation: A NIST Scientific Foundation Review