
Modern Forensics Findings Not Always 100 Percent Reliable

by Ed Lyon 

An extremely important and informative study concerning forensic comparison matches, sponsored by The Royal Statistical Society, was recently published. It represents the cooperative efforts of attorney Dana M. Belger of the Innocence Project’s Strategic Litigation Unit, statisticians Bill Eddy and Robin Mejia of Carnegie Mellon University, and Maria Cuellar, Ph.D., of the University of Pennsylvania’s Criminology Department. 

This study takes an unusual approach to crime-related forensics: it is written from the standpoint of understanding the probability that crime scene evidence presented to a petit jury indicates the accused’s innocence rather than guilt. The publication explains forensic evidence in an easy-to-understand format, showing that television dramas about criminal forensics like Bones and CSI leave a lot to be desired as far as real-world criminal justice matters are concerned. 

The study opens with the FBI’s fingerprint misidentification from a bag fragment that, when intact, had held some of the explosives used in the 2004 terrorist railway bombings in Spain. Traditional fingerprint lore teaches that no two people in the world, including identical twins, have the same fingerprints. Even so, a “senior fingerprint examiner” employed by the FBI managed to erroneously match that print to Brandon Mayfield, an attorney living in Oregon. 

This misidentification resulted in Mayfield being held in custody for the two weeks it took for Spanish police to determine that the print actually belonged to a man living in Spain. Yet the FBI analyst’s match to Mayfield had been verified by the chief of the FBI’s Latent Print Unit, a man with over 30 years of experience working with fingerprints. As it turned out, “the unusual similarity of details on the fingers of Mayfield and the true source of the print ... confused the FBI laboratory examiners, and was an important factor contributing to the erroneous identification” of Mayfield. 

The study points out that this was far from an isolated incident. There have been more than 2,000 exonerations from wrongful convictions in criminal cases in the United States since 1989, and about 500 of them, or 25 percent, were caused by “false or misleading forensic evidence.” 

Subjective determinations, as opposed to quantitative methods of analysis, often introduce an element of uncertainty into an expert’s conclusions. Many so-called “experts” address that uncertainty in their testimony by claiming to be “100 percent certain,” or by asserting error rates that are “essentially zero,” “vanishingly small,” “microscopic,” or so remote as to be a “practical impossibility.”

As convincing as statements like these are to the trier of fact, statisticians are reminded of the old adage that “even though figures don’t lie, any fool can still figure.” Those testimonial phrases raise red flags because they convey only the testifying expert’s opinion and belief about whether or not two given items actually match, while failing altogether to evaluate the evidentiary value of the claimed match. 

The study suggests a two-step analytical process for a forensic analyst to use when assessing whether evidence connects a suspect to a crime. The first step is to determine “whether the crime scene and suspect-associated evidence appear to be similar, and, if so, what that similarity means.” That is, does the crime scene evidence “match” a sample from the suspect under a specifically defined set of match criteria or categories? The second step is to determine the probability that a match between the actual crime scene evidence and a suspect-associated sample could occur by random chance. 
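To make that two-step logic concrete, the following is a minimal sketch in Python. The feature names, population frequencies, and the assumption that features occur independently are hypothetical illustrations, not figures from the study or from any real fingerprint database; the point is only that step one declares a match under a stated criterion, while step two asks how often such a match would arise by random chance in a reference population.

```python
# Illustrative sketch of the two-step analysis described above.
# All feature names and frequencies are hypothetical, for illustration only.

def step_one_match(crime_scene_features: set, suspect_features: set,
                   required_overlap: int) -> bool:
    """Step 1: do the two samples 'match' under a defined criterion?
    Here the criterion is sharing at least `required_overlap` categorical
    features (e.g., ridge characteristics)."""
    return len(crime_scene_features & suspect_features) >= required_overlap

def step_two_random_match_probability(shared_features: set,
                                      population_frequencies: dict) -> float:
    """Step 2: probability that a randomly chosen person would also 'match',
    assuming (simplistically) that the features occur independently."""
    probability = 1.0
    for feature in shared_features:
        probability *= population_frequencies[feature]
    return probability

# Hypothetical example data
crime_scene = {"whorl_right_thumb", "double_loop", "ridge_count_14"}
suspect = {"whorl_right_thumb", "double_loop", "ridge_count_14"}
frequencies = {"whorl_right_thumb": 0.25, "double_loop": 0.04, "ridge_count_14": 0.10}

if step_one_match(crime_scene, suspect, required_overlap=3):
    rmp = step_two_random_match_probability(crime_scene & suspect, frequencies)
    print(f"Samples match; random match probability is about {rmp:.3%}")
```

In these made-up numbers the samples “match,” yet roughly one person in a thousand would match just as well by chance, which is precisely the kind of qualification the study says fact-finders need to hear alongside any claimed match.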

Returning to the fingerprint misidentification: given all of the loops, whorls, and many other categorical fingerprint similarities, a subjective (human-driven) comparison of Mayfield’s fingerprints with the print on the bag fragment may well produce a “visually indistinguishable” match. However, a computer-driven (quantitative) algorithmic analysis would have shown the prints to be distinguishable, because a quantitative analysis supported by an adequate reference population accounts for the probability of a random chance match. 

The FBI maintains an updated, streamlined computer algorithmic fingerprint program called the Next Generation Identification (“NGI”) system. A similar system, the National Integrated Ballistics Information Network (“NIBIN”), exists to identify ballistic evidence such as ammunition and cartridge casings. The vast amount of information concerning all aspects of fingerprints, coupled with the other biometric information contained in the NGI system, can produce fingerprint matches with virtual certainty when quantitatively analyzed. The same results may reasonably be expected for ballistics when quantitative analyses are done using the NIBIN system. Such systems greatly reduce the likelihood of an examiner being mistaken in his or her conclusion that an item of trace evidence matches or is linked to a suspect. 

The study highlights the widely known fact that the vast majority of DNA evidentiary analysis errors are attributable to the examiner rather than to chance matches. In promoting any reform or refinement of the forensic sciences, two things must be clearly understood. The first is “the probability that the evidence from the crime scene matches a sample from a suspect and the probability that this match could occur by chance.” The second is that “adequate databases of forensic evidence must be generated” and be readily available in order to evaluate the probability of a random chance match. 
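Statisticians commonly combine those two quantities in a likelihood ratio; the formulation below is a standard one from forensic statistics rather than a quotation from the study itself:

\[
\mathrm{LR} \;=\; \frac{P(\text{observed correspondence} \mid \text{same source})}{P(\text{observed correspondence} \mid \text{different sources})}
\]

The denominator is the random match probability, which can only be estimated from the kind of adequate reference database the study calls for.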

The study concludes by pointing out that in subjective trial testimonial presentations, “experts must present their analysis of the data in a transparent way [before a jury or a judge], by describing their modeling and data assumptions.” 

---

Source: https://rss.onlinelibrary.wiley.com
