Television crime dramas and docudramas have, for decades, lulled the public into accepting the infallibility of forensic crime science. A groundbreaking study by the National Academy of Sciences (“NAS”), however, has cast serious doubt on many of the techniques investigators have used to convict defendants. The study, conducted by legal, technical, and policy experts authorized by Congress in 2005, was tasked with investigating the reliability of forensic science.
According to S.J. Nightingale with the School of Information at the University of California, Berkeley, “The NAS Report” published in 2009, “calls for a broad and deep restructuring of how forensic techniques are validated and applied, and how forensic analysts are trained and accredited.” The report determined, “[with] the exception of nuclear DNA analysis ... no forensic method has been rigorously shown to have the capacity to consistently, and with a high degree of certainty, demonstrate a connection between evidence and a specific individual or source.”
One highly questionable technique the FBI has routinely used over the past 50 years to secure convictions is photographic pattern analysis, which may be employed to compare something as mundane as the seam pattern on a pair of blue jeans. When subjected to rigorous, unbiased testing, however, the technique proved unreliable: comparing the stitch pattern on jeans worn by a subject captured on surveillance camera during a bank robbery to the pattern on a suspect’s jeans yielded a correct result only 20% of the time.
Alarmingly, FBI examiners claim that photographic comparison is recognized as central evidence in thousands of cases. According to a 2019 article by ProPublica, in many of these cases, juries have been misled by “baseless statistics [posited to show] the risk of errors in their analysis was extremely low.” The ProPublica article goes on to report, “examiners on the forensic Audio, Video and Image Analysis Unit based at the FBI lab in Quantico, Virginia, continue to use similarly flawed methods, and to testify to the precision of these methods.”
U.S. District Judge Jed Rakoff, an ex officio member of the National Commission on Forensic Science, is of the opinion that much of pattern-analysis “science” rests more on examiner intuition than on real science. In his view, it is basically a matter of “hunches” rather than “facts.”
There have been a multitude of instances in which DNA evidence has substantially contradicted pattern analysis. These contradictions often arise in cases using hair follicle analysis.
Before DNA, hair follicle comparison was regularly used to convict defendants, and it was considered to be “reliable forensic science” in its day. As DNA technology progressed, many who had been found guilty largely on the basis of hair follicle analysis and sentenced to prison were later exonerated.
Other defendants have not been as fortunate. Joe Bryan, convicted solely on the basis of another flawed forensic discipline, blood-spatter analysis, has spent the last three decades in a Texas prison for the murder of his wife, a crime he maintains he did not commit.
Pattern analysis of garments worn by suspects continues to be used, and it is not limited to jean stitch patterns. This antiquated and highly dubious forensic technique has expanded to include photo analysis of body segments, comparing skin patterns on the hands, arms, or fingers that appear within a photo.
Richard Vorder Bruegge, an FBI image analyst, has staked his reputation on photo analysis, a technique he claims is almost as reliable as DNA testing itself. In one case, he claimed that the fabric pattern in a plaid shirt worn by a suspect in a surveillance photo generated a “1 in 650 billion match ... give or take a few billion.”
Statisticians and other independent forensic scientists told ProPublica that this assertion, like multiple other statements made by Vorder Bruegge, is preposterous. In one investigation, a multiple bank robbery case that earned the accused, Wilbert McKreith, a 92-year prison sentence, Vorder Bruegge’s photo analysis was the only evidence directly connecting McKreith to the spree of robberies.
Karen Kafadar, chairwoman of the Statistics Department at the University of Virginia, has called Vorder Bruegge’s statements “brazen.” She notes that studies on the matching of various suspect features, such as skin, faces, arms, hands, or clothing, have yielded alarming inconsistencies: given the same sample image, analyses directed to focus on different features of the same individual arrived at inconsistent and dissimilar conclusions.
For most of its existence since 1932, the FBI crime lab has enjoyed an unchallenged reputation for reliability. That reputation was finally shaken when recent scientific advancements, such as DNA analysis, upended the long-accepted myths of forensic “science.”
In 1995, Fredric Whitehurst, a chemist on the bureau’s Explosives Unit, testified that inaccurate reports had been generated in the first World Trade Center bombing in New York. The Justice Department’s Office of the Inspector General investigated Whitehurst’s allegations and found in 1997 that “significant instances of testimonial errors, substandard analytical work, and deficient practices” existed in several high-profile cases, including the World Trade Center and the Oklahoma City bombings.
After finding these problems in the Explosives Unit, the Justice Department began reviewing hair and fiber analysis in hundreds of other cases. The department found irreconcilable irregularities in at least 250 cases but refused to make its findings public. It also failed to notify attorneys for the defendants in those cases.
Another unit at the FBI Lab had been matching bullets based on their chemical composition. FBI chemists asserted that certain compounds identified in a crime-scene bullet could be matched with compounds found in bullets possessed by a suspect. The National Academies of Sciences, Engineering, and Medicine, however, found that such assertions could not, and should not, be made. Its report suggested that bullet compositions varied so widely that a similar composition could exist in anywhere from 12,000 to 35 million other bullets, hardly a reliable metric by anyone’s measure.
In 2016, under President Obama’s administration, the President’s Council of Advisors on Science and Technology took up the issue. “Advisors,” ProPublica reported, “highlighted the lack of validation in several pattern evidence fields [and] called on the FBI to increase spending on studies to prove its methods.” Not only did the Department of Justice ignore the advisors’ conclusions, it has since doubled down on federal law enforcement’s reliance upon unproven forensic science. ProPublica reported in 2017 that then-Attorney General Jeff Sessions closed the National Commission on Forensic Science, “ending an effort to set standards for crime laboratory practices.”
The only “check and balance” left in place is the Daubert standard. Daubert v. Merrell Dow Pharmaceuticals, Inc., 509 U.S. 579 (1993). This standard, upon which federal judges depend to evaluate accuracy and reliability, asks “whether the reasoning or methodology underlying the testimony [of an alleged expert witness] is ‘scientifically valid’ before allowing it at trial.” Id. Unfortunately, the standard is often neglected and, even when applied, is imperfect at best.
Given the overzealous and adversarial conduct of judges and prosecutors over the past three decades, it seems highly probable that junk forensic science will continue to be used as a prosecutorial tool, to the detriment of truthful and accurate justice.