NDAA Responds to Scientific American Article on Firearms Forensics

Contributing Author: NDAA’s Forensic Science Working Group


Dear Editors:
editors@sciam.com

At a time when gun violence is surging, it is unfortunate that the authors of “The Field of Firearms Forensics is Flawed” argue for the removal of a tried-and-true forensic methodology that helps hold shooters accountable. The authors make false and misleading claims throughout the article; some demand a response. While the authors present themselves as neutral arbiters of “science,” they in fact stand to benefit financially and professionally from their “anti-expert” approach.

As the oldest and largest association representing state and local prosecutors in the country, the National District Attorneys Association (NDAA) advocates for the use of reliable forensics to exonerate the innocent and inculpate the guilty. With more than 5,000 members nationwide, NDAA is recognized as the leading source of national expertise on the prosecution function, including forensic science, and is a valuable resource for the media, academia, government, and community leaders. The NDAA strongly contests several unsupported claims in this opinion piece and will address each one in the order presented by the authors.

The authors’ claim that the discipline lacks appropriately designed validation studies is simply false. The authors themselves admit later in the article that several studies proving reliability have recently been published. PCAST called for at least two studies of “appropriate design.” Now that there are multiple appropriately designed studies showing firearms examination meets PCAST’s criteria, the authors obfuscate the data.

The authors suggest that relying on firearms examiners to vouch for their discipline is like asking the nurse who administered a vaccine, rather than a scientist, whether it works. First, to determine efficacy in a particular case, you might well ask the nurse who administered the shot and then observed over time whether the patient became ill with a virus or suffered side effects of the vaccine. Second, there are dozens of researchers in the firearms and toolmark discipline who are classically trained scientists. Many are authors of the studies that Faigman/Scurich/Albright ignore. These scientists hold PhDs in fields such as materials science and engineering, applied research, statistics, and chemistry. Meanwhile, the authors have not personally conducted any empirical research, whether on examiners’ error rates or on emerging technology. The authors’ claimed expertise is that they have read studies. One can read a book on baseball, but that will not make one good at baseball.

Courts have said, and the law is clear, that only members of the relevant scientific community may opine on the acceptance of a forensic methodology. In both Frye and Daubert, the concept of “general acceptance in the relevant scientific community” is axiomatic. The self-exaltation of the authors does not overcome black-letter law. In fact, one of the authors was recently excluded from testifying before a California jury because the judge found he lacked the appropriate expertise. It is bizarre that the authors, who could not perform any aspect of firearm and toolmark microscopy, deem themselves thoroughly qualified to criticize the discipline. While it is abundantly clear that they could not reliably recognize matching toolmarks, that fact does not mean experienced and qualified forensic practitioners are similarly inept.

Why did the authors write this article? Perhaps, to maintain their business model as expert witnesses, they need to redefine “scientific” acceptance; otherwise, they would have no role whatsoever in a criminal courtroom. Their publication in Scientific American may be an attempt to persuade a judge that their personal beliefs carry the imprimatur of “real” scientists when, in fact, they do not.

In 2016, PCAST concluded that the single study it deemed “appropriately designed” was not enough. In 2017, the co-chair of PCAST, Dr. Eric Lander, wrote in the Fordham Law Review, “With only a single well-designed study estimating accuracy, PCAST judged that firearms analysis fell just short of the criteria for scientific validity, which requires reproducibility. A second study would solve this problem.”


Would it surprise the reader to know that there have been further studies? It may seem baffling at first that the authors neglect to mention the “appropriately designed” studies that followed PCAST. This omission is important and appears deliberate. At least four studies published since 2016 [Kiesler 2018, Smith 2020, AMES II 2022 (pre-publication), Iowa State 2022 (pre-publication)] meet PCAST’s prescribed standards. All show that examiners are remarkably accurate when they call something a match; the false positive error rates in these studies are less than 1%. Acknowledging this easily ascertainable fact would completely upend the authors’ arguments.
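
For readers unfamiliar with how such figures are reported, the short sketch below shows the standard false positive rate calculation. The counts are hypothetical, chosen only to illustrate the arithmetic; they are not drawn from any of the studies cited above.

```python
# Illustrative false positive rate calculation for an examiner study.
# The counts below are hypothetical, NOT taken from any cited study.
false_positives = 4     # different-source pairs wrongly called a "match"
true_negatives = 1996   # different-source pairs correctly eliminated

# A false positive rate is the share of conclusive different-source
# comparisons that were erroneously reported as identifications.
fp_rate = false_positives / (false_positives + true_negatives)
print(f"False positive rate: {fp_rate:.2%}")  # -> 0.20%, i.e. under 1%
```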

The suggestion that criticism of PCAST came only from the firearms community is misleading. Firearms examiners were just one group of many PCAST critics; PCAST was disparaged by numerous prominent forensic science organizations. For example, the American Society of Crime Laboratory Directors (ASCLD), a well-known and respected forensic science group, said of PCAST, “The report seems to favor that all scientific evaluation activities be performed completely separate from scientists with direct forensic science experience. ASCLD strongly disagrees with the removal of forensic scientists from the evaluation of scientific integrity or technical merit of analyses. ASCLD supports the involvement of academic scientists in the process, but strongly disagrees that these evaluations should be performed in a vacuum devoid of participation by the forensic scientists who can impart an applied knowledge and understanding to the research.” The American Congress of Forensic Science Laboratories noted that PCAST “… was born of an imbalanced and inexperienced working group whose make-up included no forensic practitioners nor any other professionals with demonstrated experience in the practice of forensic science.”

Prominent experts in forensic DNA also decried what they deemed shoddy work by PCAST. President Obama’s own Attorney General, Loretta Lynch, announced that the Department of Justice would not follow PCAST’s recommendations.

It is noteworthy that PCAST did not correct the many errors noted by practitioners who reviewed the draft during the comment period. Nor did PCAST subject its report to a formal peer-review process before publication. Scientists know that peer review is foundational to “good science.”

Near the conclusion of their article, the authors acknowledge that there have been additional studies with false positive error rates of less than 1%; their term “amazingly” conveys disbelief that experienced practitioners could be good at their jobs. To avoid this inconvenient fact, the authors move the goalposts by attempting to manipulate the statistics. The outlier statistical approach they propose, which would score inconclusive responses as errors, has never been adopted by the scientific community. In fact, under Professor Faigman’s watch, PCAST did not adopt the statistical approach the authors now proffer. Clearly, statisticians do not agree with the authors’ “anti-expert expert” approach.

The authors’ handling of inconclusive results demonstrates a profound misunderstanding of firearm and toolmark analysis. Inconclusive decisions are commonplace in casework as well as in studies, and the reasons for them are apparent to any trained examiner: there is no guarantee that a sufficient quantity and quality of toolmarks will be present on ballistic evidence in any particular case. This is true for pristine laboratory samples as well as for casework. Since the authors have never personally analyzed firearm evidence, it is no surprise they fail to recognize this basic concept.

Inconclusive is, indeed, the correct “scientific” conclusion for poorly marked ballistic evidence, regardless of the ground truth. Moreover, the most salient issue in a criminal case is whether a ballistics identification is accurate; inconclusive decisions are conservative and do not put people in jail.
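
A small worked example makes the statistical stakes concrete. Under the hypothetical tally below (illustrative numbers only, not from any cited study), reporting errors among conclusive decisions yields a rate under 1%, while rescoring every inconclusive as an error, the kind of approach criticized above, inflates the apparent rate many times over.

```python
# Hypothetical different-source comparisons from a validation study.
# All counts are illustrative, NOT taken from any cited study.
correct_eliminations = 1900   # correct "no match" calls
inconclusives = 96            # insufficient marks to reach a decision
false_positives = 4           # erroneous "match" calls

conclusive = correct_eliminations + false_positives

# Conventional reporting: errors among conclusive decisions only.
conventional = false_positives / conclusive
print(f"Conventional false positive rate: {conventional:.2%}")      # 0.21%

# Alternative scoring: every inconclusive counted as an error.
rescored = (false_positives + inconclusives) / (conclusive + inconclusives)
print(f"Rate with inconclusives scored as errors: {rescored:.2%}")  # 5.00%
```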

Conclusion:

We implore the reader to examine the data that establish the reliability of firearm and toolmark analysis. When viewed fairly, these data decisively support the scientific validity of the discipline. The members of NDAA, who are the “boots on the ground” in courtrooms throughout this country, know from experience that firearm evidence is scientifically sound and able to withstand rigorous testing in the crucible of the courtroom. While the authors of “The Field of Firearms Forensics is Flawed” may find it profitable to criticize the field as paid expert witnesses, their incentive-based opinions are at odds with the many scientific studies supporting forensic firearm analysis.

As John Adams, the criminal defense attorney who became President of the United States, once said, “Facts are stubborn things; and whatever may be our wishes, our inclinations, or the dictates of our passions, they cannot alter the state of facts and evidence.”

National District Attorneys Association