
Production of high-fidelity electropherograms results in improved and consistent DNA interpretation: Standardizing the forensic validation process.

Kelsey C Peters, Harish Swaminathan, Jennifer Sheehan, Ken R Duffy, Desmond S Lun, Catherine M Grgicak.

Abstract

Samples containing low copy numbers of DNA are routinely encountered in casework. The signal acquired from these sample types can be difficult to interpret because it does not always contain the full genotypic information of each contributor; this loss of genetic information is associated with sampling and detection effects. The present work focuses on developing a validation scheme that mitigates the latter. We establish a scheme designed to simultaneously improve signal resolution and detection rates, without costly large-scale experimental validation studies, by applying a combined simulation- and experiment-based approach. Specifically, we parameterize an in silico DNA pipeline with experimental data acquired from the laboratory and use it to evaluate a wide range of scenarios in a cost-effective manner. Metrics such as single-copy signal-to-noise resolution and false positive and false negative signal detection rates are used to select tenable laboratory parameters that result in high-fidelity signal in the single-copy regime. We demonstrate that the metrics acquired from simulation are consistent with experimental data obtained from two capillary electrophoresis platforms and various injection parameters. Once good resolution is obtained, analytical thresholds can, if necessary, be determined using detection error tradeoff analysis. Decreasing the limit of detection of the forensic process to one copy of DNA is a powerful mechanism by which to increase the information content on minor components of a mixture, which is particularly important for probabilistic system inference. If the forensic pipeline is engineered such that high-fidelity electropherogram signal is obtained, then the likelihood ratio (LR) of a true contributor increases and the probability that the LR of a randomly chosen person exceeds one decreases. This is, potentially, the first step towards standardization of the analytical pipeline across operational laboratories.
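The threshold-selection idea described in the abstract can be illustrated with a minimal sketch. The distributions, RFU values, and helper `det_point` below are hypothetical stand-ins (not taken from the paper): noise and single-copy peak heights are drawn from assumed Gaussian distributions, and a candidate analytical threshold (AT) is swept to trace false positive and false negative detection rates, as in a detection error tradeoff analysis.

```python
import random

random.seed(7)

# Hypothetical peak-height distributions (assumed for illustration only):
# baseline noise peaks and true single-copy allele peaks, in RFU.
noise = [random.gauss(20, 8) for _ in range(10_000)]
signal = [random.gauss(120, 40) for _ in range(10_000)]

def det_point(threshold):
    """Return (false positive rate, false negative rate) at a candidate AT.

    A false positive is a noise peak at or above the threshold; a false
    negative is a true single-copy peak that falls below it.
    """
    fp = sum(n >= threshold for n in noise) / len(noise)
    fn = sum(s < threshold for s in signal) / len(signal)
    return fp, fn

# Sweep candidate thresholds to trace a detection error tradeoff curve.
for t in (10, 30, 50, 70):
    fp, fn = det_point(t)
    print(f"AT = {t:3d} RFU  FP = {fp:.3f}  FN = {fn:.3f}")
```

Raising the threshold trades false positives for false negatives; the paper's point is that improving signal resolution first (separating the two distributions) shrinks both error rates at whatever threshold is ultimately chosen.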
Copyright © 2017 Elsevier B.V. All rights reserved.

Keywords:  Analytical threshold; Forensic DNA; Limit of detection; Signal to noise; Single copy DNA; Validation

Year:  2017        PMID: 28950155     DOI: 10.1016/j.fsigen.2017.09.005

Source DB:  PubMed          Journal:  Forensic Sci Int Genet        ISSN: 1872-4973            Impact factor:   4.882


  2 in total

1.  Towards developing forensically relevant single-cell pipelines by incorporating direct-to-PCR extraction: compatibility, signal quality, and allele detection.

Authors:  Nidhi Sheth; Harish Swaminathan; Amanda J Gonzalez; Ken R Duffy; Catherine M Grgicak
Journal:  Int J Legal Med       Date:  2021-01-23       Impact factor: 2.686

2.  Four model variants within a continuous forensic DNA mixture interpretation framework: Effects on evidential inference and reporting.

Authors:  Harish Swaminathan; Muhammad O Qureshi; Catherine M Grgicak; Ken Duffy; Desmond S Lun
Journal:  PLoS One       Date:  2018-11-20       Impact factor: 3.240


Beijing Coyote Bioscience Co., Ltd. © 2022-2023.