Matthew J Mazzella1, Dana Boyd Barr2, Kurunthachalam Kannan3, Chitra Amarasiriwardena4, Syam S Andra4, Chris Gennings4. 1. Department of Environmental Medicine and Public Health, Icahn School of Medicine at Mount Sinai, New York, NY, USA. matthew.mazzella@mssm.edu. 2. Department of Environmental Health, Rollins School of Public Health, Emory University, Atlanta, GA, USA. 3. Department of Pediatrics and Department of Environmental Medicine, New York University School of Medicine, New York, NY, USA. 4. Department of Environmental Medicine and Public Health, Icahn School of Medicine at Mount Sinai, New York, NY, USA.
Abstract
BACKGROUND: The Children's Health Exposure Analysis Resource (CHEAR) program allows researchers to expand their research goals by offering assessment of environmental exposures in previously collected biospecimens. Samples are analyzed by one of the six laboratory hubs in the CHEAR network, which together can assess a wide array of environmental chemicals. The ability to assess inter-study variability is important for researchers who want to combine datasets across studies and laboratories. OBJECTIVE: Herein we establish a process for evaluating inter-study variability for a given analytic method. METHODS: Common quality control (QC) pools in urine at two concentration levels (A and B) were created within CHEAR for insertion into each batch of samples tested, at a rate of three samples of each pool per 100 study samples. We assessed the QC pool results for seven phthalates analyzed in five CHEAR studies by three different lab hubs, using multivariate control charts to identify out-of-control runs, i.e., sets of samples associated with a given QC sample. We then tested the conditions that lead to an out-of-control run by simulating outliers in an otherwise in-control set of 12 trace elements in blood QC samples (NIST SRM 955c). RESULTS: When phthalates were assessed within study, we identified a single out-of-control run in two of the five studies. When QC results were combined across lab hubs, all runs from these two studies were in-control, while multiple runs from two other studies were pushed out-of-control. In our simulation study, three to six analytes with outlier values (5×SD) within a run pushed that run out of control in 65-83% of simulations, respectively. SIGNIFICANCE: We show how acceptable bounds of variability can be established for a given analytic method by evaluating QC materials across studies using multivariate control charts.
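The abstract does not specify which multivariate control chart was used; Hotelling's T² is the standard choice for monitoring several correlated analytes per run and is used here purely as an illustration. The sketch below runs on simulated QC data (all values, run counts, and the chi-square control limit are hypothetical, not taken from the paper): in-control reference runs define the mean vector and covariance, and a monitored run with an injected 5-SD shift across all seven analytes is flagged as out-of-control.

```python
import numpy as np

rng = np.random.default_rng(42)
n_ref, n_new, p = 25, 5, 7  # reference runs, monitored runs, analytes

# Hypothetical QC-pool results: each row is one analytic run, each column one
# phthalate analyte (simulated values, not the paper's data).
ref = rng.normal(loc=100.0, scale=5.0, size=(n_ref, p))
new = rng.normal(loc=100.0, scale=5.0, size=(n_new, p))
new[4] += 5 * 5.0  # inject a 5-SD shift in every analyte of the last run

# Phase I: estimate the in-control mean vector and covariance from reference runs
mu = ref.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(ref, rowvar=False))

# Phase II: Hotelling's T^2 statistic for each monitored run
diff = new - mu
t2 = np.einsum("ij,jk,ik->i", diff, cov_inv, diff)

# Upper control limit from the chi-square approximation:
# tabulated chi2 quantile at alpha = 0.01 with df = 7 analytes
ucl = 18.48

out_of_control = np.where(t2 > ucl)[0]
print("T^2 per run:", np.round(t2, 1))
print("Out-of-control runs:", out_of_control)
```

In practice the reference distribution would come from the pooled QC samples themselves, and an F-based limit is often preferred to the chi-square approximation when the number of reference runs is small.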