
Statewide Quantitative Microbial Risk Assessment for Waterborne Viruses, Bacteria, and Protozoa in Public Water Supply Wells in Minnesota.

Tucker R Burch1,2, Joel P Stokdyk2,3, Nancy Rice4, Anita C Anderson4, James F Walsh4, Susan K Spencer1,2, Aaron D Firnstahl2,3, Mark A Borchardt1,2.   

Abstract

Infection risk from waterborne pathogens can be estimated via quantitative microbial risk assessment (QMRA) and is an important consideration in the management of public groundwater systems. However, few groundwater QMRAs use site-specific hazard identification and exposure assessment, so prevailing risks in these systems remain poorly defined. We estimated the infection risk for nine waterborne pathogens based on a 2-year pathogen occurrence study in which 964 water samples were collected from 145 public wells throughout Minnesota, USA. Annual risk across all nine pathogens combined was 3.3 × 10–1 (95% CI: 2.3 × 10–1 to 4.2 × 10–1), 3.9 × 10–2 (2.3 × 10–2 to 5.4 × 10–2), and 1.2 × 10–1 (2.6 × 10–2 to 2.7 × 10–1) infections person–1 year–1 for noncommunity, nondisinfecting community, and disinfecting community wells, respectively. Risk estimates exceeded the U.S. benchmark of 10–4 infections person–1 year–1 in 59% of well-years, indicating that risk was widespread. While the annual risk for all pathogens combined was relatively high, the average daily doses for individual pathogens were low, indicating that substantial risk results from sporadic pathogen exposure. Cryptosporidium dominated annual risk, so improved identification of wells susceptible to Cryptosporidium contamination may be important for risk mitigation.

Keywords:  Cryptosporidium; disinfection; drinking water; groundwater; public water supply wells; quantitative microbial risk assessment (QMRA)

Year:  2022        PMID: 35507527      PMCID: PMC9118547          DOI: 10.1021/acs.est.1c06472

Source DB:  PubMed          Journal:  Environ Sci Technol        ISSN: 0013-936X            Impact factor:   11.357


Introduction

An estimated 2.2 billion people rely on groundwater as a source of drinking water globally, including approximately 140 million in the United States (U.S.).[1] Though often regarded as a pristine drinking water source, groundwater can be contaminated by waterborne pathogens that cause outbreaks of acute gastrointestinal illness (AGI) as well as sporadic disease.[1−3] Quantitative microbial risk assessment (QMRA) is one means to quantify risk, and AGI risk estimates in various groundwater settings range from less than 10–2 to as much as 2.3 × 10–1 AGI cases person–1 year–1.[4−8] Broadly speaking, these estimates are consistent with epidemiological assessments of AGI risk for groundwater and with more general national-level estimates for public drinking water systems in the U.S. and Canada (i.e., those relying on surface water instead of or in addition to groundwater).[6,9−13] However, few previous risk estimates for groundwater are based on site-specific hazard identification and exposure assessment. Most are based on individual index pathogens (e.g., verotoxigenic Escherichia coli in Ireland),[14] pathogen concentrations extrapolated from the published scientific literature,[6,8] or a combination of the two.[4,5] Few groundwater QMRAs have considered more than one index pathogen with concentrations measured in situ, as was done for a peri-urban region of Argentina and a rural region in Wisconsin, USA.[7,15] Lack of site-specific pathogen data is a recognized limitation of many QMRAs,[16] one that affects site-specific risk management at the level of individual states or even individual drinking water systems. Furthermore, public water regulations in the U.S. aim to apply site-specific risk management by classifying the source water (based on monitoring and physical factors) and by addressing specific pathogen hazards appropriate to each classification.
Separate pathogen removal criteria for viruses, bacteria, and protozoa may be specified, and regulation or guidance is provided for acceptable treatment technologies that meet the removal criteria. For example, public water supplies regulated under the Surface Water Treatment Rules (including groundwater under the direct influence of surface water (GWUDI)) must treat by filtration and disinfection for Cryptosporidium, Giardia, and viruses.[17−19] In contrast, groundwater sources determined to be fecally contaminated under the Groundwater Rule are required to treat for viruses (or perform other corrective action) but not protozoa.[20] Thus, proper classification of water sources, along with site-specific hazard identification and exposure assessment, is crucial. This study presents a QMRA for public wells throughout Minnesota, USA, where our objective was to characterize the expected population-level risk at the level of individual wells and across major public water supply (PWS) types. The QMRA was based on a large and inclusive occurrence study that included a broad suite of pathogens, a diverse array of microbiological measurements, and repeated sampling of 145 public wells.[21,22] Twenty-one percent of samples and 70% of sampled wells were positive for at least one pathogen during the study period.[22] The current study estimates risk (infections person–1 year–1) for individual PWS wells and examines variation in risk by well, PWS type, and disinfection status. Our results consider current U.S. groundwater regulations and highlight potential areas for improvement of their implementation in Minnesota and elsewhere in the U.S.

Materials and Methods

Overview

The QMRA is based on a site-specific pathogen occurrence study conducted in Minnesota, USA.[21,22] Groundwater wells were sampled once every 2 months over 2 consecutive study years, with 60 wells sampled in year 1 (2014–2015), 56 wells sampled in year 2 (2015–2016), and 29 wells sampled in both years. Wells in year 1 were randomly selected from among the year-round nondisinfecting systems in the state with one exception: one disinfecting well (sampled in both years) was misclassified as nondisinfecting during well selection. Wells in year 2 were selected without regard to disinfection status and by prioritizing systems in sensitive geologic settings. Well selection produced a sample of wells that represented the five main aquifer types (sand and gravel, sandstone, crystalline rocks, carbonate rocks, and mixed rocks) and a range of geologic sensitivities of all public wells statewide.[21,22] A total of 964 samples were collected from 145 wells, including community (n = 88), noncommunity (n = 57), disinfecting (n = 40), and nondisinfecting (n = 105) PWS wells. Public water supplies regularly serve 25 or more of the same people for >60 days per year. Community water supplies serve residents; noncommunity supplies serve people outside of their homes (e.g., schools and campgrounds). Disinfecting systems implement chlorine disinfection that provides 4-log inactivation of viruses. Samples were collected prior to treatment and distribution, and measurements of nine waterborne pathogens were made using quantitative polymerase chain reaction (qPCR). Cryptosporidium and Giardia were also measured using immunofluorescence assay (IFA), and Cryptosporidium species were determined by sequencing the 18S rRNA gene, allowing species-specific risks to be addressed.
Laboratory methods and quality controls are described in reports by Stokdyk et al.[21,22] The QMRA is based on well-level risk estimates because these are relevant to the Minnesota Department of Health (MDH) for determining management priorities. Risk estimates were calculated for each well in each year (referred to as a “well-year”) and were typically based on six water samples per well-year. Major inputs included drinking water consumption, pathogen concentrations,[21,22] dose-harmonization parameters, log-removal values for disinfecting wells, and dose–response models. Well-level risk estimates were generated using two-dimensional Monte Carlo (2DMC) simulations implemented with the “mc2d” package in R version 4.1.1.[23,24] Well-level risk estimates were then combined for major PWS categories to examine the overall risk for noncommunity, disinfecting community, and nondisinfecting community wells.
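The 2DMC structure can be sketched in a few lines. The study used the mc2d package in R; the Python sketch below is an illustrative translation only, with all distributions, parameter values, and simulation sizes chosen hypothetically to show how the uncertainty (outer) and variability (inner) dimensions are kept separate.

```python
import numpy as np

rng = np.random.default_rng(1)

N_UNC = 200   # uncertainty dimension (outer): alternative parameter sets
N_VAR = 1000  # variability dimension (inner): simulated exposure days

# Hypothetical uncertain input: dose-harmonization factor H (gene copies per
# infectious unit); one value per uncertainty iteration, shared across days.
H = rng.lognormal(mean=np.log(700), sigma=0.5, size=(N_UNC, 1))

# Hypothetical variable inputs: concentration C (gc/L) varies by day and is
# also uncertain; consumption V (L/day) varies by day only.
C = rng.lognormal(mean=np.log(0.05), sigma=1.0, size=(N_UNC, N_VAR))
V = rng.lognormal(mean=np.log(1.1), sigma=0.6, size=(1, N_VAR))

LRV = 0.0  # nondisinfecting well: no log-removal credit

# Daily dose for every (uncertainty, variability) pair via broadcasting
D = C * V * 10.0 ** (-LRV) / H        # shape (N_UNC, N_VAR)
E_D = D.mean(axis=1)                  # mean daily dose, one per parameter set
ci = np.percentile(E_D, [2.5, 97.5])  # 95% CI in the uncertainty dimension
```

Summary statistics taken across the inner axis describe variability; percentiles taken across the outer axis yield the uncertainty intervals reported with the risk estimates.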

Hazard Identification

The hazard investigated was gastrointestinal infection due to waterborne pathogens ingested via drinking water from public wells. Nine waterborne pathogens quantified by Stokdyk et al. were included: adenovirus, enterovirus, norovirus, Shiga toxin 2-producing bacteria (modeled as EPEC), nontyphoidal Salmonella spp., Cryptosporidium hominis, Cryptosporidium parvum, Cryptosporidium spp., and Giardia duodenalis.[21,22] Pathogens analyzed by Stokdyk et al.[22] that were detected but could not be quantified were excluded from the QMRA, and samples without detected pathogens were considered negative (i.e., dose equal to zero). For example, Campylobacter jejuni was detected once but was excluded because it was not quantified.[22] Rotavirus A was detected but excluded because the qPCR assay could not distinguish wild-type rotavirus from vaccine strains shed in stool, and Cryptosporidium andersoni was detected but excluded because no dose–response model exists for it. We did not estimate risk for pathogens that were not analyzed or detected, consistent with our emphasis on site-specific hazard identification and exposure assessment.

Exposure Assessment

Exposure assessment estimated daily dose (D), average daily dose (E[D]), and the variance of daily dose (Var[D]) for each pathogen at the level of well-year and accounted for the sporadic nature of daily pathogen exposures using 2DMC simulations. Daily dose for the jth pathogen in the ith well-year (i = 1–174; j = 1–9) was calculated as

D(i,j) = C(i,j) × V × 10^–LRV(j) / H(j)

where D is the daily dose, C is the corresponding concentration (gene copies L–1), V is the daily drinking water consumption (L), LRV are log-removal values that vary by pathogen and disinfection status, and H is a pathogen-specific dose-harmonization factor to convert from qPCR units to infectious dose–response units (gene copies per infectious unit; Table S1). The quantities D were estimated using 2DMC simulations as objects with both variability and uncertainty dimensions. Variable inputs to the equation included C and V; uncertain inputs included C and most values of H (see Table S1 for additional details). The quantities LRV were treated as constants. Variability and uncertainty in the distribution of C were estimated by bootstrap sampling (with replacement) of Stokdyk et al.’s concentration measurements at the level of well-year (Table S1); this approach avoids parametric assumptions about the distribution of pathogen concentrations across samples.[21,22] The quantities E[D] were estimated as the arithmetic mean of D in the variability dimension, and Var[D] was estimated as the variance of D in the variability dimension.[23]

Inputs for the equation were drawn from published studies. Values of C were derived from Stokdyk et al.’s occurrence studies, which quantified pathogen concentrations using qPCR in large-volume (140–1783 L) dead-end ultrafiltration samples (n = 964 water samples).[21,22] Equivalent sample volumes analyzed by qPCR were in the range 0.12–17 L. The quantity V was assumed to follow a log-normal distribution with mean equal to 1.1 L/person-day (Table S1). This distribution was selected based on the U.S. Environmental Protection Agency’s (USEPA) Exposure Factors Handbook and to be consistent with assumptions made by MDH for chemical risk assessments.[25] Log-removal values were based on log-removal credits granted by MDH for each pathogen type (Table S1). LRV equaled 4 for viral and bacterial pathogens in disinfecting (i.e., chlorinating) systems and 0 for Giardia and Cryptosporidium. For the small number of systems providing both disinfection and filtration (n = 3), LRV equaled 4 for viruses and bacteria, 3 for Giardia, and 2 for Cryptosporidium. The 4-log LRV credit for viruses and bacteria is granted by MDH based on verification that the product of free chlorine concentration and contact time for each system meets minimum guidance suggested by USEPA.[26] Values of H were extrapolated from available published data. Values of H for Giardia and Cryptosporidium were derived from linear regressions of qPCR measurements against IFA measurements in Stokdyk et al.’s groundwater samples;[21,22] they are therefore directly applicable to the conditions of the current study. In contrast, all other values of H were extrapolated from nongroundwater settings due to limited available data. H for Salmonella was derived from comparison of qPCR measurements to culture-based measurements in Corsi et al.’s surface water samples.[27] We are unaware of estimates of H specific to Shiga toxin 2-producing bacteria, so we extrapolated its value from Salmonella based on their physiological similarity. We adopted Kundu et al.’s value of H for adenovirus,[28] which was derived by considering ratios of 50% tissue culture infectious dose (TCID50) to plaque-forming units (PFU) and gene copies/PFU in wastewater for adenovirus.[29,30] H for enterovirus was derived from comparisons of gene copies/PFU for enteroviruses in laboratory stocks by Jonsson et al.[31] Numerical values for H and their treatment with respect to 2DMC simulations are reported in Table S1.
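The bootstrap treatment of concentrations can be illustrated as follows. The measurements below are hypothetical (the study used Stokdyk et al.'s qPCR data); the point is that resampling the ~6 samples per well-year with replacement captures uncertainty (which resample) and variability (which day's draw) without assuming a parametric concentration distribution, and that non-detects carry through as zero doses. A Python sketch:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical qPCR results for one well-year (gc/L); zeros are non-detects.
samples = np.array([0.0, 0.0, 0.0, 0.8, 0.0, 0.12])

N_UNC, N_VAR = 500, 365

# Uncertainty: bootstrap the six measurements with replacement, once per
# uncertainty iteration; variability: each simulated day draws one value
# from that iteration's resample.
boot = rng.choice(samples, size=(N_UNC, samples.size), replace=True)
idx = rng.integers(0, samples.size, size=(N_UNC, N_VAR))
C = np.take_along_axis(boot, idx, axis=1)   # daily concentrations (gc/L)

# Non-detects propagate as zeros, preserving the sporadic exposure pattern.
frac_zero = (C == 0.0).mean()
```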

Dose–Response Assessment

Dose–response assessment used published parameters for all pathogens. The primary criterion for parameter selection was that the corresponding dose–response models were validated against or developed with observational epidemiological data. This provides some assurance that risk estimates represent wild-type pathogens and a natural mix of host populations, and it applies to the models selected for C. parvum,[32] C. hominis,[33,34] Giardia duodenalis,[35] Shiga toxin 2-producing bacteria (modeled as EPEC),[36,37] and Salmonella spp.[37,38] Similarly, dose–response for Cryptosporidium spp. was based on combining output from the dose–response models for C. parvum and C. hominis (Table S1).[33] Additional rationale for dose–response model selection is provided in the online Supporting Information (pp S3–S4), including species-specific choices for C. parvum and C. hominis. Accounting for Cryptosporidium species is important in the current analysis because our exposure data allow us to identify species, and Cryptosporidium infectivity is known to vary by species.[34,39] Exceptions to our primary model selection criterion included each of the viral pathogens. Observational dose–response data are unavailable for adenovirus and enterovirus, so Crabtree et al.’s adenovirus model was used for the former, while Mena et al.’s coxsackievirus model was used for the latter.[40,41] Both models are based on experimental dose–response data. Norovirus dose–response is notoriously uncertain, so consistent with best practices,[42] we used two norovirus dose–response models, with model selection represented as an uncertain 2DMC input (Table S1). The first was Messner et al.’s fractional Poisson,[43] and the second was Schmidt’s exact β-Poisson with immunity;[44] these represent the highest and lowest published estimates of norovirus infectivity, respectively.
We estimated the proportion of secretor-positive individuals (i.e., those who express FUT2 and are therefore susceptible to norovirus infection) for the populations of interest based on data from Lindesmith et al.[45] In some cases, the best-available dose–response relationships were for illness as the endpoint instead of infection (i.e., for C. hominis, EPEC, and Salmonella). In these cases, we rearranged the equation relating probability of illness (Pill), probability of infection (Pinf), and the conditional probability of illness given infection (i.e., the morbidity ratio Pill|inf) to solve for Pinf = Pill/Pill|inf and used morbidity ratios to estimate Pinf (Table S1).
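The endpoint conversion is a one-line rearrangement. The sketch below pairs it with a generic single-hit exponential dose–response model; the parameter values are illustrative only, not the fitted parameters cited above.

```python
import numpy as np

def p_inf_exponential(dose, r):
    """Single-hit exponential dose-response: infection probability from dose."""
    return 1.0 - np.exp(-r * dose)

def p_inf_from_illness(p_ill, morbidity_ratio):
    """Back out infection probability when the model's endpoint is illness."""
    return np.clip(p_ill / morbidity_ratio, 0.0, 1.0)

r = 0.09                          # hypothetical infectivity parameter
dose = np.array([0.0, 0.5, 5.0])  # illustrative daily doses
p = p_inf_exponential(dose, r)    # zero dose gives zero risk

# Illness-endpoint model output converted with a morbidity ratio of 0.5
p_inf = p_inf_from_illness(p_ill=0.03, morbidity_ratio=0.5)
```

The clip guards against morbidity ratios small enough to push the back-calculated infection probability above 1.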

Risk Characterization

We estimated average annual risk from variable daily exposure conditions, and risk was characterized at two levels (Figure S1). First, risk was estimated at the level of individual well-years (one well observed for one year) because wells were the primary units of observation in the previous occurrence studies and the primary management units for MDH.[21,22] This approach also accommodates multiple years of observation and allows comparison between years 1 and 2. Daily probability of infection (P) by pathogen and well-year was estimated from D using the selected dose–response models (Table S1). Average daily probability of infection (E[P]) was then estimated as the arithmetic mean of P in the variability dimension of 2DMC simulations, and annual probability of infection (P_annual) was estimated as

P_annual = 1 − (1 − E[P])^365

We multiplied annual probabilities of infection by the number of consumers served by each well (N) to estimate the total number of expected infections per pathogen per well-year (Y). We then normalized Y to the total number of person-years at risk per well-year in order to estimate corresponding infection incidence rates (IR) in units of infections person–1 year–1. Values of IR were summed across all pathogens to calculate total infection incidence rates for each well-year. Risk was also characterized at a second level, across PWS type (community or noncommunity) and disinfection status (disinfecting or nondisinfecting) (Figure S1). Values of IR were summed within each combination of PWS type and disinfection status and normalized to the sum of N for that same combination in order to estimate infections person–1 year–1. These results represent population-weighted means and can be used to infer risk for corresponding populations of wells throughout Minnesota. They are reported as median risk estimates with 95% confidence intervals (CIs) with respect to the uncertainty dimension of 2DMC simulations.[23]
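The annualization and population weighting steps can be sketched directly (values illustrative; the study performed these steps inside the 2DMC uncertainty loop):

```python
import numpy as np

def annual_risk(daily_p_mean, days=365):
    """Annual infection probability from the mean daily infection probability."""
    return 1.0 - (1.0 - daily_p_mean) ** days

# One well-year: hypothetical mean daily infection probability
e_p = 2.0e-5
p_annual = annual_risk(e_p)  # roughly 7.3e-3 infections person^-1 yr^-1

# Population-weighted mean annual risk across wells (illustrative values)
p_wells = np.array([1.0e-2, 5.0e-4, 2.0e-1])  # annual risk per well
n_served = np.array([400, 2500, 60])          # consumers served per well
weighted = (p_wells * n_served).sum() / n_served.sum()
```

The weighting illustrates why a small number of high-risk wells serving few people can dominate well-level percentiles while contributing modestly to the population-weighted mean.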

Risk Versus HF183 Bacteroides Concentrations

To investigate the relationship between the human fecal marker HF183 Bacteroides and risk, HF183 concentrations were summarized at the level of well-year by taking their arithmetic mean across all samples within each well-year (n = 6 in most cases). Annual well-year risk estimates were then compared to mean HF183 concentrations in four categories: 0, >0, >0.1, and >1.0 gene copies L–1.
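As a minimal sketch of this summarization (hypothetical well-year data), note that the four categories are nested thresholds, so a well-year with a high mean falls into several of them:

```python
import numpy as np

# Hypothetical HF183 results (gc/L) for three well-years, six samples each
well_years = {
    "well_A_yr1": [0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
    "well_B_yr1": [0.0, 0.3, 0.0, 0.0, 0.05, 0.0],
    "well_C_yr2": [2.1, 0.0, 4.6, 0.9, 0.0, 1.3],
}

def category(mean_conc):
    """Assign the nested concentration categories used for comparison."""
    cats = ["0"]
    if mean_conc > 0.0:
        cats = [">0"]
    if mean_conc > 0.1:
        cats.append(">0.1")
    if mean_conc > 1.0:
        cats.append(">1.0")
    return cats

means = {k: float(np.mean(v)) for k, v in well_years.items()}
cats = {k: category(m) for k, m in means.items()}
```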

Results

Exposure Assessment

Average daily doses calculated for the nine pathogens in each of 174 well-years (9 × 174 = 1566 estimates in total) were generally low. For most pathogens, they were equal to 0 in at least 75% of well-years, and 50% of all well-years were estimated to produce no pathogen exposures at all (Table 1). Furthermore, non-zero daily doses were often much less than 1 infectious unit per person per day (Table 1). Exceptions included Salmonella spp. and Cryptosporidium spp.; both occurred in ≥25% of well-years (Table 1). Similarly, high percentiles of norovirus, Salmonella spp., and C. parvum daily dose were all >1 infectious unit person–1 day–1 (Table 1).
Table 1

Mean and Variance (in Parentheses) of Daily Dose Estimates for Each Pathogen at Selected Percentiles of Well-Year

percentile (n = 174 well-years)

pathogen, dose units | 0.50 | 0.75 | 0.90 | 0.95 | 0.99
adenovirus, TCID50 | 0 (0) | 0 (0) | 7.3 × 10–5 (4.1 × 10–8) | 2.5 × 10–3 (4.6 × 10–5) | 9.9 × 10–2 (5.6 × 10–2)
enterovirus, PFU | 0 (0) | 0 (0) | 0 (0) | 3.5 × 10–6 (3.6 × 10–10) | 4.6 × 10–4 (1.6 × 10–6)
norovirus, gc | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 4.1 × 10^1 (1.1 × 10^4)
Stx2-producing bacteria, CFU | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 4.8 × 10–3 (3.1 × 10–4)
Salmonella spp., CFU | 0 (0) | 3.3 × 10–3 (8.3 × 10–5) | 9.5 × 10–1 (6.7 × 10^0) | 9.0 × 10^0 (4.1 × 10^2) | 3.5 × 10^2 (1.4 × 10^6)
C. hominis, oocysts | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 9.7 × 10–3 (2.6 × 10–3)
C. parvum, oocysts | 0 (0) | 0 (0) | 6.0 × 10–3 (3.0 × 10–4) | 4.7 × 10–1 (1.5 × 10^0) | 2.8 × 10^0 (3.2 × 10^1)
Cryptosporidium spp., oocysts | 0 (0) | 8.0 × 10–4 (4.6 × 10–6) | 2.3 × 10–3 (2.3 × 10–5) | 4.7 × 10–3 (1.2 × 10–4) | 3.2 × 10–2 (1.3 × 10–3)
Giardia, cysts | 0 (0) | 0 (0) | 0 (0) | 0 (0) | 8.6 × 10–1 (2.1 × 10^0)
The magnitude of over-dispersion varied by pathogen and well-year. Over-dispersion with respect to the Poisson distribution was assessed by comparing the variance of daily dose (Var[D]) to its mean (E[D]) from 2DMC simulations. For some pathogens, there was little to no over-dispersion for most or all well-years (i.e., Var[D] ≤ E[D]); this group included enterovirus, Shiga toxin 2-producing bacteria, C. hominis, and Cryptosporidium spp. (Table 1). Likewise, for many well-years, there was little over-dispersion for all but a few pathogens (e.g., norovirus and Giardia) (Table 1). However, in cases where Var[D] ≫ E[D], our use of 2DMC simulations to account for variability in exposure prevents overestimation of risk caused by overestimating the probability of exposure (e.g., for high percentiles of norovirus, Salmonella spp., C. parvum, and Giardia; see Table 1).[47]
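A small numerical example shows why this matters. If daily doses are sporadic (mostly zero with occasional spikes), Var[D] greatly exceeds E[D], and a Poisson assumption based on the mean dose alone would imply far more frequent exposure than actually occurs. Values below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)

# Sporadic exposure: dose is zero on most days, large on a few (illustrative)
n_days = 100_000
dose = np.zeros(n_days)
spikes = rng.choice(n_days, size=n_days // 200, replace=False)  # 0.5% of days
dose[spikes] = 40.0

mean_d = dose.mean()  # 0.2 infectious units/day
var_d = dose.var()    # ~7.96, far greater than the mean: over-dispersion

# Probability of any exposure on a given day:
p_exposed_empirical = (dose > 0).mean()    # 0.005
p_exposed_poisson = 1.0 - np.exp(-mean_d)  # ~0.18 if doses were Poisson
```

Here the Poisson assumption would overstate the daily exposure probability by more than an order of magnitude, which is the overestimation the 2DMC treatment avoids.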

Annual Probability of Infection by Pathogen, Well, and Year

Risk was estimated as 0 infections person–1 year–1 for 50–75% of well-years for most individual pathogens (Table 2), reflecting the low mean daily doses (Table 1). However, annual risk was substantial when all pathogens were considered. Median risk from all pathogens combined exceeded 10–4 infections person–1 year–1 by more than a factor of 100 in year 1 and by a factor of more than 6 in year 2 (Table 2). In fact, some annual risk estimates exceeded 1 infection person–1 year–1 (Table 2), indicating that consumers in some systems may contract infections from multiple pathogens in a given year. High total annual risk estimates can be attributed to the accumulation of daily risk across 365 days of exposure per year and the accumulation of individual pathogen risks across all nine pathogens included in the study.
Table 2

Percentiles of Annual Infection Risk (Infections Person–1 Year–1) by Pathogen Type and Calculated at the Level of Well-Year (n = 145 Wells and 174 Well-Years in Total; 29 Wells Were Sampled in Both Years)

Year 1 (n = 89 well-years)

pathogen | 0.25 | 0.50 | 0.75 | 0.90 | 0.95 | max
adenovirus | 0 | 0 | 3.2 × 10–5 | 3.1 × 10–1 | 6.9 × 10–1 | 1.0 × 10^0
enterovirus | 0 | 0 | 0 | 0 | 3.6 × 10–4 | 1.5 × 10–3
norovirus | 0 | 0 | 0 | 0 | 6.0 × 10–1 | 1.0 × 10^0
Stx2-producing bacteria | 0 | 0 | 0 | 0 | 0 | 0
Salmonella spp. | 0 | 0 | 4.3 × 10–1 | 1.0 × 10^0 | 1.0 × 10^0 | 1.0 × 10^0
C. hominis | 0 | 0 | 0 | 0 | 0 | 9.7 × 10–1
C. parvum | 0 | 0 | 0 | 8.9 × 10–2 | 1.0 × 10^0 | 1.0 × 10^0
Cryptosporidium spp. | 0 | 0 | 0 | 7.5 × 10–2 | 2.2 × 10–1 | 5.1 × 10–1
Giardia | 0 | 0 | 0 | 0 | 2.0 × 10–1 | 1.0 × 10^0
total (IR_i,total) (a) | 0 | 4.2 × 10–2 | 1.0 × 10^0 | 1.1 × 10^0 | 2.0 × 10^0 | 2.6 × 10^0

Year 2 (n = 85 well-years)

pathogen | 0.25 | 0.50 | 0.75 | 0.90 | 0.95 | max
adenovirus | 0 | 0 | 0 | 0 | 0 | 1.0 × 10–2
enterovirus | 0 | 0 | 0 | 0 | 0 | 6.7 × 10–4
norovirus | 0 | 0 | 0 | 0 | 0 | 0
Stx2-producing bacteria | 0 | 0 | 0 | 0 | 0 | 3.7 × 10–2
Salmonella spp. | 0 | 0 | 0 | 1.4 × 10–6 | 7.1 × 10–2 | 7.8 × 10–1
C. hominis | 0 | 0 | 0 | 0 | 0 | 0
C. parvum | 0 | 0 | 0 | 1.3 × 10–1 | 9.1 × 10–1 | 1.0 × 10^0
Cryptosporidium spp. | 0 | 0 | 9.1 × 10–2 | 2.4 × 10–1 | 4.4 × 10–1 | 1.0 × 10^0
Giardia | 0 | 0 | 0 | 0 | 0 | 1.0 × 10^0
total (IR_i,total) (a) | 0 | 6.8 × 10–4 | 1.9 × 10–1 | 9.6 × 10–1 | 1.0 × 10^0 | 1.4 × 10^0

(a) Total risk is summed across all pathogens within well-years. It can exceed the risk for individual pathogens at a given percentile because the individual pathogens that contribute to total risk occur in different wells.

Among pathogens, infection risk was highest for Salmonella spp. in year 1, while Cryptosporidium, which contributed substantially to risk in both years, produced the highest risk estimates in year 2 (Table 2). Infection risk for individual pathogens differed between the two years, reflecting variation in pathogen presence and concentration over time as well as differences in the wells sampled. For example, norovirus contributed to risk in year 1 but not year 2. All nine pathogens included in the QMRA contributed to overall risk. Infection risk estimates were higher for year 1 than year 2 for many individual pathogens, including adenovirus, norovirus, Salmonella, C. hominis, and Giardia, resulting in higher overall risk for year 1 (Table 2). Median infection risk due to all pathogens in year 1 (4.2 × 10–2 infections person–1 year–1) was >60 times higher than in year 2 (6.8 × 10–4 infections person–1 year–1) (Table 2). However, variation in risk among individual wells within years was greater than variation between years. For example, maximum risk due to all pathogens in year 2 (1.4 infections person–1 year–1) was more than 2000 times greater than the median risk (6.8 × 10–4 infections person–1 year–1) (Table 2).

Annual Probability of Infection by PWS Type and Disinfection Status

Summed across all pathogens, risk estimates for all PWS types were approximately 10–1 infections person–1 year–1 (Table 3), 1000 times higher than the threshold of 10–4 infections person–1 year–1, and 59% of individual well-years exceeded that threshold (Table 3). Infection risk for community wells (disinfecting and nondisinfecting combined) was 9.6 × 10–2 infections person–1 year–1 (95% CI: 2.9 × 10–2 to 2.1 × 10–1), and risk for disinfecting and nondisinfecting community wells differed (Table 3). Risk estimates were highest for noncommunity wells, though these serve a small population and so contribute less to the overall number of infections.
Table 3

Infection Incidence Rate (Infections Person–1 Year–1) and Infections per Year Totaled Across all Pathogens by PWS Type and Disinfection Status

PWS type and disinfection | n wells | n well-years | population served | total infections person–1 year–1, median (95% CI) (a) | total infections per year, median (95% CI) (b) | no. well-years > 10–4 infections person–1 year–1
noncommunity (c) | 57 | 77 | 8189 | 3.3 × 10–1 (2.3 × 10–1 to 4.2 × 10–1) | 2702 (1883–3439) | 50 (65%)
community | 88 | 97 | 476,381 | 9.6 × 10–2 (2.9 × 10–2 to 2.1 × 10–1) | 45,733 (13,815–100,040) | 52 (54%)
nondisinfecting community | 50 | 58 | 128,828 | 3.9 × 10–2 (2.3 × 10–2 to 5.4 × 10–2) | 5024 (2963–6957) | 35 (60%)
disinfecting community (d) | 38 | 39 | 347,553 | 1.2 × 10–1 (2.6 × 10–2 to 2.7 × 10–1) | 41,706 (9036–93,839) | 17 (44%)
all | 145 | 174 | 484,570 | 1.0 × 10–1 (3.5 × 10–2 to 2.2 × 10–1) | 48,457 (16,960–106,605) | 102 (59%)

(a) Risk estimates are summarized as medians with 95% CIs with respect to the uncertainty dimension of 2DMC simulations; risk estimates represent weighted means across all well-years in the variability dimension, with weights equal to population served.

(b) Infections per year are calculated as population served × infections person–1 year–1.

(c) Includes 55 nondisinfecting wells (75 well-years) and 2 disinfecting wells (2 well-years); results for disinfecting wells are not shown separately due to low n.

(d) Risk estimates for disinfecting community wells reflect assumed LRVs for viral and bacterial pathogens (see the Exposure Assessment section).

For the 38 disinfecting community wells, disinfection reduced risk estimates by 48%, to 1.2 × 10–1 infections person–1 year–1 (from 2.3 × 10–1 infections person–1 year–1 when log-removal credits were not applied) (Table S4). However, risk remained >10–4 infections person–1 year–1 for 44% of well-years (Table 3) despite high log-removal values for many pathogens (e.g., 4 logs for bacteria and viruses), which can be attributed to the dominance of Cryptosporidium in risk estimates. Cryptosporidium was present in 40% of study wells based on qPCR,[21] is highly infectious,[34] and is resistant to chlorine disinfection.[48] It therefore receives 0 log-removal credits from MDH when assessing waterborne pathogen risk for disinfecting systems.
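The asymmetry between pathogen groups in chlorinating systems follows directly from how LRV enters the dose equation: a 4-log credit divides the viral or bacterial dose by 10,000 before dose–response, while Cryptosporidium's 0-log credit leaves its dose unchanged. A sketch with illustrative doses and a hypothetical single-hit parameter:

```python
import numpy as np

def apply_lrv(dose, lrv):
    """A log-removal credit scales the dose by 10**-LRV before dose-response."""
    return dose * 10.0 ** (-lrv)

def p_inf(dose, r):
    """Single-hit exponential dose-response (illustrative form)."""
    return 1.0 - np.exp(-r * dose)

# Hypothetical raw daily dose and a shared illustrative parameter
raw_dose = 0.05
r = 0.2

# Chlorinating system: 4-log credit for viruses, 0-log for Cryptosporidium
p_virus = p_inf(apply_lrv(raw_dose, 4.0), r)   # dose cut 10,000-fold
p_crypto = p_inf(apply_lrv(raw_dose, 0.0), r)  # dose unchanged by chlorination
```

Even with identical raw doses and infectivity, the chlorine-resistant pathogen dominates the post-treatment risk by orders of magnitude.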

Annual Probability of Infection Versus HF183 Bacteroides Concentrations

The human-specific fecal marker HF183 Bacteroides was present in 92% of study wells (90% of well-years),[22] and median risk was greater in well-years when it was present than when it was absent (Table 4). However, the overall relationship between HF183 presence and risk was inconsistent and highly variable. Increasing average HF183 concentrations above 0 gene copies L–1 did not correspond to monotonic increases in risk at any risk percentile, and risk varied widely from 0 to more than 1 infection person–1 year–1 at each level of HF183 concentration investigated (Table 4). Thus, the absence of HF183 (10% of well-years) was weakly associated with lower risk, but higher concentrations of HF183 had no clear relationship with risk, consistent with the conclusions drawn from Stokdyk et al.’s investigation of sample-level associations between HF183 and pathogen occurrence.[22] Overall, pathogen measurements provided the best indication of health risk, and risk could not be inferred from measurements of HF183 alone.
Table 4

Percentiles of Total Infections Person–1 Year–1 Estimated at the Level of Well-Year (n = 174 Well-Years Total) for Several Categories of Average Annual HF183 Bacteroides Concentration (Based on 6 Samples per Well-Year Typically)

HF183 concentration | n (well-years) | 0.25 | 0.50 | 0.75 | 0.90 | 0.95 | maximum
0 gene copies L–1 | 18 | 0 | 0 | 2.6 × 10–1 | 1.3 × 10^0 | 2.1 × 10^0 | 2.7 × 10^0
>0 gene copies L–1 | 156 | 0 | 4.0 × 10–2 | 7.5 × 10–1 | 1.0 × 10^0 | 1.1 × 10^0 | 2.1 × 10^0
>0.1 gene copies L–1 | 125 | 0 | 4.4 × 10–2 | 6.0 × 10–1 | 1.0 × 10^0 | 1.1 × 10^0 | 2.0 × 10^0
>1.0 gene copies L–1 | 57 | 0 | 1.5 × 10–2 | 2.4 × 10–1 | 9.1 × 10–1 | 1.0 × 10^0 | 1.0 × 10^0

Discussion

Study Summary and Strengths

We used QMRA to estimate infection risks associated with consumption of groundwater from noncommunity, disinfecting community, and nondisinfecting community wells in the state of Minnesota, USA. Risk estimates included nine viral, bacterial, and protozoan pathogens measured in a large site-specific occurrence study (n = 964 samples from 145 wells).[21,22] The sporadic nature of pathogen occurrence was accounted for by repeated sampling of the study wells and by use of 2DMC simulations during risk characterization. Pathogen concentrations from qPCR were converted to estimates of infectious units for compatibility with dose–response models. Dose-harmonization factors for Cryptosporidium and Giardia were derived from IFA measurements of the same samples analyzed via qPCR,[21,22] providing a study-specific dose-harmonization factor for Cryptosporidium, the most prevalent pathogen in our study. Our computational approach accounted for uncertainty in exposure at the level of individual well-years as well as by PWS type and disinfection status.

Risk Estimates in Context

Infection risk was estimated as 1.0 × 10–1 infections person–1 year–1 study-wide, and 59% of individual well-years exceeded the U.S. reference level of 10–4 infections person–1 year–1 (Table 3). Risk estimates varied from 3.9 × 10–2 infections person–1 year–1 for nondisinfecting community wells to 3.3 × 10–1 infections person–1 year–1 for noncommunity wells (Table 3), which corresponds to 2.0 × 10–2 to 1.7 × 10–1 AGI cases person–1 year–1 (assuming that approximately 50% of infections result in symptomatic illness).[46,49] While risk estimates exceeded the 10–4 reference level, our results are consistent with other estimates of risk associated with drinking water consumption in general and with groundwater specifically. Previous estimates of the risk of AGI from groundwater based on QMRA or epidemiology vary from 2.6 × 10–2 to 4.5 × 10–1 AGI cases person–1 year–1 and reflect a variety of study scopes and settings, including broad national-level estimates for public groundwater systems in the U.S.,[4,5] national-level estimates for private wells in Canada,[6] and regional assessments for public groundwater systems and private wells in Wisconsin.[7,9] These estimates are similar to ours, though differences in scope and approach may confound comparisons among studies. Lower risk estimates were reported for private wells in Ireland (2.8 × 10–4 illnesses person–1 year–1) and a neighborhood borehole in Argentina (1.1 × 10–3 infections person–1 year–1),[14,15] but these estimates were based on single index pathogens (verotoxigenic E. coli and Giardia, respectively), whereas our estimates summed risk across nine pathogens. Median risk estimates for individual pathogens in our study were lower than those reported for the studies from Ireland and Argentina and exceeded the U.S. reference level only when summed across pathogens (Table 2).
Our risk estimates are also comparable to AGI risk estimates for public drinking water systems in general, which include systems using groundwater, surface water, or a mixture of the two. AGI risk in these systems has been estimated at 1.5 × 10–2 to 6.0 × 10–2 AGI cases person–1 year–1 for community systems in the U.S. and 7.0 × 10–3 to 4.7 × 10–2 AGI cases person–1 year–1 for municipal systems in Canada.[10−12] Owens et al. reviewed QMRAs of public drinking water systems worldwide and reported that estimates commonly exceeded reference levels; risk estimates for 29 of 64 (45%) scenarios (each based on a single index pathogen) exceeded the 10–4 reference level.[16] Finally, our results are consistent with past estimates of the proportion of AGI attributed to drinking water. In the U.S., AGI from all transmission routes has been estimated to occur at a rate of approximately 6.0 × 10–1 illnesses person–1 year–1.[50] Assuming that the cited value applies within our study area and that our infection risk estimates can be translated to AGI risk as described above, we estimate that transmission of gastrointestinal pathogens via drinking water causes approximately 3 to 28% of all AGI cases in the populations we considered. For comparison, previous studies have estimated this value as 6 to 22% in public groundwater systems in Wisconsin,[9] 2 to 17% for public drinking water systems in the U.S. in general,[11] and 12% in a randomized intervention trial conducted for a disinfecting water system in Sonoma County, California supplied by groundwater under the influence of surface water.[13]
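The attributable-fraction arithmetic in the two paragraphs above is simple to reproduce. The 50% symptomatic fraction and the 0.6 illnesses person–1 year–1 background rate come from the cited sources; the script itself is only an illustration.

```python
# Infection risk -> AGI risk -> share of all AGI attributable to drinking water.
P_SYMPTOMATIC = 0.5   # assumed fraction of infections producing AGI [46,49]
TOTAL_AGI_RATE = 0.6  # all-cause AGI, illnesses person^-1 year^-1 [50]

infection_risks = {  # infections person^-1 year^-1, by PWS type
    "noncommunity": 3.3e-1,
    "nondisinfecting community": 3.9e-2,
    "disinfecting community": 1.2e-1,
}

for pws_type, risk in infection_risks.items():
    agi_risk = risk * P_SYMPTOMATIC           # AGI cases person^-1 year^-1
    attributable = agi_risk / TOTAL_AGI_RATE  # share of all AGI
    print(f"{pws_type}: {agi_risk:.1e} AGI cases/person-year, "
          f"{attributable:.0%} of all AGI")
```

Running this recovers the approximately 3 to 28% range stated in the text.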

Risk is Driven by Sporadic Pathogen Exposures Across Many Wells

Pathogen occurrence in drinking water wells is sporadic,[22,51] suggesting that risk is driven by sporadic exposure across numerous wells rather than continuous exposure in a few wells. The current QMRA demonstrates the relevance of widespread, sporadic contamination to public health. Importantly, the sporadic and widespread nature of waterborne infection risk occurred across pathogens, so broad hazard identification was crucial for characterizing risk. Most wells in each study year had non-zero infection risk (Table ), revealing that groundwater-borne infection risk was common. Though widespread across wells, pathogen exposures were sporadic within individual wells. For most pathogens, average daily doses were zero for 75 to 90% of well-years (Table ). However, when annualized and accumulated across all pathogens, more than 50% of well-years had non-zero risk (Table ). Similarly, even one pathogen detection for a given well-year frequently produced annual estimates in excess of 10–4 infections person–1 year–1; this occurred in 49 of 52 well-years in which a single pathogen was detected just once. The repeated sampling of study wells and inclusion of multiple pathogens captured these sporadic events, and our computational approach (i.e., 2DMC simulation) accounted for the sporadic nature of these exposures. The importance of sporadic pathogen exposures in our study is consistent with growing recognition that extreme exposure events often drive risk.[16] Extreme exposure events could result from acute contamination events, variation in the operation of wells or treatment systems, or other factors.[51,52] Because risk and exposure were widespread and sporadic, risk mitigation strategies informed by both group-level (e.g., PWS type) and well-specific characteristics (e.g., water quality monitoring) may be most effective.
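The outsized effect of a single detection follows from how daily risks are annualized: P_annual = 1 − Π(1 − P_daily). A minimal sketch with an assumed dose-response parameter and a single positive day out of 365:

```python
import math

# Annualization: P_annual = 1 - prod(1 - P_daily) over 365 days of exposure.
K_DR = 0.09            # illustrative exponential dose-response parameter (assumed)
dose_event_day = 0.01  # assumed infectious units ingested on the one positive day

p_event_day = 1.0 - math.exp(-K_DR * dose_event_day)
# Pathogen absent (zero dose, zero risk) on the remaining 364 days.
p_annual = 1.0 - (1.0 - p_event_day) * (1.0 - 0.0) ** 364

print(f"single-day risk {p_event_day:.1e} -> annual risk {p_annual:.1e}")
```

With these assumed values, one contaminated day alone carries the annual estimate above the 10–4 benchmark despite 364 zero-dose days.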

Risk by Public Water System Category and Disinfection Status

Infection risk differed by PWS type, making it a potentially important characteristic for understanding and mitigating risk. Noncommunity wells had higher infection risk than community wells (Table ), indicating that these systems may warrant prioritization for risk mitigation efforts. Alternatively, the total number of infections per year for each PWS type can be considered, estimated as risk (infections person–1 year–1) multiplied by the population served. Disinfecting community wells in our study were estimated to account for the most cases (41,706 infections/year), followed by nondisinfecting community wells (5024 infections/year) and noncommunity wells (2702 infections/year) (Table ). Structuring the QMRA by PWS type therefore allows an assessment of management priorities. Infection risk also differed by the disinfection status of the community systems. Infection risk for disinfecting community wells in the study (1.2 × 10–1 infections person–1 year–1; Table ) was higher than that for nondisinfecting community wells (3.9 × 10–2 infections person–1 year–1; Table ), even when including the effect of disinfection (4-log removal for viral and bacterial pathogens). This difference by disinfection status may reflect higher quality source water for nondisinfecting community wells in this study, as risk estimates for nondisinfecting community wells were also lower than those for the nondisinfected source water of disinfecting community wells (i.e., without disinfection credits applied), which was 2.3 × 10–1 (8.7 × 10–2 to 3.3 × 10–1; Table S4). Disinfection is only required for sources identified as susceptible to fecal contamination under the Groundwater Rule.[20] However, the majority of community groundwater systems in Minnesota provide disinfection as a standard practice for maintaining water quality, and most nondisinfecting groundwater systems in Minnesota serve smaller populations than those that disinfect.
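The total-burden view is simply per-person risk multiplied by population served. In the sketch below the populations are back-calculated from the risks and case totals reported above, so they are approximate illustrations rather than reported study values.

```python
# Total burden: annual infections = per-person risk * population served.
# Populations are back-calculated (cases / risk) and therefore approximate.
pws = {
    # type: (infections person^-1 year^-1, approx. population served)
    "disinfecting community": (1.2e-1, 347_550),
    "nondisinfecting community": (3.9e-2, 128_820),
    "noncommunity": (3.3e-1, 8_188),
}

for pws_type, (risk, population) in pws.items():
    cases = risk * population
    print(f"{pws_type}: ~{cases:,.0f} infections/year")
```

This illustrates why the ranking by total cases can invert the ranking by per-person risk: a lower per-person risk applied to a much larger served population still produces the greatest total burden.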
Compared to the nondisinfected source water (without disinfection credits), disinfection reduced risk by approximately 48% (from 2.3 × 10–1 to 1.2 × 10–1 infections person–1 year–1) for the disinfecting community wells. This unexpectedly modest effect can be attributed to the fact that Cryptosporidium is resistant to disinfection.[48] Cryptosporidium was detected in 40% of wells,[21] and 33% of individual well-years in the study were estimated to have a non-zero Cryptosporidium dose. Consequently, Cryptosporidium was a major contributor to risk across pathogens and dominated the risk in year 2 (Table ). Because disinfecting wells were sampled almost entirely in year 2, temporal factors (e.g., precipitation) or well selection may also affect the comparison to nondisinfecting wells, which were sampled in both years. Nonetheless, the presence of Cryptosporidium and its resistance to disinfection were prominent factors for groundwater-borne infection risk.
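The modest net benefit of disinfection can be expressed as a relative risk reduction computed from the two estimates above; 4-log credits nearly eliminate the viral and bacterial components, but the Cryptosporidium component passes through untouched, capping the overall gain.

```python
# Net effect of disinfection: compare disinfected risk with the
# nondisinfected source-water risk reported above.
risk_source = 2.3e-1       # infections person^-1 year^-1, no disinfection credits
risk_disinfected = 1.2e-1  # with 4-log viral/bacterial credits applied

reduction = 1.0 - risk_disinfected / risk_source
print(f"relative risk reduction from disinfection: {reduction:.0%}")
```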

Risk Mitigation for Minnesota Public Wells

Diverse pathogens contributed to infection risk across a broad sample of PWSs in Minnesota. Identifying specific pathogen hazards is important for mitigating risk, and while every pathogen included in the QMRA contributed to risk (Table ), total risk was dominated by Cryptosporidium. Furthermore, Cryptosporidium requires specific types of treatment to remove or inactivate it, and these are currently only required for surface water systems.[17−20] The current U.S. regulatory structure does not associate significant Cryptosporidium presence with wells unless they are designated as groundwater under the direct influence of surface water ("GWUDI"). However, approximately half of the Cryptosporidium-contaminated wells in our study showed no other signs of surface water influence.[21] Identifying wells prone to contamination remains a challenge without extensive monitoring like that conducted in this study. Nonetheless, various interventions have the potential to substantially affect health risk. These may include actions intended to prevent contamination (managing or removing contaminant sources, improving well construction, or moving to a well with better geologic protection) as well as those intended to inactivate pathogens after contamination has occurred (i.e., treating water). For example, if all disinfecting community wells in the current study were required to install filtration or ultraviolet disinfection and that treatment achieved 2-log removal of Cryptosporidium, the risk for disinfecting community wells would decrease from approximately 1.2 × 10–1 infections person–1 year–1 (Table ) to 1.7 × 10–2 infections person–1 year–1 (Table S4). This 86% decrease in risk could prevent more than 35,000 total infections per year for the sample of wells in our study (out of ∼42,000 total). The costs of such treatment would need to be weighed against the resulting decrease in risk.
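The hypothetical 2-log Cryptosporidium removal scenario reduces to two lines of arithmetic; the risk and case values below are taken from the estimates above, and the calculation is only a restatement of them.

```python
# Hypothetical intervention: add 2-log (99%) Cryptosporidium removal at the
# disinfecting community wells; values are taken from the estimates above.
risk_before = 1.2e-1          # infections person^-1 year^-1
risk_after = 1.7e-2           # with 2-log Cryptosporidium removal (Table S4)
infections_per_year = 41_706  # total annual infections for these wells

reduction = 1.0 - risk_after / risk_before
averted = infections_per_year * reduction
print(f"risk reduction {reduction:.0%}, roughly {averted:,.0f} infections averted/year")
```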

The U.S. 10–4 Risk Threshold: Is it Applicable?

As our study was conducted in the U.S., we focused on an infection risk outcome and compared it to the U.S. benchmark of 10–4 infections person–1 year–1. Other benchmarks are potentially applicable, particularly the European benchmark of 10–6 disability-adjusted life years (DALYs) per person-year,[14] but there are insufficient data for calculating DALYs for all nine pathogens considered in the current study. Furthermore, the 10–6 DALYs person–1 year–1 benchmark is regarded as generally equivalent to the 10–4 infections person–1 year–1 benchmark, such that exceeding one usually implies a high probability of exceeding the other.[16] As described in an EPA panel discussion on U.S. water regulations and the Surface Water Treatment Rule, the criterion of 1 infection per 10,000 person-years was chosen as a reasonable risk goal relative to known giardiasis incidence rates in the U.S. and Canada that occurred as part of detected outbreaks.[17,53] Our study estimated a drinking water risk that exceeds that goal, consistent with several previous studies.[4−7,9−12,16] However, our results represent estimates of endemic infection risk from multiple pathogens, so it is reasonable to ask whether the 10–4 goal is applicable in this context. It is not clear whether that goal is intended to account for multiple waterborne pathogens, endemic or sporadic infections, or varying disease burdens among pathogens. Future work that refines estimates of actual waterborne infection risks in public drinking water systems could improve the application of U.S. and international risk thresholds.

Limitations and Considerations for Interpretation

Our QMRA was limited by several assumptions related to the data inputs required for mathematical modeling. We lacked data on pathogen infectivity in our water samples, and our use of qPCR to enumerate pathogens necessitated dose-harmonization for compatibility between dose estimates and existing dose–response models. We used the best available dose-harmonization factors from the scientific literature, but current options for many pathogens are limited, so future work to develop more site-specific dose-harmonization data could benefit the field of QMRA as a whole. We also note that LRVs were based on MDH credits because wells were sampled prior to drinking water treatment, so the LRVs do not reflect the effects of treatment technologies that lack log-removal credits (e.g., iron removal). Finally, we assumed that pathogens not analyzed or not quantified by Stokdyk et al.[21,22] were absent and that samples without detected pathogens were truly negative. Although the presence of such pathogens would increase the risk, this limitation is mitigated by our use of a large site-specific occurrence study to estimate pathogen dose, which is considered the most consistently important variable for generating accurate predictions in drinking water QMRAs.[16]

Implications

Infection risk was widespread and driven by sporadic contamination by many pathogens, which illustrates the value of site-specific exposure data and the inclusion of many pathogens for understanding risk and prioritizing risk mitigation. The prominence of Cryptosporidium emphasizes its importance as a potential groundwater-borne hazard and illustrates the significance of etiology in risk mitigation. Our results demonstrate that regulations for public water supplies in the United States provide opportunities for mitigating health risk at the level of implementation. Specifically, improving the identification of wells at risk of pathogen contamination and applying treatment across a broader range of groundwater sources could offer substantial public health benefits by reducing the incidence of waterborne disease. Other mitigation strategies, such as eliminating the source of contamination or improving well construction, could also be considered but were not the focus of this study.
