
Resource Savings Associated With Use of an Automated Symptom Monitoring Tool for COVID-19 Public Health Response, Summer 2020-Summer 2021.

Kellen F Sweeney, Heather M Halter, Kerry Krell, Donald McCormick, Janet Brown, Aimee Simons, Christian J Santiago-Rosas, Sylvianette Luna-Anavitate, Miriam V Ramos-Colon, Melissa Marzán-Rodriguez, Carla P Bezold.

Abstract

CONTEXT: Active symptom monitoring is a key component of the public health response to COVID-19, but these activities are resource-intensive. Digital tools can help reduce the burden of staff time required for active symptom monitoring by automating routine outreach activities. PROGRAM: Sara Alert is an open-source, Web-based automated symptom monitoring tool launched in April 2020 to support state, tribal, local, and territorial jurisdictions in their symptom monitoring efforts. IMPLEMENTATION: As of October 2021, a total of 23 public health jurisdictions in the United States had used Sara Alert to perform daily symptom monitoring for more than 6.1 million individuals. This analysis estimates staff time and cost saved in 3 jurisdictions that used Sara Alert as part of their COVID-19 response, across 2 use cases: monitoring of close contacts exposed to COVID-19 (Arkansas; Fairfax County, Virginia), and traveler monitoring (Puerto Rico). EVALUATION: A model-based approach was used to estimate the additional staff resources that would have been required to perform the active symptom monitoring automated by Sara Alert, if monitoring instead relied on traditional methods such as telephone outreach. Arkansas monitored 283 705 individuals over a 10-month study period, generating estimated savings of 61.9 to 100.6 full-time equivalent (FTE) staff, or $2 798 922 to $4 548 249. Fairfax County monitored 63 989 individuals over a 13-month study period, for an estimated savings of 24.8 to 41.4 FTEs, or $2 826 939 to $4 711 566. In Puerto Rico, where Sara Alert was used to monitor 2 631 306 travelers over the 11-month study period, estimated resource savings were 849 to 1698 FTEs, or $26 243 161 to $52 486 322. DISCUSSION: Automated symptom monitoring helped reduce the staff time required for active symptom monitoring activities. 
Jurisdictions reported that this efficiency supported a rapid and comprehensive COVID-19 response even when experiencing challenges with quickly scaling up their public health workforce.
Copyright © 2022 Wolters Kluwer Health, Inc. All rights reserved.


Year:  2022        PMID: 36037463      PMCID: PMC9532362          DOI: 10.1097/PHH.0000000000001552

Source DB:  PubMed          Journal:  J Public Health Manag Pract        ISSN: 1078-4659


Case investigation, contact tracing, and active monitoring of potentially exposed persons are part of the public health response to COVID-19.1 Health departments perform symptom monitoring for persons at risk for COVID-19, including close contacts identified through contact tracing2 or travelers arriving from affected areas. Daily active symptom monitoring can help improve compliance with quarantine and allow public health agencies to promptly identify symptomatic persons for further follow-up.3 Early public health intervention for symptomatic persons allows agencies to provide treatment recommendations and isolation guidance faster, reducing the risk of continued exposures and transmission.4 Active symptom monitoring is resource-intensive for state, tribal, local, and territorial (STLT) public health staff.1,5–7 To address resource constraints, some health departments use technology to automate aspects of symptom monitoring. Sara Alert (The MITRE Corporation, McLean, Virginia) is an open-source, Web-based automated symptom monitoring tool launched in April 2020 to support STLT jurisdictions in their symptom monitoring efforts.8 Sara Alert enables public health officials to enroll individuals diagnosed with or at risk of developing a disease of interest (in this case, COVID-19). Once enrolled, individuals can report their symptoms daily through multiple platforms (eg, e-mail, text message, automated phone). If an individual reports symptoms or does not submit a daily report, the record is flagged by the system so that staff can quickly and efficiently identify individuals requiring follow-up for care coordination or nonresponse. The Sara Alert user interface was designed to be simple and intuitive for public health users to enable rapid uptake during the COVID-19 response. 
As of October 2021, a total of 23 public health jurisdictions in the United States had used Sara Alert to monitor more than 6.1 million individuals, performing daily symptom monitoring of persons at risk for COVID-19. Groups identified for symptom monitoring vary by jurisdiction and include close contacts of persons with COVID-19,9 travelers,10 critical infrastructure employees,11 and persons in isolation with confirmed or probable cases of COVID-19. Public health jurisdictions determine what populations will be included in their symptom monitoring effort and how to incorporate Sara Alert into their case investigation and contact tracing workflow, based on local public health priorities and operational considerations. Previous reports on Sara Alert have documented the use of the tool for symptom monitoring,9 and other studies have estimated the overall cost of active symptom monitoring as part of a disease containment strategy12–15; however, none have addressed the potential resource savings associated with adoption and use of Sara Alert or other automated symptom monitoring tools. The objective of this study was to estimate the cost and staff resource savings among 3 jurisdictions that implemented automated symptom monitoring using Sara Alert as part of their COVID-19 response strategy.

Methods

This analysis uses a model-based approach to quantify the extent to which Sara Alert might improve efficiency and save costs by reducing staff hours required for active symptom monitoring. The analysis estimates savings over an approximately 1-year study period in 3 jurisdictions that used Sara Alert to perform automated symptom monitoring as part of their COVID-19 response. Representatives from each jurisdiction were involved throughout the process, from constructing the logic of the economic model to validating model parameters and final estimates. Each jurisdiction integrated Sara Alert into its COVID-19 response differently. This analysis includes 2 use cases: (1) monitoring of close contacts following confirmed or potential exposure to COVID-19 (Arkansas; Fairfax County, Virginia); (2) monitoring travelers to reduce travel-related transmission, including both visitors to the jurisdiction and residents returning from travel outside the jurisdiction (Puerto Rico). In both use cases, the public health jurisdictions have clearly established actions for responding to a person reporting symptoms, and daily symptom monitoring can help quickly identify which persons need further intervention. Jurisdictions reported that they generally followed Centers for Disease Control and Prevention guidance for monitoring periods and quarantine recommendations.16

Data sources

Data for this analysis were drawn from Sara Alert system data, where possible, to minimize the burden of data collection on jurisdictions actively engaged in responding to COVID-19. Jurisdictions provided records or estimates for model parameters where system data were not available. The data sources are detailed as follows:

Sara Alert purged data include information about all records ever added to Sara Alert. After records are closed (ie, the person is no longer being monitored) and remain inactive for 14 days, the records are purged of identifying information, and only limited data elements are retained for archival purposes. While this source includes only a limited set of data elements, it allows for insights across the entire time Sara Alert has been operational (since April 2020). Where available, data were drawn from this source to reflect the entirety of the study period. However, not all variables required for analysis were captured in this source: these records were not created for evaluative purposes, and the data use agreements governing what data elements could be preserved were determined before the initiation of this evaluation.

Sara Alert production data contain complete records that are either currently active or recently closed and not yet purged as described earlier. Aggregate values were summarized for each jurisdiction for a 2-week period each month during April-September 2021. These periods were a convenience sample, reflecting data already being summarized for operational and performance monitoring, to reduce burden on users and developers still actively engaged in the COVID-19 response. In addition, periods with known data quality issues identified by jurisdiction staff (eg, systematic inaccuracies in data entry) were excluded from the analysis.

Jurisdiction records or estimates were used for model parameters that could not be drawn from Sara Alert production or purged data. These estimates were drawn from a variety of internal sources specific to each jurisdiction. In the case of Fairfax County, the jurisdiction provided all data required for the analysis.

Economic model

This analysis uses an economic model to estimate the additional staff resources that would have been required to perform the same volume of symptom monitoring that was automated by Sara Alert if monitoring were instead done using traditional methods such as telephone outreach by public health staff. The same model was used for all 3 jurisdictions. The model template, along with a detailed explanation of the calculation logic, is provided as Supplemental Digital Content (available at http://links.lww.com/JPHMP/A981) and can also be found on the Sara Alert Web site.8 The model begins with the volume of persons enrolled in automated symptom monitoring for the study period. This figure is multiplied by the average number of expected reporting days left in the monitoring period at the time of enrollment to determine a total volume of eligible reporting days for the study period. Days following a report of symptoms are excluded from the total volume of eligible reporting days, since persons reporting symptoms typically receive follow-up through more traditional manual methods. The model calculates the total number of successfully automated reporting days, defined as days in which the persons being monitored completed their symptom report through the automated system, with no staff intervention required. Because of differences in response rates for different reporting methods, these calculations are broken out by reporting method. The number of successfully automated reporting days is then divided by the average volume of outreach per hour that a public health staffer could perform in the absence of automated symptom monitoring to determine the overall staff hours saved because of automation. This estimate of total staff hours saved is converted to the number of full-time equivalent (FTE) staff for outreach and associated data entry and supervisory staff, respectively. 
Finally, jurisdiction-specific staff cost estimates are used to calculate the cost that would have been incurred to pay for those FTEs. Modeling was performed separately for each of the 3 jurisdictions. The specific parameters used to populate the model are captured in Table 1; additional detail on how values were calculated can be found in Supplemental Digital Content Appendix 1 (available at http://links.lww.com/JPHMP/A982).
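The calculation chain described above can be sketched in a few lines. The script below is an illustration of the model logic, not the published workbook; it uses the Arkansas parameters from Table 1 at the low-savings bound (13 contacts per hour of manual outreach) and reproduces the published low-bound Arkansas results to within roughly 1%, with the residual difference attributable to rounding in the published parameters.

```python
# Sketch of the model's calculation chain, using the Arkansas parameters
# from Table 1 (low-savings bound: 13 contacts per hour of manual outreach).
persons_enrolled = 283_705
weeks_in_period = 43.3
days_exposure_to_enrollment = 5.0
monitoring_days = 14
pct_symptomatic = 0.101
days_to_symptom_report = 2.0

# Reporting-method mix and automated response rates (Arkansas, Table 1).
method_share = {"phone": 0.083, "sms": 0.730, "sms_link": 0.090, "email": 0.098}
response_rate = {"phone": 0.512, "sms": 0.603, "sms_link": 0.530, "email": 0.377}

# Eligible reporting days: days left in the monitoring window at enrollment,
# minus days after a symptom report (those revert to manual follow-up).
days_per_person = monitoring_days - days_exposure_to_enrollment
eligible_days = persons_enrolled * (
    days_per_person - pct_symptomatic * (days_per_person - days_to_symptom_report)
)

# Successfully automated days: weight each method's share by its response rate.
automated_fraction = sum(method_share[m] * response_rate[m] for m in method_share)
automated_days = eligible_days * automated_fraction

# Convert to staff hours, FTEs, and cost.
outreach_per_hour = 13                 # low bound of the 8-13 jurisdiction estimate
hours_saved = automated_days / outreach_per_hour
hours_per_fte = 40 * weeks_in_period   # 40 productive hours per FTE per week
outreach_ftes = hours_saved / hours_per_fte
supervisor_ftes = outreach_ftes / 25   # 25 outreach staff per supervisor
cost = hours_saved * 25.71 + supervisor_ftes * hours_per_fte * 36.00

print(f"{outreach_ftes:.1f} outreach FTEs, {supervisor_ftes:.1f} supervisory FTEs")
print(f"${cost:,.0f} total cost savings")
```

Substituting the high-bound outreach rate (8 contacts per hour) into `outreach_per_hour` yields the upper end of the reported range.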
TABLE 1

Model Parameters

Parameter | Source | How Variable Was Calculated | Arkansas | Fairfax County, Virginia | Puerto Rico

Section 1: Study period and volume of persons monitored
Study period start | Jurisdiction | Jurisdictions identified a period that reflected "typical" use of Sara Alert. | Sep 1, 2020 | Jul 1, 2020 | Aug 1, 2020
Study period end | Jurisdiction | (see above) | Jun 30, 2021 | Jul 31, 2021 | Jun 30, 2021
Total number of persons enrolled in automated monitoring | Sara Alert purged data | Number of records added to Sara Alert within jurisdiction during the study period and purged as of Sep 2021. Referred to as "number of eligible records" in the following sections. | 283 705 | 63 989 | 2 631 306

Section 2: Volume of eligible reporting days
Average number of days between exposure and enrollment in Sara Alert | Sara Alert production data | Calculated as difference between the last date of exposure and the date of enrollment in Sara Alert. | 5.0 | 2.8 | 1.9
Total recommended days of monitoring following exposure | Jurisdiction | Provided by jurisdiction based on the length of monitoring period postexposure. | 14 | 14 | 14
Percentage of persons who report symptoms | Sara Alert purged data | Percentage of eligible records with a symptom-onset date recorded. | 10.1% | 6.7% | 0.3%
Average number of days between enrollment and first symptom report among those reporting symptoms | Sara Alert purged data | Difference between date of enrollment and date of symptom onset for eligible records with a symptom-onset date recorded. | 2.0 | 2.2 | 3.8

Section 3: Estimated time savings due to automated monitoring
Distribution of automated follow-up, by reporting method (Sara Alert production data; calculated as the percentage of persons enrolled in automated monitoring who chose that reporting method to receive their daily symptom report):
  Phone call | 8.3% | 0% | 0.1%
  Plain text message | 73.0% | 0% | 92.3%
  Text message Web link | 9.0% | 85% | 0.2%
  E-mail | 9.8% | 15% | 7.4%
Response rate for automated follow-up, by reporting method (Sara Alert production data; calculated as the percentage of daily symptom reports to which the person responded through automated monitoring, for persons who chose that reporting method):
  Phone call | 51.2% | n/a | 33.1%
  Plain text message | 60.3% | n/a | 53.9%
  Text message Web link | 53.0% | 77% | 50.9%
  E-mail | 37.7% | 66% | 28.5%
Average volume of outreach per hour | Jurisdiction | Average volume of persons that an outreach staff member could process per hour in the absence of automated monitoring. This average includes persons who cannot be reached and those requiring multiple outreach attempts. The estimate includes outreach and associated data entry. | 8 to 13 | 6 to 10 | 6 to 12

Section 4: Overall resource (FTE/cost) savings
Productive hours per staff FTE per week | Jurisdiction | Total hours per FTE specific to the jurisdiction. | 40 | 40 | 37.5
Hourly staff cost: Outreach and associated data entry | Jurisdiction | Total cost incurred by jurisdiction. Includes fringe, benefits, or overhead, where applicable. | $25.71 | $48.85 | $17.00
Number of outreach and associated data entry staff per supervisor | Jurisdiction | Ratio of staff to supervisors for jurisdiction. | 25 | 10 | 10
Hourly staff cost: Supervisory | Jurisdiction | Total cost incurred by jurisdiction. Includes fringe, benefits, or overhead, where applicable. | $36.00 | $64.78 | $20.00

Abbreviations: FTE, full-time equivalent; n/a, not applicable.


Sensitivity analysis

A sensitivity analysis was performed to identify the variables to which the overall model results were most sensitive. The 3 variables identified were as follows: (1) total number of persons enrolled in automated monitoring; (2) average volume of outreach per hour (ie, the cost of the alternative to automated messaging); and (3) hourly staff cost (outreach and associated data entry). Of these variables, the one subject to the most variability and uncertainty was average volume of outreach per hour, as this figure was based on a jurisdiction estimate rather than verifiable data drawn from the study period. There were significant variance and uncertainty in what jurisdiction staff estimated to be a realistic caseload per staff person performing manual outreach. Because of this uncertainty, and because the model results are highly sensitive to this specific variable, the model uses both high and low estimates for this parameter for each jurisdiction and reports results as a range. In addition, sensitivity analysis was performed on the duration of the monitoring period for Puerto Rico. During the study period, Puerto Rico used a monitoring period of 14 days for all travelers regardless of trip duration; this is the monitoring period used in the model to most accurately value the time and cost that would have been required to perform the same volume of outreach as was successfully automated by Sara Alert. However, since approximately 55% of travelers monitored by Puerto Rico were visitors to the island, many of whom likely visited for less than 14 days, this likely overestimates the monitoring period that would have been used with manual methods. We modeled the impact of a reduced average monitoring period, assuming an average trip duration of 5 to 7 days for the 55% of the travelers who were visitors.
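The 32% to 41% range for the Puerto Rico monitoring-period sensitivity can be checked with a simple weighted average over eligible reporting days. The sketch below assumes the reduction in savings is proportional to the reduction in eligible days, an approximation consistent with the model's linear structure.

```python
# Rough check of the Puerto Rico monitoring-period sensitivity analysis.
# Assumption (from the text): ~55% of monitored travelers were visitors,
# with a hypothetical average trip of 5 to 7 days instead of the 14-day window.
days_to_enrollment = 1.9               # Table 1, Puerto Rico
base_window = 14 - days_to_enrollment  # eligible days per person, base model
visitor_share = 0.55

reductions = {}
for trip_days in (5, 7):
    visitor_window = trip_days - days_to_enrollment
    new_avg = (1 - visitor_share) * base_window + visitor_share * visitor_window
    reductions[trip_days] = 1 - new_avg / base_window

for trip_days, r in reductions.items():
    # A 5-day trip assumption cuts eligible days (and hence savings) by ~41%;
    # a 7-day assumption cuts them by ~32%, matching the reported range.
    print(f"{trip_days}-day visitor trips -> savings reduced by {r:.0%}")
```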

Results

Close contact monitoring

Arkansas monitored 283 705 individuals over the study period, generating an estimated resource savings of 61.9 to 100.6 FTE staff, or $2 798 922 to $4 548 249, over a 10-month study period. Fairfax County monitored 63 989 individuals for an estimated savings of 24.8 to 41.4 FTEs, or $2 826 939 to $4 711 566, over a 13-month study period.

Traveler monitoring

In Puerto Rico, where Sara Alert was used to monitor 2 631 306 travelers, the estimated staff resources that would have been required to perform the same volume of monitoring manually was 849 to 1698 FTEs, or $26 243 161 to $52 486 322, over an 11-month study period. Additional sensitivity analysis was performed on the duration of the monitoring period, as noted earlier. Assuming a monitoring period of 5 to 7 days for visitors to the island would decrease the overall savings estimate for Puerto Rico by 32% to 41%, to 503 to 1160 FTEs, or $15 552 765 to $35 856 817. Table 2 presents a summary of model results by jurisdiction. Results are reported as a range to account for uncertainty in typical volume of outreach performed per hour by public health staff, as noted earlier.
TABLE 2

Model Results by Jurisdiction

Metric | Arkansas | Fairfax County, Virginia | Puerto Rico

Study period
  Start | Sep 1, 2020 | Jul 1, 2020 | Aug 1, 2020
  End | Jun 30, 2021 | Jul 31, 2021 | Jun 30, 2021
  Weeks in study period | 43.3 | 56.6 | 47.7

Volume of persons monitored/eligible reporting days
  Total number of persons enrolled in automated monitoring | 283 705 | 63 989 | 2 631 306
  Daily average number of persons added for automated monitoring | 936 | 162 | 7 878
  Number of eligible reporting days included in savings analysis | 2 365 518 | 678 091 | 31 878 220

Estimated time savings due to automated monitoring
  Total number of successfully automated reporting days | 1 340 184 | 510 942 | 16 574 628
  Total staff outreach hours saved due to automation (low to high) | 103 091 to 167 523 | 51 094 to 85 157 | 1 381 219 to 2 762 438

Overall resource (FTE/cost) savings (low to high)
  Outreach and associated data entry staff FTEs saved | 59.5 to 96.8 | 22.6 to 37.6 | 771.9 to 1 543.9
  Supervisory staff FTEs saved | 2.4 to 3.9 | 2.3 to 3.8 | 77.2 to 154.4
  Total FTEs saved | 61.9 to 100.6 | 24.8 to 41.4 | 849.1 to 1 698.3
  Cost savings for outreach and associated data entry staff | $2 650 471 to $4 307 015 | $2 495 951 to $4 159 919 | $23 480 723 to $46 961 446
  Cost savings for supervisory staff | $148 451 to $241 233 | $330 988 to $551 647 | $2 762 438 to $5 524 876
  Total cost savings (staff plus supervisory) | $2 798 922 to $4 548 249 | $2 826 939 to $4 711 566 | $26 243 161 to $52 486 322

Abbreviation: FTE, full-time equivalent.


Discussion

Our modeling analysis indicated substantial savings in each of 3 jurisdictions ($2 798 922-$52 486 322 over the study period) associated with automated symptom monitoring using Sara Alert. Perhaps more importantly, the analysis demonstrated a substantial reduction in the staff time required to execute jurisdictional COVID-19 response strategies (25-41 FTEs in Fairfax County, 62-101 FTEs in Arkansas, and up to 1698 FTEs in Puerto Rico). During this phase of the COVID-19 response, availability of staff with the necessary training was more of a limiting factor than funding. Automated symptom monitoring helped reduce the staff time required for routine activities that could be effectively automated, allowing jurisdictions to scale up their symptom monitoring efforts rapidly and pivot their limited resources to other aspects of their COVID-19 response. Previous publications describe the cost of symptom monitoring of potentially exposed persons, or disease containment strategies more broadly, for recent public health responses to Ebola and measles in specific US jurisdictions.12–14 It is difficult to compare those results with this analysis due to the extreme difference in scale of the monitoring effort (eg, monitoring 20 people in Maricopa County13 or 5379 people in New York City12 following travel to an Ebola-affected area, compared with 63 989 to 2 631 306 people in the examples in this analysis). Furthermore, there are important differences in the public health response required for Ebola or measles relative to COVID-19, such as the need for clinical observations or the widespread availability of vaccination at the time of analysis. This analysis focuses on the resource savings for low-skill activities that can be effectively automated rather than the overall cost of response. Staff time and cost savings varied widely by jurisdiction, with the single most significant driver of the difference being the overall scale of the symptom monitoring effort.
For jurisdictions monitoring close contacts, the volume of persons monitored is driven by population size, COVID-19 incidence, and yields from case investigation and contact tracing efforts. For traveler monitoring, the scale of the symptom monitoring effort is primarily driven by overall volume of travelers (both visitors to the jurisdiction and residents returning from travel outside the jurisdiction). Puerto Rico performed symptom monitoring for 2 631 306 incoming travelers over its 11-month study period, a substantially larger volume of persons than Arkansas's symptom monitoring of close contacts (283 705 persons over 10 months). The number of FTEs that Puerto Rico would have had to hire to perform that outreach manually (849-1698) was correspondingly greater than for Arkansas (62-101) despite having similarly sized populations.17 The other most significant drivers of differences in savings between jurisdictions were (1) the pace at which jurisdictions expected they could perform outreach in the absence of automation; (2) hourly staff cost (ie, the cost of the alternative to automated messaging); and, to a lesser extent, (3) response rates to automated symptom report reminders. Jurisdictions varied significantly in what they estimated to be a realistic volume of hourly manual outreach per staff person, which may be due to true operational or population differences between jurisdictions or simply differences in how this parameter was estimated. The model accounts for this uncertainty by presenting results as a range, as described in the "Sensitivity Analysis" section. The range reported by jurisdictions was generally more efficient than time estimates published elsewhere,7 suggesting that the figures are plausible or even a conservative estimate of time savings.
The hourly staff cost to perform symptom monitoring using a traditional, nonautomated approach varied dramatically across jurisdictions based on local wage indices and whether the jurisdiction primarily leveraged salaried staff or temporary contract resources paid hourly. While this study is not a return-on-investment analysis, jurisdictions weighing the budget trade-offs between automated and manual approaches to symptom monitoring should carefully consider how their jurisdiction-specific staff costs may impact estimated savings from automation. Response rates to automated symptom report reminders varied somewhat by jurisdiction, even after accounting for differences in reporting method, and were generally highest for persons enrolled in monitoring via text message and lowest for monitoring by e-mail. Response rates for persons who chose automated monitoring by phone were also low, but this was a small proportion (<1% in 2 of 3 jurisdictions) of the overall volume of persons monitored. Although this analysis does not assess implementation cost, it is worth noting that costs incurred for text messages are higher than those for e-mail, since text messages are sent over cellular networks and incur carrier fees. Jurisdictions evaluating the budget impact of sending text messages should also consider the potential decrease in response rate from other methods and whether additional staff costs would need to be incurred to perform follow-up with those not responding to automated outreach.

Strengths

This analysis had several strengths. First, the model was developed in close partnership with the 3 jurisdictions participating in the analysis and was refined iteratively to reflect their local needs and implementation, as well as to focus most closely on the variables of most importance to jurisdictions. Second, when estimating parameters, we have taken a conservative estimation approach—while this may underestimate savings somewhat, it also increases confidence in the savings estimated. For example, when estimating the total volume of eligible reporting days, we excluded days following a report of symptoms, although some portion of those persons would continue automated symptom monitoring. Third, given the limitations of the data available for analysis, we further attempted to reduce error in the model by simplifying the calculations as much as possible. While this loses some of the nuance of real-life implementation, this approach reduces opportunities for introducing error into the model where data are uncertain. It also strengthens the value of the model as a communication tool for jurisdictions and other public health organizations interested in understanding and articulating the potential resource savings from automated symptom monitoring.

Limitations

This analysis was also subject to at least 6 limitations in the design, execution, and interpretation of the findings. First, data in this analysis were not collected for evaluative purposes. The model has been adapted to address limitations in the data available, as described in more detail in the “Data Sources” and “Sensitivity Analysis” sections. Second, this analysis treats resource savings as though staff are a true variable cost, able to scale up and down rapidly in response to demand. This can be true with an hourly contract workforce but may be less applicable in situations with a salaried workforce, which represent a semivariable cost. This analysis also does not consider other potential sources of staff resource savings such as volunteers, students, or in-kind support from other institutions. Third, this analysis does not model return on investment. Development, hosting, and messaging costs were not incurred by the jurisdictions that experienced resource savings—Sara Alert was provided for jurisdiction use free of charge during the study period. Fourth, the model does not attempt to capture all sources of efficiency jurisdictions gained because of available Sara Alert features, such as household reporting, automatic closure of records following completion of monitoring, or ease of use relative to previous technology. The model also does not estimate savings for symptom monitoring of persons in isolation, though substantial numbers of persons in these jurisdictions were monitored during their isolation period. Fifth, this analysis does not address whether the jurisdiction would realistically have been able to perform with manual methods the same level of outreach conducted during the study period, nor does it address the value that being able to perform outreach on this scale provided to their communities. 
Finally, the savings estimate for Puerto Rico relied on a monitoring period of 14 days for all travelers regardless of trip duration, based on the practices in place during the study period. Since approximately 55% of travelers monitored by Puerto Rico were visitors to the island, many of whom likely visited for less than 14 days, this likely overestimates the monitoring period that would have been used with traditional manual methods. Assuming an average trip duration of 5 to 7 days for this population of visitors would decrease the overall savings estimate for Puerto Rico by 32% to 41%.

Conclusion

This analysis demonstrated the potential for substantial cost savings associated with implementation of automated symptom monitoring as a part of a jurisdiction's COVID-19 response. Jurisdictions reported that automated symptom monitoring helped reduce staff time required for activities that could be effectively automated, allowing them to focus their limited resources on activities that required a higher level of skill, engagement, or expertise. This efficiency supported a rapid and comprehensive COVID-19 response despite challenges with quickly scaling up their public health workforce and reduced the need to alternately hire and release staff in response to changing case rates across early waves of COVID-19. This study does not explore the comparative effectiveness of automated symptom monitoring relative to traditional outreach methods for key process measures such as participant response rates; understanding relative effectiveness would be another key consideration for determining the value to a jurisdiction of adopting automated tools. Further investigation and modeling are needed to understand how these efficiencies and resource savings might translate to other scenarios, such as response during subsequent COVID-19 waves; smaller-scale domestic infectious disease outbreaks (eg, measles); or monitoring of travelers following international exposure (eg, travelers returning from countries with Ebola outbreaks). Additional analysis should also assess the value that symptom monitoring at the scale performed during the COVID-19 response provides to jurisdictions and the communities they serve in terms of reduced disease transmission, health outcomes, and public confidence in public health institutions. Finally, while this study focuses on a specific retrospective analysis, the general approach to modeling resource savings can be adapted for other purposes beneficial to public health practitioners. 
Retrospective studies can help demonstrate the value of specific investments to key decision makers such as legislators or municipal government. Modeling can also be performed prospectively, estimating the relative cost of different operational approaches to inform decision making. While this type of modeling may be less rigorous than statistical approaches used for research, the simplicity allows for transparency and effective communication of findings. The template used for this analysis is provided as Supplemental Digital Content (available at http://links.lww.com/JPHMP/A981) for public health practitioners to further explore and leverage these methods. Automated tools have the potential to reduce the time required for active symptom monitoring and allow public health jurisdiction staff to reallocate limited resources to other activities. Automated symptom monitoring can help support a rapid and comprehensive disease response at a large scale, even when experiencing challenges with quickly scaling up the public health workforce. Public health jurisdictions should consider their local response priorities and staffing capacity when determining whether to incorporate automated symptom monitoring into their activities. Public health jurisdictions can leverage economic analysis approaches to model differences in cost and potential resource savings of different operational scenarios. This can support prospective decision making and retrospective assessment and communication about the value of investments.
References

1.  The cost of containing one case of measles: the economic impact on the public health infrastructure--Iowa, 2004.

Authors:  Gustavo H Dayan; Ismael R Ortega-Sánchez; Charles W LeBaron; M Patricia Quinlisk
Journal:  Pediatrics       Date:  2005-07

2.  Cost Analysis of 3 Concurrent Public Health Response Events: Financial Impact of Measles Outbreak, Super Bowl Surveillance, and Ebola Surveillance in Maricopa County.

Authors:  J Mac McCullough; Nicole Fowle; Tammy Sylvester; Melissa Kretschmer; Aurimar Ayala; Saskia Popescu; Jolie Weiss; Bob England
Journal:  J Public Health Manag Pract       Date:  2019 Jul/Aug

3.  COVID-19 Case Investigation and Contact Tracing: Early Lessons Learned and Future Opportunities.

Authors:  Elizabeth Ruebush; Michael R Fraser; Amelia Poulin; Meredith Allen; J T Lane; James S Blumenstock
Journal:  J Public Health Manag Pract       Date:  2021 Jan/Feb

4.  Impact of delays on effectiveness of contact tracing strategies for COVID-19: a modelling study.

Authors:  Mirjam E Kretzschmar; Ganna Rozhnova; Martin C J Bootsma; Michiel van Boven; Janneke H H M van de Wijgert; Marc J M Bonten
Journal:  Lancet Public Health       Date:  2020-07-16

5.  Quantifying the Risk and Cost of Active Monitoring for Infectious Diseases.

Authors:  Nicholas G Reich; Justin Lessler; Jay K Varma; Neil M Vora
Journal:  Sci Rep       Date:  2018-01-18

6.  Characteristics and Outcomes of Contacts of COVID-19 Patients Monitored Using an Automated Symptom Monitoring Tool - Maine, May-June 2020.

Authors:  Anna Krueger; Jayleen K L Gunn; Joanna Watson; Andrew E Smith; Rebecca Lincoln; Sara L Huston; Emilio Dirlikov; Sara Robinson
Journal:  MMWR Morb Mortal Wkly Rep       Date:  2020-08-07

7.  COVID-19 Case Investigation and Contact Tracing Efforts from Health Departments - United States, June 25-July 24, 2020.

Authors:  Kimberly D Spencer; Christina L Chung; Alison Stargel; Alvin Shultz; Phoebe G Thorpe; Marion W Carter; Melanie M Taylor; Mary McFarlane; Dale Rose; Margaret A Honein; Henry Walke
Journal:  MMWR Morb Mortal Wkly Rep       Date:  2021-01-22
