
Using control charts to understand community variation in COVID-19.

Moira Inkelas1,2, Cheríe Blair3, Daisuke Furukawa3, Vladimir G Manuel2,4, Jason H Malenfant3, Emily Martin5, Iheanacho Emeruwa2,6, Tony Kuo2,4,7,8, Lisa Arangua8, Brenda Robles8, Lloyd P Provost9.   

Abstract

Decision-makers need signals for action as the coronavirus disease 2019 (COVID-19) pandemic progresses. Our aim was to demonstrate a novel use of statistical process control to provide timely and interpretable displays of COVID-19 data that inform local mitigation and containment strategies. Healthcare and other industries use statistical process control to study variation and disaggregate data for purposes of understanding behavior of processes and systems and intervening on them. We developed control charts at the county and city/neighborhood level within one state (California) to illustrate their potential value for decision-makers. We found that COVID-19 rates vary by region and subregion, with periods of exponential and non-exponential growth and decline. Such disaggregation provides granularity that decision-makers can use to respond to the pandemic. The annotated time series presentation connects events and policies with observed data that may help mobilize and direct the actions of residents and other stakeholders. Policy-makers and communities require access to relevant, accurate data to respond to the evolving COVID-19 pandemic. Control charts could prove valuable given their potential ease of use and interpretability in real-time decision-making and for communication about the pandemic at a meaningful level for communities.

Year:  2021        PMID: 33930013      PMCID: PMC8087083          DOI: 10.1371/journal.pone.0248500

Source DB:  PubMed          Journal:  PLoS One        ISSN: 1932-6203            Impact factor:   3.752


Introduction

Coronavirus disease 2019 (COVID-19) mitigation and containment policies have significant economic, social, and health impacts. Enacting sensible public policies during the COVID-19 pandemic requires real-time data that public leaders can easily interpret and act on. The constituencies for these data are expanding as regional and community stakeholders, including cities, businesses, and school districts, assume decision-making roles in the emergency response. Moreover, because public health interventions require public cooperation and trust in data-driven guidance and decisions, there is also a need to engage the public at large to successfully implement mitigation and containment strategies. Offering data to inform policy and individual health behavior is a cornerstone of prevention and public health practice [1].

While providing relevant, accessible, and timely data is a core public health function, current displays of COVID-19 data lack features that policy-makers require for decision-making and that communities need to connect their actions with the state of the pandemic. Such displays often use maps, day-to-day percentage changes, and cumulative counts that refresh daily. These formats obscure variation across places, populations, and time, which is essential to learning how actions and events affect COVID-19 cases and deaths. Over-aggregation impairs the ability of decision-makers to make real-time policy adjustments and to assess the impact of those adjustments, and the public is unable to see the local data that are most relevant and motivating to them regarding health behaviors [2].

Statistical process control methods and theory focus on ease of use and interpretation for end users [3-7] and on learning under conditions of uncertainty [8, 9]. Many commercial, healthcare, and education organizations use control charts to understand the behavior of processes or systems over time [8-11]. By distinguishing random (“common cause”) variation from non-random (“special cause”) variation, control charts reduce over-reaction to noise in data while enabling timely response when true signals show that conditions are improving or deteriorating [11]. They enable scientists, policy-makers, and community members alike to learn whether a change to a policy or process has affected an outcome of interest [11-13]. Control charts accommodate the common types of data distributions, including classification (binomial) data with P charts, continuous data with X-bar and individuals (I) charts, and count (Poisson) data with C or U charts [12]. Despite their potential value, control charts are not part of standard public health practice [11-16]. This article illustrates how control charts can be used to achieve public health goals in the COVID-19 pandemic. We offer prototype control charts and displays to demonstrate their utility.

Methods

Statistical process control

Control charts display data in an ordered format, most often ordered over time, to understand, manage, and improve the behavior of a specific process or system. The control chart includes a centerline (i.e., the mean of the data) and upper- and lower-control limits, which are three sigma above and below the centerline. When the measure is stable over time, the centerline and limits provide a rational prediction of future observations [12]. Values outside of the control limits indicate that the outcome is not being produced from one consistent homogeneous process [3, 4]. When there is a signal of change, the centerline and control limits shift to reflect the new level of performance [12]. This study uses a hybrid control chart for count data and exponential growth or decline (I chart) developed by Perla et al. for use in a pandemic [17-19].
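As a concrete sketch of this construction (not taken from the study; the daily counts below are invented), the centerline and three-sigma limits of a C-chart for Poisson count data can be computed as:

```python
import statistics

def c_chart(counts):
    """Centerline and 3-sigma limits for a C-chart (Poisson counts).

    For Poisson count data, sigma is estimated as sqrt(centerline), so
    the limits are CL +/- 3*sqrt(CL); the lower limit is floored at 0.
    """
    cl = statistics.mean(counts)
    sigma = cl ** 0.5
    ucl = cl + 3 * sigma
    lcl = max(0.0, cl - 3 * sigma)
    return cl, ucl, lcl

# Hypothetical stable run of daily reported cases (at least 8 points,
# per the minimum cited for an effective C-chart).
daily_cases = [12, 9, 15, 11, 8, 14, 10, 13]
cl, ucl, lcl = c_chart(daily_cases)
```

Because the Poisson variance equals its mean, no separate dispersion estimate is needed; the square root of the centerline serves as sigma.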

Data sources on COVID-19 cases

This study analyzed data for selected regions of California. The county-level control charts use daily counts of COVID-19 cases from the Los Angeles Times COVID-19 repository, which provides a public datafile of cases reported by California counties [20]. The original data source is the Confidential Morbidity Report (CMR16) of laboratory-confirmed COVID-19 that counties report to the California Department of Health Care Services. The Los Angeles County Department of Public Health (LAC DPH) reports data for 272 distinct cities and neighborhoods.

Control chart analysis

The control charts display daily reported COVID-19 cases. Charts for counties begin on March 2, 2020, and charts within LA County begin on March 16 for consistency. This study uses the hybrid control chart method developed by Perla et al. [17] to view epochs and phases in the pandemic. The four possible epochs are pre-exponential growth (C-chart), exponential growth (an individuals (I) chart fitted to log10 of the data series and transformed back to the original scale), post-exponential growth (a flat trajectory or exponential decline, represented by an I chart), and stability after descent (C-chart) [17, 21, 22]. A region may experience one or multiple epochs. A phase is a time period represented by a distinct control chart; there can be multiple phases within an epoch.

In the first and fourth epochs, the method requires at least eight observations to estimate the centerline and upper and lower control limits, the minimum for an effective C-chart [12]. The control charts automatically set the limits of the exponential growth period based on regression analysis of the first 20 observations [17]. The exponential growth phase is modeled by the log-linear regression I-chart [12]. We used model code in R developed by Perla et al. [21, 22] to transform the counts using the log10 function and calculated the intercept and slope through regression analysis of the log10 data; the regression line becomes the centerline (CL). Limits for the exponential phase are calculated from the median moving range (MRbar) of the residuals, with the upper limit (UL) and lower limit (LL) set at CL+3.14*MRbar and CL-3.14*MRbar, respectively. The CL, UL, and LL were then transformed back to the original count scale. Charts that do not display an exponential growth phase are in C-chart format for the full period studied.
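A simplified sketch of this exponential-phase calculation follows, assuming invented case counts and ordinary least squares for the log10 regression (the authors used R code from Perla et al., which may differ in detail):

```python
import math
import statistics

def exponential_phase_limits(counts):
    """Log10-linear I-chart limits for an exponential-growth phase.

    Regress log10(count) on day, take the regression line as the
    centerline, set limits at CL +/- 3.14 * median moving range of the
    residuals, then transform everything back to the count scale.
    """
    days = list(range(len(counts)))
    logs = [math.log10(c) for c in counts]
    # Ordinary least squares for log10(count) ~ day.
    mean_x = statistics.mean(days)
    mean_y = statistics.mean(logs)
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(days, logs))
             / sum((x - mean_x) ** 2 for x in days))
    intercept = mean_y - slope * mean_x
    cl_log = [intercept + slope * x for x in days]
    residuals = [y - c for y, c in zip(logs, cl_log)]
    # Median moving range of successive residuals.
    mrbar = statistics.median(abs(residuals[i] - residuals[i - 1])
                              for i in range(1, len(residuals)))
    cl = [10 ** v for v in cl_log]
    ucl = [10 ** (v + 3.14 * mrbar) for v in cl_log]
    lcl = [10 ** (v - 3.14 * mrbar) for v in cl_log]
    return cl, ucl, lcl

# Invented counts roughly doubling every three days.
cases = [10, 13, 16, 20, 26, 31, 41, 50, 65, 80]
cl, ucl, lcl = exponential_phase_limits(cases)
```

On the original scale the limits become multiplicative bands around the fitted exponential curve, which is why a single out-of-limit day late in a surge can still be a large absolute count.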
Formal use of control charts identifies special cause through established statistical rules combined with inspection by experts in the system being studied. Standard criteria for special cause are an observation outside of an upper or lower control limit or a shift of 8 successive observations above or below the centerline [11, 12]. For the exponential Epochs 2 and 3, we used Shewhart criteria modified by Perla et al. [17, 22], which require two points rather than one above the control limits to signal the start of a new phase. The rationale is that COVID-19 data display more than “usual” variation in the form of single large values that reflect “data dumps” from reporting entities; requiring a stronger signal prevents such a data artifact from triggering a new phase [22].

Notably, time series charts often reveal reporting artifacts. An example is peak values early in the week due to cases accumulating over a weekend. It is common practice in control charts to remove special cause variation due to such cyclic behavior by separating data lines within a chart or by subgrouping by a larger unit (e.g., week rather than day) to smooth variation. In this study, we preserved daily periodicity based on a statistical principle underlying control chart methodology: data should not be summarized if doing so would mislead the user into taking actions that would not be taken had the original data been preserved [6]. Smoothing the data in these COVID-19 control charts would temper but not remove the apparent case reporting artifact, and seeing these patterns offers insights, namely that facility case reporting has an impact.
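These detection rules can be sketched as follows; the function, its parameters, and the data are hypothetical illustrations of the criteria described above:

```python
def special_cause_signals(values, cl, ucl, lcl, points_beyond=1, shift_len=8):
    """Flag special-cause variation with two Shewhart-style rules:

    (1) `points_beyond` consecutive observations outside a control limit
        (1 for the classic rule; 2 for the Perla et al. modification
        used in the exponential epochs), and
    (2) a run of `shift_len` consecutive observations on one side of
        the centerline.

    Returns the index at which each rule first fires, or None.
    """
    beyond = None
    outside_run = 0
    for i, v in enumerate(values):
        outside_run = outside_run + 1 if (v > ucl or v < lcl) else 0
        if outside_run >= points_beyond:
            beyond = i
            break
    shift = None
    side_run, side = 0, 0
    for i, v in enumerate(values):
        s = 1 if v > cl else (-1 if v < cl else 0)
        side_run = side_run + 1 if (s != 0 and s == side) else (1 if s != 0 else 0)
        side = s
        if side_run >= shift_len:
            shift = i
            break
    return beyond, shift

# Hypothetical series: one spike at index 4, then a sustained shift.
data = [10, 11, 9, 10, 30, 10, 14, 15, 16, 14, 15, 16, 14, 15]
beyond, shift = special_cause_signals(data, cl=11, ucl=20, lcl=2)
```

With `points_beyond=2`, the same single spike at index 4 no longer fires the first rule, which is the intent of the modified criterion: an isolated reporting "data dump" does not open a new phase.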

Description of California counties and LA County subregions included in the analysis

California is home to more than 12 percent of the U.S. population and has a complex geopolitical landscape with 58 distinct counties. LA County is a vast region with over 10 million residents, 88 cities (including the City of Los Angeles), 272 designated neighborhoods, and three public health departments, of which the largest is the LAC DPH.

Selection of counties and LA County subregions for inclusion in the analysis

The study team selected an illustrative set of charts using criteria relevant to the COVID-19 pandemic. We included five counties from the second, third, and fourth quartiles for population and from northern, central, and southern regions of California. We selected one neighborhood and four cities within LA County. Within LA County, we sought variation in sociodemographic factors that we considered especially relevant to the COVID-19 pandemic: median income, overall health, median age, race/ethnicity, population density (people per square mile), median household size, and percentage of households that experience household crowding. Crowding is a U.S. Census-derived measure defined as the percentage of households with a ratio of total household members to rooms (excluding bathrooms) greater than one.
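As a small illustration of the crowding definition (the household data below are hypothetical):

```python
def percent_crowded(households):
    """Percentage of households whose ratio of total members to rooms
    (bathrooms excluded from the room count) exceeds one, following
    the Census-derived definition described above."""
    crowded = sum(1 for members, rooms in households if members / rooms > 1)
    return 100.0 * crowded / len(households)

# Hypothetical (household members, non-bathroom rooms) pairs.
sample = [(4, 5), (5, 3), (2, 4), (6, 4)]
rate = percent_crowded(sample)  # 2 of 4 households exceed a 1.0 ratio
```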

Data sources

Table 1 shows sociodemographics of selected areas. Measures of population, race/ethnicity, median income, household size, and population density come from the United States Census Bureau QuickFacts (2019) [23]. Median age comes from the 2018 American Community Survey (ACS) published by Towncharts [24]. Household crowding comes from the California Healthy Places Index (Public Health Alliance of Southern California), based on an ACS five-year average for 2011–2015 [25]. Quartiles of health come from an overall ranking developed by County Health Rankings & Roadmaps (University of Wisconsin) that combines multiple health outcomes, including premature death, poor or fair health, poor physical health days, poor mental health days, and low birthweight [26]. For the neighborhood within LA County, race/ethnicity comes from L.A. Mapping (Los Angeles Times) [27]. Table 1 also shows the proportion of COVID-19 cases from congregate living facilities in the first four months of the pandemic (March through June 2020); these data are available for some but not all counties through their public health department websites. For LA County overall and for cities and neighborhoods within the county, the congregate counts include cases from residential health and living facilities including skilled nursing facilities (SNFs), shelters, and correctional facilities. The congregate measure for Santa Clara County includes only residential long-term care facilities [28].
The LA County DPH website provides COVID-19 test volume rates per 100,000 population by area, based on electronic lab reporting; the number per 100,000 in January 2021 showed modest variation for the areas in this study: 32,569 for Santa Monica, 19,279 for Lancaster, 26,980 for Bell, and 26,730 for Westlake.

Results

Figs 1 and 2 show control charts for five counties. Four counties experienced exponential growth in COVID-19 cases in early March 2020, which was followed by a period of lower exponential growth in three and non-exponential growth in one. Each county experienced at least one period of exponential growth; one county (Santa Clara) experienced no exponential growth until a November surge that was observed in all counties. Imperial County showed a cyclic weekly pattern associated with the lack of case reporting on weekends; this analysis retained all days in the chart for comparability with other counties.
Fig 1

Control charts of COVID-19 cases: California, counties, subregions.

Shows daily case counts, midline, and upper and lower control limits. Source for county data is the New York Times. Source for Los Angeles cities/neighborhoods is the Department of Public Health COVID-19 dashboard (accessed 1/10/2020).

Fig 2

Annotated control charts of COVID-19 cases: City of Lynwood and Los Angeles County.

There was considerable variation in COVID-19 cases over time among the studied subregions in LA County. Two (Lancaster, Fig 1 and Lynwood, Fig 2) experienced initial exponential growth followed by a second phase of exponential growth with a lower midline before transitioning to a non-exponential epoch. Another (Westlake) experienced initial exponential growth and then entered a non-exponential epoch with multiple phases through the study period. This neighborhood had the highest residential density and household overcrowding relative to others as well as large household size (median of 3.0) (Table 1). Other areas with relatively high crowding and household size experienced multiple phases but no exponential rise until late 2020. The city with the highest median income and lowest rate of overcrowding (Santa Monica) showed low case counts throughout the study period with a doubled rate in the last month of the study. For the first four months of the study period, nearly half (46%) of cases in this city came from congregate facilities, while the rate ranged from 15% to 28% for other areas.
Table 1

Characteristics of selected California counties and neighborhood/cities within Los Angeles County.

Region | Population | Health quartile | Median age | Race/ethnicity | Median income | Population density | Median household size | % crowded households | % congregate cases as of 6/30/20
Counties
Los Angeles | 10,039,107 (1, 4th) | 3rd | 36 | 49% Latino, 9% Black, 15% Asian | $64,251 | 2,420 | 3.0 | 14 | 15
San Diego | 3,338,330 (2, 4th) | 4th | 36 | 34% Latino, 6% Black, 13% Asian | $74,855 | 736 | 2.9 | 7 | --a
Santa Clara | 1,922,852 (6, 4th) | 4th | 37 | 25% Latino, 3% Black, 38% Asian | $116,178 | 1,381 | 3.0 | 8 | 13b
Solano | 447,643 (20, 3rd) | 3rd | 38 | 27% Latino, 15% Black, 16% Asian | $77,609 | 503 | 2.9 | 5 | --a
Imperial | 181,215 (30, 2nd) | 1st | 32 | 85% Latino, 3% Black, 2% Asian | $45,834 | 42 | 3.9 | 10 | --a
Cities/Neighborhood
Lancaster | 157,601 | 1st | 32 | 40% Latino, 22% Black, 4% Asian | $52,504 | 1,661 | 3.2 | 4 | 26
Westlake | 103,839 | 1st | 27 | 73% Latino, 4% Black, 16% Asian | $26,757 | 38,214 | 3.0 | 45 | 25
Santa Monica | 84,084 | 4th | 38 | 16% Latino, 4% Black, 10% Asian | $93,865 | 10,664 | 2.0 | 2 | 46
Lynwood | 71,022 | 1st | 30 | 88% Latino, 9% Black, 1% Asian | $49,684 | 14,416 | 4.4 | 33 | 28
Bell | 36,667 | 1st | 24 | 92% Latino, 2% Black, 1% Asian | $42,548 | 14,185 | 4.0 | 27 | 15

Data sources: Demographics from the United States Census Bureau QuickFacts, County Health Rankings (University of Wisconsin Population Health Institute), L.A. Mapping (Los Angeles Times). Congregate cases from the Los Angeles County Department of Public Health COVID-19 Dashboard, California county COVID-19 websites. Accessed June 30, 2020.

aIndicates that data are not publicly available.

bIncludes only congregate health and living facilities (not correctional facilities).

Several subregions show one or two non-sequential daily rates that exceed the upper control limit; these may be due to reporting patterns from laboratories or the public health department. Fig 2 shows charts for the City of Lynwood and LA County with annotated events such as public health authority COVID-19 orders, introduction of free testing centers, and holidays. The Lynwood chart is annotated with additional policies that are specific to this city, such as mandated use of facial coverings in public in the first week of April, about six weeks earlier than LA County. The charts show different time trends. The exponential epoch ended in Lynwood in June; a new phase of exponential rate in LA County began at the end of that month. Neither chart shows special cause in the several weeks after major holidays, including Mother’s Day, the Fourth of July, and Labor Day.

Discussion

During the COVID-19 pandemic, decision-makers need signals from data to intensify or relax safety requirements. Control charts are a tool for daily learning and action from which public health could benefit [29-31]. This study shows how a novel hybrid control chart could help authorities act early by signaling exponential and non-exponential growth or decline in public health measures. Control charts are commonly used in management and can be used in public health to distinguish random from meaningful variation. Use of this method could reduce overreaction to expected random variation and encourage immediate action when special cause variation signals a new phase or epoch of the pandemic. Based on this study, a division within LAC DPH incorporated control charts into its approach for identifying COVID-19 outbreaks.

This study shows the value of disaggregating data as the large-scale closures that characterized the initial emergency response to COVID-19 evolved into metric-focused decision-making. California adopted six indicators for relaxing the initial safer-at-home order [32] and introduced a tiered system based on local data in October 2020 [33] as local authorities adopted additional policies. Disaggregation by region and time helps state, county, and within-county authorities act on the data. For example, local authorities could use the significant within-county variation to communicate to the public what is happening in specific neighborhoods and potentially to tailor mitigation strategies to subregions. Further, the study shows that a city could be mischaracterized as experiencing an outbreak of COVID-19 cases when case counts for the general public and congregate living populations are combined. This was especially relevant early in the pandemic, when skilled nursing facilities accounted for a significant proportion of COVID-19 cases. Local jurisdictions may need this differentiation to target strategies, for example, if most cases are in congregate facilities. Disaggregated data could also enable a school district to consider whether patterns in neighborhoods that feed into specific schools warrant augmented mitigation strategies.

Control charts such as those in this study can be easily interpreted once their format is understood. They can help localities check assumptions about time trends and assess the impact of planned changes (policies) and external influences (such as holidays). Their visual simplicity is designed for analysis, discussion, and decision-making. The disaggregated, time-ordered format can empower people in a specific city or neighborhood to identify potential causes of what they observe in the data. These COVID-19 charts may provide the same benefit to local communities during the pandemic that control charts offer to industries that use them for improving clinical care or other processes. Interpretation of these charts is aided by annotation of key events. Especially when done in real time, annotation makes it possible to observe whether specific policies or events are followed by signals of a new phase or epoch. This study shows that assumptions about increased COVID-19 cases following holidays such as Mother’s Day, the Fourth of July, and Labor Day were not borne out in the county or neighborhood data. Users of these control charts can employ the Bradford Hill causality criteria to aid interpretation [34]. Additionally, complementing control charts with a small number of meaningful measures, such as residential density and household size, helps decision-makers and the public interpret observed variation by area and over time. Local health agencies could work with cities to identify appropriate measures that aid interpretation, such as housing features and rates of mask-wearing. Other analytical methods, such as modeling associations of demographics or other subregional features with case counts, can complement control charts.

Lastly, the time-ordered format of a control chart contrasts with how public health data are often analyzed and displayed. Reporting patterns by public health departments (e.g., batch reporting by facilities, weekday versus weekend reporting, inclusion of congregate living residents and homeless individuals in cumulative case reporting, availability of illness dates versus reporting dates) could shape how stakeholders interpret the data. At minimum, control charts can make these limitations transparent and prompt decision-makers to ask for more interpretable displays. Engaging decision-makers in such discussions may help to justify public health investment in data reporting and analytics so that such questions can be answered from the outset in future outbreaks.

Notably, this study is limited in the interpretation of the displayed control charts. Statistical process control was intended for use in real time by people with intimate knowledge of the system, process, or community that the underlying data represent. Proper interpretation of data comes from engaged discussion. This study offers some possible interpretations of the observed special cause variation, but the purpose was to illustrate the approach rather than to draw causal conclusions about policy events and COVID-19 cases. Rather than interpreting the data for action, the study intended to examine whether areas exhibited variation and to illustrate the potential value of providing disaggregated data in control chart format to people who can make immediate use of them. Finally, COVID-19 case counts underestimate total COVID-19 cases, and there may be noise in the counts resulting from limited staffing for data entry, delayed reporting of cases out of jurisdiction, and time-varying testing constraints and laboratory turnaround time, among other factors, which may vary by county and over time.

As with all significant changes to data reporting and analytics, adopting control chart methods will require time and resource investment. LAC DPH’s interest in using control charts to help inform outbreak management followed a series of virtual workshops on the method that a university partner offered to public health personnel. Health departments will need to create these displays and prepare their workforce to use them effectively, ideally as real-time learning tools. For optimal impact, public health personnel will need to assist community stakeholders to interpret and use the charts to meet their specific needs. Progress in this area for health authorities and their partnered community stakeholders has been made for topics such as infant mortality through learning collaboratives of state and local health departments [14] and in training programs for local health authorities [31].

Conclusions

The COVID-19 pandemic has placed unprecedented data and analytic demands on public health. Clear, interpretable, disaggregated displays are essential tools for policy makers as they consider the health, economic, social, mental health, and educational burden of COVID-19 in their communities. Showing the data underlying public policy decisions could increase uptake of guidance regarding personal and collective behavior in communities. This may have been a missed opportunity as the COVID-19 pandemic progressed from an initial, centralized emergency response to a period of distributed decision-making on the part of employers, cities, school districts, and others. Healthcare organizations use control charts for learning and to encourage data-driven collective action [15, 35, 36]. Public health and population-oriented systems could also make use of this method. This statistical method provides an easy, effective, and inexpensive way for public health departments to meet some of the informational needs of governors, city councils, school boards, and the general public during an emergency response, when timeliness of and insight from data are of utmost importance. Use of control charts in public health will require an appreciation of their value and investment of time and resources in their creation and use.
The views expressed are rather arguments usually put forward by scholars, proposing control charts, funnel plots and similar displays to identify special cause variation. I think (and this is supported by experiments) that real decision-makers need at least initial support to fully understand control charts. Support becomes even more important when decision-making is distributed across local authorities (with more limited access to “experts”). Without initial and continued support (training seminars/webinars, on-line help, etc.) it would probably be difficult to develop this into “a community engagement tool”. It would also take time - the Covid pandemic requires local action here and now! In summary, I think the article should take the need for training and support involving local decision-makers more seriously. Continued support could also include sharing of local experiences and local knowledge. This could support engaged discussions and improved decisions further. Reviewer #2: The authors developed a unique system of C and I-type control charts in order to retrospectively model the first 4 months of the COVID-19 pandemic within California counties and cities/neighborhoods. These charts detected both periods of exponential growth and periods of stable rates for the counties and cities/neighborhoods analyzed. The authors showed how annotated control charts could provide near real-time data to decision makers and the general population, as well as how these charts might provide hypotheses to explain special cause variation. The manuscript is interesting and well written. As the authors point out, use of control charts to monitor healthcare processes has become more common, but use within the public health arena remains surprisingly uncommon to date. I do have several comments and suggestions, primarily targeting the methods, that if addressed might improve the readability and impact of this manuscript: 1) Methods. 
In the first paragraph of the methods, the authors describe the characteristics of the control charts that they used in this analysis. However, these specific characteristics do not apply to all control charts relevant to this type of analysis, and I think this introductory paragraph to control charts should emphasize the wide variation in chart characteristics. Perhaps this paragraph would also be better suited for the introduction rather than the methods? For example, some control charts estimate the centerline from the mean of past data, but other control charts use an expected external baseline rate or other calculations to estimate the baseline. Furthermore, “freezing” a static baseline is one strategy, but other control charts use a dynamic rolling baseline that updates over time. Finally, defining control limits at +/- 3 standard deviations is a common convention but is not required. 2) Methods – control chart analysis. Exponential outbreak growth was quite common in many locations at the onset of the COVID-19 pandemic. The C-charts in this study estimated a centerline from the first 20 observations. However, as expected for a respiratory virus outbreak, exponential growth often occurred within the first 20 days/data points. Does use of a centerline estimated from the first 20 days of data decrease the ability of these charts to demonstrate timely detection of exponential growth within the first 20 days? Does the 20-day baseline need to be separately analyzed for exponential growth before being used as an “in control” baseline for a process assumed not yet to demonstrate exponential growth? I.e., would the baseline be more effective if it were proven to be “in control” before being used to monitor for special cause variation? 3) Methods/Figure 1. a) Charts c/d/e/f were deemed not to exhibit exponential growth, but many data points ultimately exceeded the apparent upper control limits for these charts. 
How were these data indicating special cause variation and possible transition to an exponential growth phase determined not to represent exponential growth? b) Chart g is labeled as both a hybrid chart and a C-chart – could the authors please clarify this unique feature of the Lancaster control chart? c) The Figure 1 legend states that county C-charts used a centerline estimated from the midpoint of all observations, but the methods state that C-chart centerlines were estimated from the first 20 data points. Could the authors please reconcile this discrepancy? Also, does “all observations” refer to all observations up to the current observation or all observations for the entire study? I am asking this question because if observations from May were used to calculate the centerline for April, this method of using a global baseline would not be useful for prospective surveillance. d) Could the authors please label Figure 1? Perhaps a single label for all panels would work well, but the “daily counts,” centerline, UCL, and LCL should be labeled, as in Figure 2. 4) Methods – control chart analysis. The authors defined the end of the exponential phase as when a data point is below the lower limit – when this happens, should an I-Chart be converted back to a C-chart to monitor for re-entry into an exponential phase? If not, it seems that ongoing use of the exponential centerline and control limits is not very useful. 5) Methods – control chart analysis. My comment here is a generalization of comment #4 above. At the bottom of page 6, the authors state that the centerline and control limits remained constant over time, except when an initial transition to epidemic growth was noted. I think this strategy is a major limitation of this analysis that should at least be discussed in more detail or, ideally, addressed by showing how centerlines and control limits could be adjusted over time as proof of concept for using these charts for prospective surveillance. 
For example, in Figure 1h, the majority of datapoints from mid-May onwards are above the apparent UCL, and every data point over this broad timeframe is above the centerline. Clearly, a mechanism is needed to update the centerline and control limits over time for COVID-19 monitoring in order for this chart type to remain useful outside of monitoring for exponential growth during the early stage of the outbreak. 6) Results – page 14, just prior to conclusions. The authors state that “the purpose was to draw conclusions about causal events,” but I believe this represents a typo and was intended to state that “the purpose was NOT to draw conclusions…” 7) Overall comment – did the authors explore other chart types and chart characteristics for this analysis? How would other charts perform in certain scenarios for monitoring COVID-19 trends? For example, did the authors experiment with different centerline definitions, out of control data point definitions, control limits, etc.? Per my comment above, would other charts provide more useful monitoring after exponential growth has been confirmed or for locations that continue to see variation in case rates without exponential growth? ********** 6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy. Reviewer #1: No Reviewer #2: No [NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.] 
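For readers following the methodological exchange above, the basic C-chart arithmetic at issue (baseline centerline, 3-sigma limits, and the two-consecutive-points special cause rule the authors adopt from Perla et al.) can be sketched as follows. This is a minimal illustration under the standard Poisson assumption; the function names and example numbers are ours, not code from the manuscript:

```python
import math

def c_chart_limits(baseline_counts):
    """Center line and conventional 3-sigma limits for a C-chart.

    Under the Poisson assumption the variance of a count equals its
    mean, so the limits sit at c-bar +/- 3*sqrt(c-bar); the lower
    limit is floored at zero because counts cannot be negative.
    """
    c_bar = sum(baseline_counts) / len(baseline_counts)
    three_sigma = 3 * math.sqrt(c_bar)
    return c_bar, c_bar + three_sigma, max(0.0, c_bar - three_sigma)

def special_cause_signals(counts, ucl):
    """Indices where two sequential points exceed the upper limit,
    mirroring the two-consecutive-signals rule described in the
    review exchange (a single excursion may be random noise)."""
    return [i for i in range(1, len(counts))
            if counts[i] > ucl and counts[i - 1] > ucl]

# Example: a flat 8-point baseline, then a run-up in daily cases.
baseline = [2, 3, 1, 4, 2, 3, 2, 3]
center, ucl, lcl = c_chart_limits(baseline)   # center = 2.5, lcl = 0.0
flagged = special_cause_signals(baseline + [8, 9, 12], ucl)
```

In the hybrid chart the manuscript moved to, a sustained signal like this would trigger a switch from the C-chart epoch to an I-chart of log-transformed counts to track exponential growth; that transition logic is not shown here.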
While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step. 10 Feb 2021 Thank you for the helpful reviews. We agree with and have addressed each comment in the revised narrative. Responses to the reviews are provided below. Reviewer #1: Review PONE-D-20-24209 I would say that the first strong argument being made in the article is the use of disaggregate data, in particular as more localized decisions are needed to handle existing variations in the pandemic. That data should be presented in the form of control charts, to handle randomness and identify significant changes, is the second argument. I miss explicit references to disaggregated data in the abstract (and perhaps in title). I also miss references to randomness to explain the value of control charts in the abstract (especially when disaggregate data is used, i.e. fewer observations). Response: Regarding disaggregation the abstract now refers to use of disaggregated data. The Discussion and Conclusion also describe the importance of disaggregation. Regarding randomness, the paper now includes several mentions of how control charts distinguish noise (randomness) from signal. This includes the third paragraph of the Introduction, the second paragraph in Statistical Process Control within Methods, and the first and fourth paragraph of the Discussion. Also in the introduction, the focus is on “displays of Covid-19 data”. 
The problem with over-aggregation is mentioned (impairing the ability of decision-makers) but I think this could be presented as a (first) separate issue. Response: We agree with this helpful comment. We edited the abstract so that the first sentence of Methods refers to the use of control charts for studying variation and disaggregating data. We address over-aggregation in the second paragraph of the Introduction. We refer to several common features of publicly available COVID-19 data in this paragraph, including over-aggregation. On page 4 it is argued that control charts “are less burdensome” and on page 12 you say that “the analysis show that control charts are easily interpretable” and “have a visual simplicity”. First of all you have not analyzed if control charts are easily interpretable, at least not from a (real) decision-maker point of view. There is no data based on decision-makers views. The views expressed are rather arguments usually put forward by scholars, proposing control charts, funnel plots and similar displays to identify special cause variation. Response: We appreciate this comment and agree that it is helpful to temper some of these statements in the narrative. We have reduced the discussion of control chart value. Assertions about their interpretability and visual simplicity cite existing literature. We deleted the sentence that stated that control charts can be less burdensome than some other commonly used analytic methods. We agree that this point was not justified in the narrative and believe that it would distract from the paper to weigh different methods alongside control charts. We edited one sentence so that it states that control charts can be more interpretable. 
Additionally, the narrative now states that the Los Angeles County Department of Public Health is interested in and beginning to use control charts to assist outbreak management, which is evidence that the control charts are interpretable and add value to the other analytical methods that this local public health authority uses. Several co-authors hold positions within this public health department, so we believe this provides some evidence for our assertions about the potential value of these charts. I think (and this is supported by experiments) that real decision-makers need at least initial support to fully understand control charts. Support becomes even more important when decision-making is distributed across local authorities (with more limited access to “experts”). Without initial and continued support (training seminars/webinars, on-line help, etc.) it would probably be difficult to develop this into “a community engagement tool”. It would also take time - the Covid pandemic requires local action here and now! In summary, I think the article should take the need for training and support involving local decision-makers more seriously. Continued support could also include sharing of local experiences and local knowledge. This could support engaged discussions and improved decisions further. Response: We appreciate this comment and concur that public health authorities will need support and resources to adopt control chart methods. We do believe that this is possible with a reasonable amount of support, based on our experience introducing this method into the Los Angeles County Department of Public Health and also based on the experience of several co-authors in teaching these methods to state and county public health authorities. This work is cited in the references (Finnerty et al. 2019 and Davis et al. 2016). 
In response to this helpful and critical point from the reviewer, we have added a final paragraph to the Discussion, as follows: “As with all significant changes to data reporting and analytics, adopting SPC methods will require time and resource investment. LAC DPH’s interest in using control charts to help inform outbreak management followed a series of virtual workshops on the method that a university partner offered to public health personnel. Health departments will need to create these displays and prepare their workforce to use them effectively, ideally as real-time learning tools. For optimal impact, public health personnel will need to assist community stakeholders to interpret and use the charts to meet their specific needs.” Additionally, the final sentence of the paper states that “Use of control charts in public health will require an appreciation of their value and investing time and resources into their creation and use. Progress in this area for health authorities and their partnered community stakeholders has been made for topics such as infant mortality through learning collaboratives of state and local health departments [14] and in training programs for local health authorities [30].” Reviewer #2: 1) Methods. In the first paragraph of the methods, the authors describe the characteristics of the control charts that they used in this analysis. However, these specific characteristics do not apply to all control charts relevant to this type of analysis, and I think this introductory paragraph to control charts should emphasize the wide variation in chart characteristics. Perhaps this paragraph would also be better suited for the introduction rather than the methods? For example, some control charts estimate the centerline from the mean of past data, but other control charts use an expected external baseline rate or other calculations to estimate the baseline. 
Furthermore, “freezing” a static baseline is one strategy, but other control charts use a dynamic rolling baseline that updates over time. Finally, defining control limits at +/- 3 standard deviations is a common convention but is not required. Response: We appreciate this comment and now include a paragraph about control charts in the introduction. We also added this referenced sentence to that paragraph in the Introduction to describe common types of control charts: “Control charts can be used for multiple common types of data distributions including classification (binomial) P charts, continuous normal distribution (X charts) and individuals (I charts), and count (Poisson) C charts [12].” Additionally, we made a significant change in the control chart method in this paper. We use the new hybrid control chart developed for the pandemic by Perla et al. that was published in 2020. We describe the specific ways that midlines are calculated and that special cause is identified for this hybrid chart. We believe that this is responsive to the reviewer’s appropriate comment that there are some variations in how initial baseline rates are constructed; rather than describe these variations, we believe it is easiest for readers if we describe the method used in this paper and provide references to papers and texts (e.g., Provost and Murray, The Healthcare Data Guide) that describe these variations in detail. 2) Methods – control chart analysis. Exponential outbreak growth was quite common in many locations at the onset of the COVID-19 pandemic. The C-charts in this study estimated a centerline from the first 20 observations. However, as expected for a respiratory virus outbreak, exponential growth often occurred within the first 20 days/data points. Does use of a centerline estimated from the first 20 days of data decrease the ability of these charts to demonstrate timely detection of exponential growth within the first 20 days? 
Does the 20-day baseline need to be separately analyzed for exponential growth before being used as an “in control” baseline for a process assumed not yet to demonstrate exponential growth? I.e., would the baseline be more effective if it were proven to be “in control” before being used to monitor for special cause variation? Response: We appreciate this comment, and as noted in the response to Comment #1 of this reviewer, the paper now describes the published methodology of Perla et al. to create hybrid C and I charts for this study. We briefly describe the rationale for this method and also provide references to the details of this new hybrid chart. The method that we are using is consistent with what the reviewer recommends; that is, the method does establish a centerline of the data that is “in control” while monitoring is underway to identify special cause variation. As noted in the method description, there is a minimum of 8 points to establish an initial baseline for a non-exponential epoch and a minimum of 21 points to establish an initial baseline for an exponential epoch. We believe that this is responsive to the reviewer. 3) Methods/Figure 1. a) Charts c/d/e/f were deemed not to exhibit exponential growth, but many data points ultimately exceeded the apparent upper control limits for these charts. How were these data indicating special cause variation and possible transition to an exponential growth phase determined not to represent exponential growth? Response: We appreciate that these charts were difficult to interpret. We have replaced all of the charts with the hybrid control chart that Perla et al. developed in 2020. The concern raised by the reviewer is no longer relevant as these charts follow the rule for special cause that Perla et al. established, which is two sequential signs of special cause (versus the standard single sign of special cause). The rationale for this rule is provided in the Methods, and we provide references for the rule. 
We also note in the Methods that typically a control chart would be monitored for signals of special cause in real time by a person or team that is very familiar with the system of focus. In this case, we are generating charts and using special cause rules that follow from those established by Perla et al. for this hybrid chart. b) Chart g is labeled as both a hybrid chart and a C-chart – could the authors please clarify this unique feature of the Lancaster control chart? Response: We replaced all of the charts in Figure 1 with the hybrid chart to address this comment and update the paper. Figure 1 and Figure 2 include the hybrid chart for each area that is displayed. c) The Figure 1 legend states that county C-charts used a centerline estimated from the midpoint of all observations, but the methods state that C-chart centerlines were estimated from the first 20 data points. Could the authors please reconcile this discrepancy? Also, does “all observations” refer to all observations up to the current observation or all observations for the entire study? I am asking this question because if observations from May were used to calculate the centerline for April, this method of using a global baseline would not be useful for prospective surveillance. Response: The method that we are using for the hybrid control chart reconciles the discrepancy that the reviewer identified. All of the charts use the same rules for establishing the initial midline. We heavily edited the Methods section to clarify how the hybrid charts are created, and the confusing language that the reviewer comments on (regarding “all observations”) no longer applies as we have deleted that description. d) Could the authors please label Figure 1? Perhaps a single label for all panels would work well, but the “daily counts,” centerline, UCL, and LCL should be labeled, as in Figure 2. Response: We appreciate this comment and have added a legend to Figure 1. 
Both Figure 1 and Figure 2 have legends that list the daily cases, midline, UCL, and LCL. 4) Methods – control chart analysis. The authors defined the end of the exponential phase as when a data point is below the lower limit – when this happens, should an I-Chart be converted back to a C-chart to monitor for re-entry into an exponential phase? If not, it seems that ongoing use of the exponential centerline and control limits is not very useful. Response: We have rewritten the Methods to explain how special cause is handled in the hybrid control chart. The paper now states that there are four possible epochs (pre-exponential growth represented by a C-chart, exponential growth (an individuals (I) chart), post-exponential growth (a flat trajectory or exponential decline that is represented by an I chart), and stability after descent (C-chart)). Additionally, there are phases within an epoch so that the midline can change but the form of the control chart (C or I) continues. We believe that this method and explanation addresses the reviewer’s comment, and additionally, we have deleted the original control charts that showed the problematic pattern that the reviewer is referring to. 5) Methods – control chart analysis. My comment here is a generalization of comment #4 above. At the bottom of page 6, the authors state that the centerline and control limits remained constant over time, except when an initial transition to epidemic growth was noted. I think this strategy is a major limitation of this analysis that should at least be discussed in more detail or, ideally, addressed by showing how centerlines and control limits could be adjusted over time as proof of concept for using these charts for prospective surveillance. 
Clearly, a mechanism is needed to update the centerline and control limits over time for COVID-19 monitoring in order for this chart type to remain useful outside of monitoring for exponential growth during the early stage of the outbreak. Response: We appreciate these insightful comments and agree that the method was not fully described. As noted in responses to the other methodological questions by this reviewer, the revised paper uses the hybrid chart published by Perla et al. and has specific rules for transitioning between epochs and between phases within epochs. That addresses the reviewer’s question about how centerlines and control limits are adjusted over time. We also cite the methodology paper that describes the approach in detail. 6) Results – page 14, just prior to conclusions. The authors state that “the purpose was to draw conclusions about causal events,” but I believe this represents a typo and was intended to state that “the purpose was NOT to draw conclusions…” Response: The reviewer is correct that this was a typographical error. We have corrected the sentence accordingly. 7) Overall comment – did the authors explore other chart types and chart characteristics for this analysis? How would other charts perform in certain scenarios for monitoring COVID-19 trends? For example, did the authors experiment with different centerline definitions, out of control data point definitions, control limits, etc.? Per my comment above, would other charts provide more useful monitoring after exponential growth has been confirmed or for locations that continue to see variation in case rates without exponential growth? Response: We appreciate this thoughtful question. 
As noted in our response to Comment 1 from Reviewer 1, we have added a listing of commonly used control charts, and the Methods section states that this paper uses the two types of control charts that are appropriate for the data distribution underlying counts of COVID-19 cases (C charts or I charts). We believe that the new hybrid control chart methodology addresses the helpful points that the reviewer is making, in its use of C and I control charts within the same display. The Methods section now explains that the data may signal a change of phases within a C or I chart or may signal a change in epoch, which means a shift from a C to an I chart or a shift from an I to a C chart. Additionally, we provide a rationale for the rule of identifying a signal of a phase or epoch shift when there have been 2 sequential observations that indicate special cause. To avoid adding information that readers might consider to be extraneous, we do not go into detail about how other kinds of control charts could be used for understanding COVID-19 data, such as charts that are appropriate for rare events or that are used for binomial distributions. Submitted filename: Response to Reviewers.docx Click here for additional data file. 22 Feb 2021 PONE-D-20-24209R1 Using Control Charts to Understand Community Variation in COVID-19 PLOS ONE Dear Dr. Inkelas, Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process. I participated as a reviewer for the initial evaluation of this manuscript and believe that you have made important improvements to the manuscript in responding to reviewer feedback. 
If you can make a few additional minor changes to improve readability, I anticipate that your manuscript will fully meet PLOS ONE's publication criteria. The following three areas require revision or further clarification: 1. Results, page 12. The results state that household crowding at Westlake occurred in 45% of households, but the revised Table 1 lists 45% for Westlake household crowding. The results also describe Westlake as having a large household size with median size of 4.0; however, Table 1 gives median household size of 3.0 for Westlake, which is the second smallest household size among the five cities/neighborhoods studied. Could you please clarify this apparent discrepancy and ensure that all table/manuscript statistics are accurate? 2. Results, page 12. I recommend removing the last paragraph that mentions publicly available stratification of counts. This comment is useful but is better suited for the discussion section. This comment could be folded into the related discussion of general public vs. congregate living populations on page 14. 3. Discussion/Conclusions sections. I appreciate the revised and improved discussion and conclusions sections. Additional copyediting of these sections by the authors would improve readability and impact. While these changes would be stylistic and overall minor, I think editing by the authors at this stage is important because the journal does not provide copyediting. Here are some examples of the type of copyediting that may improve readability of these sections: a) Page 13, line 3. Could consider: “...tool for daily learning and action that could benefit public health.” b) Page 13, last sentence of paragraph 1. Could consider: “Use of control charts could reduce overreaction to expected random variation and encourage immediate action when…” c) Page 14, second paragraph. Could consider: “Data alone do not tell the full story.” d) Page 17. 
Could consider: "Use of control charts in public health will require an appreciation of their value and investment of time and resources into their creation and use."

Please submit your revised manuscript by Apr 08 2021 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

Please include the following items when submitting your revised manuscript:

- A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.
- A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.
- An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: http://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols

We look forward to receiving your revised manuscript.

Kind regards,
Arthur Wakefield Baker
Academic Editor
PLOS ONE

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site.
Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user; registration is free. Then log in and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.

24 Feb 2021

Thank you for the helpful review. We addressed each comment in the revised narrative. Our responses to the reviews are provided below.

1. Results, page 12. The results state that household crowding at Westlake occurred in 45% of households, but the revised Table 1 lists 45% for Westlake household crowding. The results also describe Westlake as having a large household size, with a median size of 4.0; however, Table 1 gives a median household size of 3.0 for Westlake, which is the second smallest household size among the five cities/neighborhoods studied. Could you please clarify this apparent discrepancy and ensure that all table/manuscript statistics are accurate?

Response: We corrected a typographical error in the narrative that resulted in a discrepancy between the narrative and Table 1. We deleted a sentence that contained some redundant information. We reviewed the data in the narrative and in Table 1, and all other data are correct.

2. Results, page 12. I recommend removing the last paragraph, which mentions publicly available stratification of counts. This comment is useful but is better suited for the Discussion section, where it could be folded into the related discussion of general public vs.
congregate living populations on page 14.

Response: As suggested, we deleted this last paragraph from the Results and added this point to the Discussion (page 15) as follows: "Local jurisdictions may need this differentiation to target strategies – for example, if most cases are in congregate facilities."

3. Discussion/Conclusions sections. I appreciate the revised and improved Discussion and Conclusions sections. Additional copyediting of these sections by the authors would improve readability and impact. While these changes would be stylistic and overall minor, I think editing by the authors at this stage is important because the journal does not provide copyediting.

Response: We appreciate the suggestion of additional copyediting. We combined two paragraphs in the Discussion and made additional line edits to improve the clarity of this section. The changes are tracked in the revised narrative. Additionally, we addressed the specific copyediting suggestions (please see below).

a) Page 13, line 3. Could consider: "...tool for daily learning and action that could benefit public health."
Response: We made this edit in the narrative.

b) Page 13, last sentence of paragraph 1. Could consider: "Use of control charts could reduce overreaction to expected random variation and encourage immediate action when..."
Response: We made this change.

c) Page 14, second paragraph. Could consider: "Data alone do not tell the full story."
Response: We deleted this sentence and instead edited the next sentence to read: "Chart annotation makes it possible to..."

d) Page 17. Could consider: "Use of control charts in public health will require an appreciation of their value and investment of time and resources into their creation and use."
Response: We made this edit in the revised narrative.

Submitted filename: Response to Reviewers v2.docx
1 Mar 2021

PONE-D-20-24209R2
Using Control Charts to Understand Community Variation in COVID-19

Dear Dr. Inkelas,

We're pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements. Within one week, you'll receive an e-mail detailing the required amendments. When these have been addressed, you'll receive a formal acceptance letter and your manuscript will be scheduled for publication. An invoice for payment will follow shortly after the formal acceptance.

To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double-check that your user information is up to date. If you have any billing-related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they'll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,
Arthur Wakefield Baker
Guest Editor
PLOS ONE

19 Apr 2021

PONE-D-20-24209R2
Using Control Charts to Understand Community Variation in COVID-19

Dear Dr. Inkelas:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department. If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours.
Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org. If we can help with anything else, please email us at plosone@plos.org. Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,
PLOS ONE Editorial Office Staff
on behalf of Dr. Arthur Wakefield Baker
Guest Editor
PLOS ONE
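The correspondence above refers to the paper's hybrid C/I chart methodology and its rule of signaling a phase or epoch shift after two sequential special-cause observations. As a minimal illustrative sketch (not the authors' actual code or data), the standard Shewhart C chart limits and that two-in-a-row rule could be combined as follows; the baseline window and case counts here are invented for illustration:

```python
# Sketch of a C chart with the two-sequential-special-cause rule described
# in the review correspondence. All data are illustrative, not from the paper.
import math

def c_chart_limits(baseline_counts):
    """Center line and 3-sigma limits for a C chart from a baseline period."""
    c_bar = sum(baseline_counts) / len(baseline_counts)
    sigma = math.sqrt(c_bar)           # Poisson assumption: variance equals mean
    ucl = c_bar + 3 * sigma
    lcl = max(0.0, c_bar - 3 * sigma)  # counts cannot be negative
    return c_bar, lcl, ucl

def phase_shift_signal(counts, lcl, ucl):
    """Index of the first of two sequential points outside the limits, else None."""
    outside = [c < lcl or c > ucl for c in counts]
    for i in range(len(outside) - 1):
        if outside[i] and outside[i + 1]:
            return i
    return None

baseline = [12, 9, 14, 11, 10, 13, 12]   # illustrative daily case counts
c_bar, lcl, ucl = c_chart_limits(baseline)
new_counts = [13, 15, 24, 27, 31]        # apparent growth after the baseline
signal_at = phase_shift_signal(new_counts, lcl, ucl)  # first of 2 sequential special causes
```

In the paper's hybrid display, such a signal would prompt either new limits within the same chart type (a phase change) or a switch between C and I charts (an epoch change), depending on whether the new data still fit a Poisson count model.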
References  (12 in total)

Review 1.  Statistical process control as a tool for research and healthcare improvement.

Authors:  J C Benneyan; R C Lloyd; P E Plsek
Journal:  Qual Saf Health Care       Date:  2003-12

2.  The environment and disease: association or causation?

Authors:  A B Hill
Journal:  Proc R Soc Med       Date:  1965-05

3.  Quality improvement in population health systems.

Authors:  Moira Inkelas; Marianne E McPherson
Journal:  Healthc (Amst)       Date:  2015-06-30

4.  The science of improvement.

Authors:  Donald M Berwick
Journal:  JAMA       Date:  2008-03-12       Impact factor: 56.272

5.  Using Infant Mortality Data to Improve Maternal and Child Health Programs: An Application of Statistical Process Control Techniques for Rare Events.

Authors:  Patricia Finnerty; Lloyd Provost; Emily O'Donnell; Sabrina Selk; Kaerin Stephens; Jamie Kim; Scott Berns
Journal:  Matern Child Health J       Date:  2019-06

6.  Statistical quality control methods in infection control and hospital epidemiology, Part II: Chart use, statistical properties, and research issues.

Authors:  J C Benneyan
Journal:  Infect Control Hosp Epidemiol       Date:  1998-04       Impact factor: 3.254

Review 7.  Statistical quality control methods in infection control and hospital epidemiology, part I: Introduction and basic theory.

Authors:  J C Benneyan
Journal:  Infect Control Hosp Epidemiol       Date:  1998-03       Impact factor: 3.254

8.  Era 3 for Medicine and Health Care.

Authors:  Donald M Berwick
Journal:  JAMA       Date:  2016-04-05       Impact factor: 56.272

9.  The meaning of variation to healthcare managers, clinical and health-services researchers, and individual patients.

Authors:  Duncan Neuhauser; Lloyd Provost; Bo Bergman
Journal:  BMJ Qual Saf       Date:  2011-04       Impact factor: 7.035

10.  Understanding variation in reported covid-19 deaths with a novel Shewhart chart application.

Authors:  Rocco J Perla; Shannon M Provost; Gareth J Parry; Kevin Little; Lloyd P Provost
Journal:  Int J Qual Health Care       Date:  2021-03-05       Impact factor: 2.038

Cited by  (2 in total)

1.  Building local decision-making competencies during COVID-19: Accelerating the transition from learning healthcare systems to learning health communities.

Authors:  Rohit Ramaswamy; Varun Ramaswamy; Margaret Holly; Sophia Bartels; Paul Barach
Journal:  Learn Health Syst       Date:  2022-09-20

2.  New extended distribution-free homogenously weighted monitoring schemes for monitoring abrupt shifts in the location parameter.

Authors:  Tokelo Irene Letshedi; Jean-Claude Malela-Majika; Sandile Charles Shongwe
Journal:  PLoS One       Date:  2022-01-21       Impact factor: 3.240

