Andres G Lescano, Ria Purwita Larasati, Endang R Sedyaningsih, Khanthong Bounlu, Roger V Araujo-Castillo, Cesar V Munayco-Escate, Giselle Soto, C Cecilia Mundaca, David L Blazes.
Abstract
The performance of disease surveillance systems is evaluated and monitored using a diverse set of statistical analyses throughout each stage of surveillance implementation. An overview of their main elements is presented, with specific emphasis on syndromic surveillance directed at outbreak detection in resource-limited settings. Statistical analyses are proposed for three implementation stages: planning, early implementation, and consolidation. Data sources and collection procedures are described for each analysis.

During the planning and pilot stages, we propose estimating the average data collection, data entry, and data distribution times. This information can be collected by the surveillance systems themselves or through specially designed surveys. During the initial implementation stage, epidemiologists should study the completeness and timeliness of reporting and thoroughly describe the population surveyed and the epidemiology of the health events recorded. Additional data collection processes or external data streams are often necessary to assess reporting completeness and other indicators. Once data collection processes are operating in a timely and stable manner, analyses of surveillance data should expand to establish baseline rates and detect aberrations. External investigations can be used to evaluate whether an abnormally increased case frequency corresponds to a true outbreak, and thereby establish the sensitivity and specificity of aberration detection algorithms.

Statistical methods for disease surveillance have focused mainly on the performance of outbreak detection algorithms, without sufficient attention to data quality and representativeness, two factors that are especially important in developing countries. It is important to assess data quality at each stage of implementation using a diverse mix of data sources and analytical methods. Careful, close monitoring of selected indicators is needed to evaluate whether systems are reaching their proposed goals at each stage.
Entities:
Year: 2008 PMID: 19025684 PMCID: PMC2587693 DOI: 10.1186/1753-6561-2-s3-s7
Source DB: PubMed Journal: BMC Proc ISSN: 1753-6561
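The abstract's consolidation-stage steps (establish baseline rates, flag aberrations, then use externally verified outbreaks to score the detector) can be sketched as follows. This is a generic moving-baseline threshold detector, not the specific algorithm used by EWORS; the function names, the 28-day baseline window, and the z = 2 threshold are illustrative assumptions.

```python
from statistics import mean, stdev

def flag_aberrations(daily_counts, baseline_days=28, z=2.0):
    """Flag days whose count exceeds the baseline mean + z * SD.

    The baseline for each day is the preceding window of counts
    (up to `baseline_days` long); days with fewer than 7 days of
    history are never flagged.
    """
    flags = []
    for i, count in enumerate(daily_counts):
        window = daily_counts[max(0, i - baseline_days):i]
        if len(window) < 7:          # too little history to judge
            flags.append(False)
            continue
        mu, sigma = mean(window), stdev(window)
        flags.append(count > mu + z * sigma)
    return flags

def sensitivity_specificity(flags, true_outbreak_days):
    """Score algorithm flags against externally verified outbreak days."""
    tp = sum(f and t for f, t in zip(flags, true_outbreak_days))
    fn = sum((not f) and t for f, t in zip(flags, true_outbreak_days))
    fp = sum(f and (not t) for f, t in zip(flags, true_outbreak_days))
    tn = sum((not f) and (not t) for f, t in zip(flags, true_outbreak_days))
    return tp / (tp + fn), tn / (tn + fp)
```

For example, a stable series of 10 cases/day followed by a day of 30 cases is flagged, and if that spike is the only verified outbreak day, both sensitivity and specificity are 1.0.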
Average and range of reporting rate (percent days with data reported) across sites, EWORS 2000–2006.
| Site | Reporting rate, % (range) |
| Indonesia | 96 (91 – 100) |
| Lao PDR | 93 (88 – 94) |
| Peru* | 89 (69 – 100) |
* Surveillance of signs and symptoms, conducted from Monday to Saturday only.
Figure 1. Average daily reporting rates (percent days with data reported) across EWORS sites in Peru, 2005–2006. Surveillance sites do not open on Sundays.
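The reporting-rate indicator tabulated above can be computed directly from a site's reporting calendar. A minimal sketch follows; the function name and data layout are assumptions, not part of EWORS, and the Sunday exclusion mirrors the Peru footnote.

```python
from datetime import date, timedelta

def reporting_rate(days_reported, start, end, exclude_sundays=False):
    """Percent of expected reporting days on which data arrived.

    days_reported: set of dates with at least one record received.
    exclude_sundays mirrors sites (e.g. Peru) that close on Sundays.
    """
    expected = 0
    received = 0
    d = start
    while d <= end:
        if not (exclude_sundays and d.weekday() == 6):  # 6 = Sunday
            expected += 1
            received += d in days_reported
        d += timedelta(days=1)
    return 100.0 * received / expected
```

For a site that reported Monday through Saturday of the first week of 2006 (1 January 2006 was a Sunday), the rate is 100% when Sundays are excluded and 6/7 ≈ 85.7% when they are not.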
Main sociodemographic characteristics of cases surveyed, EWORS 2000–2006.
| Characteristic | Indonesia | Lao PDR | Peru |
| Male (%) | 55.6 | 52.6 | 52.3 |
| Median age (years) | 7 | 14 | 3 |
| Traveled recently (%) | 0.1 | 6.1 | 2.5 |
Figure 2. Average number of cases surveyed by day of the week, EWORS Lao PDR, 2003–2006.
Most frequent symptoms and syndromes and percent of patients affected, EWORS, 2000–2006.
| | Indonesia | Lao PDR | Peru |
| First | Fever (71) | Fever (75) | Fever (80) |
| Second | Cough (52) | Cough (44) | Cough (63) |
| Third | Cold (46) | Headache (37) | Rhinorrhea (36) |
| Fourth | Diarrhea (17) | Cold (27) | Malaise (31) |
| Fifth | Vomiting (16) | Malaise (25) | Sore throat (25) |
| Febrile | 71 | 75 | 80 |
| Respiratory* | 47 | 51 | 70 |
| Gastroenteric** | 34 | 25 | 31 |
| Any of these three | 96 | 89 | 96 |
The numbers represent the percent of patients with the symptom or syndrome, respectively.
* Cough or sore throat, ** Diarrhea or vomiting.
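The syndrome groupings defined in the footnotes (febrile; respiratory = cough or sore throat; gastroenteric = diarrhea or vomiting) can be applied record by record to produce the percentages above. A minimal sketch, assuming symptoms arrive as free-text labels per patient:

```python
# Syndrome groupings as defined in the table footnotes.
SYNDROMES = {
    "febrile": {"fever"},
    "respiratory": {"cough", "sore throat"},
    "gastroenteric": {"diarrhea", "vomiting"},
}

def classify(symptoms):
    """Return the set of syndromes a patient's symptom list falls into."""
    present = {s.lower() for s in symptoms}
    return {name for name, defn in SYNDROMES.items() if present & defn}
```

For instance, a patient with fever and cough falls into both the febrile and respiratory syndromes, matching how one patient can contribute to several rows of the table.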
Figure 3. Daily number of cases of fever and diarrhea showing simultaneous increases, EWORS Indonesia, 2001–2003.