Clementine Calba, Flavie L. Goutard, Linda Hoinville, Pascal Hendrikx, Ann Lindberg, Claude Saegerman, Marisa Peyre.
Abstract
BACKGROUND: Regular and relevant evaluations of surveillance systems are essential to improve their performance and cost-effectiveness. With this in mind, several organizations have developed evaluation approaches to facilitate the design and implementation of these evaluations.
Year: 2015 PMID: 25928645 PMCID: PMC4418053 DOI: 10.1186/s12889-015-1791-5
Source DB: PubMed Journal: BMC Public Health ISSN: 1471-2458 Impact factor: 3.295
Figure 1. PRISMA flow chart of the study selection process for the systematic review.
Category, surveillance field and objective(s) of the approaches used for the evaluation of surveillance systems
| Ref. | Category | Terms used in the publication | Field | Objective category | Objective(s) | Example(s) of application* |
|---|---|---|---|---|---|---|
| [ | Framework | Framework Guidelines Method | PHa | Evaluate performance and effectiveness | To assess the quality of the information provided; the effectiveness in supporting the objective(s), in supporting informed decision-making; and the efficiency of SS | - |
| [ | Tool | Method Tool | PHa | Design efficient surveillance systems | Help plan, organize, implement SS | Not described |
| [ | Tool | Guidelines Method Tool | PHa | Design efficient surveillance systems | To establish a baseline and to monitor progress | - |
| [ | Guidelines | Framework Guidelines Method | PHa | Evaluate performance and effectiveness | To establish and maintain effective and efficient surveillance and response systems | - |
| [ | Framework | Guidelines | PHa | Evaluate performance and effectiveness | To assess existing SS and identify areas which can be improved | - |
| [ | Framework | Framework | PHa | Evaluate performance and effectiveness | To evaluate whether SS attain their objectives, and to provide information for further development and improvement | Military surveillance systems for early detection of outbreaks on duty areas |
| [ | Framework | Framework | PHa | Evaluate performance and effectiveness | To provide objective, valid and reliable information for the decisions on which surveillance activities and functions should be continued | - |
| [ | Framework | Framework Guidelines | PHa | Evaluate performance and effectiveness | To establish the relative value of different approaches and to provide information needed to improve their efficacy | - |
| [ | Tool | Framework Guidelines | PHa | Evaluate performance and effectiveness | To assess whether the surveillance method appropriately addresses the disease/health issues; whether the technical performance is adequate | - |
| [ | Guidelines | Framework Guidelines | PHa | Evaluate performance and effectiveness | To define how well the system operates to meet its objective(s) and purpose | - |
| [ | Tool | Method Tool | AHb | Evaluate performance and effectiveness | To propose recommendations for the improvement of SS | Implemented in France: surveillance network for antimicrobial resistance in pathogenic bacteria of animal origin (also mentioned but not described: early detection of FMD; case detection of rabies in bats; poultry disease surveillance network and salmonella laboratory surveillance network) |
| [ | Framework | Framework Guidelines Method | AHb | Evaluate performance and effectiveness | Support the detection of disparities in surveillance and support decisions on refining SS design | Implemented in UK: demonstration of freedom from Brucella melitensis; early detection of CSF and case detection of Tb. |
| [ | Method | Guidelines Method | AHb | Evaluate performance and effectiveness | To contribute to the improvement of the management of epidemiological animal health SS | Implemented in France: evolution of mycoplasmosis and salmonellosis rates in poultry (RENESA network); and the FMD surveillance network in cattle |
| [ | Framework | Framework | EHc | Evaluate performance and effectiveness | Make evidence-based decisions regarding the future selection, development and use of data | Environmental public health surveillance programs |
| [ | Method | Guidelines Method | PHa & AHb | Evaluate the completeness of the surveillance systems in terms of core components | Evaluate the completeness and coherence of the concepts underlying a health surveillance program | National Integrated Enteric Pathogen Surveillance Program, Canada |
a: Public Health; b: Animal Health; c: Environmental Health; SS: Surveillance System; FMD: Foot and Mouth Disease. *According to the information provided in the publication.
Steps of the evaluation process provided by the identified evaluation approaches, along with the absence or presence of the different practical elements retrieved from the analysis
| Ref. | Roadmap and tools provided | Evaluation steps | Practical elements present | Practical elements absent |
|---|---|---|---|---|
| [ | Structured roadmap | Context of the surveillance system | - List of evaluation attributes (13) | - No case study presentation |
| Evaluation questions | - Lack of visual representation of the results | |||
| Process for data collection and management | - Lack of information about evaluator(s) | |||
| Findings | - Definitions of evaluation attributes | - Lack of methods and tools for the assessment (only general questions) | ||
| Evaluation report | - Lack of attributes’ selection matrix | |||
| Following up | ||||
| [ | Structured roadmap - Worksheets (checklist) | - | - Methods and tools for the assessment: questionnaire and worksheets | - No case study presentation |
| - Lack of information about evaluator(s) | ||||
| - Visual representation of the results: bar and radar charts | - Lack of evaluation attributes | |||
| - Lack of definitions of evaluation attributes | ||||
| - Lack of attributes’ selection matrix | ||||
| [ | Structured roadmap - Application guide | Resources assessment | | - No case study presentation |
| Indicators | - Lack of information about evaluator(s) | |||
| Data sources assessment | ||||
| Data management assessment | - Methods and tools for the assessment: scoring guide | - Lack of evaluation attributes | ||
| Data quality assessment | - Visual representation of the results (graphs) | - Lack of definitions of evaluation attributes | ||
| Information dissemination and use | - Lack of attributes’ selection matrix | |||
| [ | Structured roadmap | Plan the evaluation | - List of evaluation attributes (10) | - No case study presentation |
| - Lack of visual representation of the results | |||
| Prepare to evaluate | - Lack of information about evaluator(s) | |||
| Conduct the evaluation | - Definitions of evaluation attributes | - Lack of methods and tools for the assessment (only general questions) | ||
| Dissemination and use of the results | - Lack of attributes’ selection matrix | |||
| [ | Structured roadmap | Preparation for the evaluation | - Type/knowledge of evaluator(s): Ministry of Health (national, provincial or district levels) | - No case study presentation |
| Documentation and evaluation of the surveillance system | - List of evaluation attributes (8) | - Lack of visual representation of the results | ||
| Evaluation of the capacity of the surveillance system | - Definitions of evaluation attributes | - Lack of methods and tools for the assessment (general questions) | ||
| Outcome of the evaluation | - Lack of attributes’ selection matrix | |||
| [ | General roadmap | Initial evaluation | - List of evaluation attributes (16) | - No case study presentation |
| - Lack of visual representation of the results | ||||
| Intermediate evaluation | - Definitions of evaluation attributes | - Lack of information about evaluator(s) | ||
| Final evaluation | - Lack of methods and tools for the assessment | |||
| - Lack of attributes’ selection matrix | ||||
| [ | General roadmap | Usefulness of the activities and outputs | - Type/knowledge of evaluator(s): three to four evaluators (5 years of expertise in surveillance on communicable diseases for the team leader, plus a laboratory expert and an expert in epidemiology) | - No case study presentation |
| - Lack of visual representation of the results | ||||
| - Lack of definitions of evaluation attributes | ||||
| Technical performance | - Lack of methods and tools for the assessment | |||
| Fulfilment of contract objectives | - List of evaluation attributes (7) | - Lack of attributes’ selection matrix | ||
| [ | General roadmap | System description | - List of evaluation attributes (9) | - No case study presentation |
| - Lack of visual representation of the results | ||||
| Outbreak detection | - Lack of information about evaluator(s) | |||
| System experience | - Definitions of evaluation attributes | - Lack of methods and tools for the assessment (general questions) | ||
| Conclusions and recommendations | - Lack of attributes’ selection matrix | |||
| [ | Structured roadmap - Questionnaire | Usefulness of the operation | - Type/knowledge of evaluator(s): experts in international surveillance on communicable diseases | - No case study presentation |
| Quality of the outputs | - Lack of visual representation of the results | |||
| Development of the national surveillance system | - Lack of definitions of evaluation attributes | |||
| Technical performance | - List of evaluation attributes (6) | - Lack of methods and tools for the assessment (general questions) | ||
| Structure and management | - Lack of attributes’ selection matrix | |||
| [ | General roadmap | Engage the stakeholders | - List of evaluation attributes (10) | - No case study presentation |
| Describe the surveillance system | - Lack of visual representation of the results | |||
| Evaluation design | - Lack of information about evaluator(s) | |||
| Performance of the surveillance system | - Definitions of evaluation attributes | - Lack of methods and tools for the assessment (general questions) | ||
| Conclusions and recommendations | - Lack of attributes’ selection matrix | |||
| Findings and lessons learned | ||||
| [ | Structured roadmap - Questionnaire - Scoring guide - Worksheets | Design the evaluation | - Case study presentation (cf. Table | - Lack of definitions of evaluation attributes |
| - Visual representation of the results through diagram representations (pie charts, histogram, radar chart) | ||||
| Implement the evaluation | - Type/knowledge of evaluator(s): requires little knowledge and experience related to surveillance | |||
| Finalisation | - List of evaluation attributes (10) and performance indicators | - Lack of attributes’ selection matrix | ||
| - Methods and tools for the assessment: questionnaire, scoring guide and worksheets | ||||
| [ | Structured roadmap - Application guide | Scope of evaluation | - Case study application (cf. Table | - Lack of methods and tools for the assessment (only references provided) |
| Surveillance system characteristics | - Visual representation of the results through colour-coding (green, orange, red) | |||
| Design the evaluation | - Type/knowledge of evaluator(s): “Anyone familiar with epidemiological concepts and with a reasonable knowledge of the disease under surveillance” | |||
| Conduct the evaluation | ||||
| Report | - List of evaluation attributes (22) | |||
| - Definitions of evaluation attributes | ||||
| - Attributes’ selection matrix | ||||
| [ | Structured roadmap - Questionnaire - Scoring guide | Description of the surveillance system | - Case study presentation (cf. Table | - Lack of visual representation of the results |
| Identification of the priority objectives | - Lack of information about evaluator(s) | |||
| - Lack of evaluation attributes | |||
| Building of dashboard and indicators | - Provides performance indicators | - Lack of definitions of evaluation attributes | ||
| Implementation and follow-up | - Lack of methods and tools for the assessment | |||
| Updates and audit | - Lack of attributes’ selection matrix | |||
| [ | General roadmap | Priority setting | - Provides performance indicators | - No case study presentation |
| - Lack of visual representation of the results | ||||
| - Lack of information about evaluator(s) | ||||
| Scientific basis and relevance | - Lack of evaluation attributes | |||
| Analytic soundness and feasibility | - Lack of definitions of evaluation attributes | |||
| Interpretation and utility | - Lack of methods and tools for the assessment | |||
| - Lack of attributes’ selection matrix | ||||
| [ | General roadmap | Text analysis | - Case study presentation (cf. Table | - Lack of visual representation of the results |
| - Lack of information about evaluator(s) | ||||
| Program conceptual model | ||||
| - Lack of evaluation attributes | ||||
| Comparison; Validation | - Lack of definitions of evaluation attributes | |||
| - Lack of methods and tools for the assessment | ||||
| - Lack of attributes’ selection matrix | ||||
Figure 2. Number of evaluation approaches that take each evaluation attribute identified in this review into consideration.
Practical aspects identified in a review of evaluation approaches for health surveillance systems, and their role in the evaluation process
| Practical element | Role in the evaluation process |
|---|---|
| List of evaluation attributes to be assessed | Design the evaluation |
| Definitions of the evaluation attributes to be assessed | Design the evaluation |
| Case study presentation | Ease of applicability |
| Visual representation of the results | Ease of communication |
| Information about evaluator(s) (e.g. required expertise level) | Design the evaluation |
| List of methods and tools to assess the evaluation attributes targeted | Design the evaluation; ease of applicability |
| Guide for the selection of relevant evaluation attributes | Design the evaluation; ease of applicability |