
Validating Joint External Evaluation reports with the quality of outbreak response in Ethiopia, Nigeria and Madagascar.

Richard Garfield1, Maureen Bartee1, Landry Ndriko Mayigane2.   

Abstract

To date, more than 100 countries have carried out a Joint External Evaluation (JEE) as part of their Global Health Security programmes. The JEE is a detailed effort to assess a country's capacity to prevent, detect and respond to population health threats in 19 programmatic areas. No attempt has yet been made to determine the validity of these measures. We compare scores and commentary from the JEE in three countries with the strengths and weaknesses identified in the response to a subsequent large-scale outbreak in each of those countries. Relevant indicators were compared qualitatively and scored as showing a low, medium or high level of agreement between the JEE and the outbreak review in each of the three countries. Three reviewers independently reviewed each of the three countries. A high level of correspondence existed between scores and text in the JEE and the strengths and weaknesses identified in the review of an outbreak. In general, countries responded somewhat better than JEE scores indicated, but this appears to be due in part to JEE-related identification of weaknesses in those areas. The improved response was in large measure due to more rapid requests for international assistance in those areas. It thus appears that, even before systematic improvements are made in public health infrastructure, the JEE process may assist in improving outcomes in responses to major outbreaks. © Author(s) (or their employer(s)) 2019. Re-use permitted under CC BY-NC. No commercial re-use. See rights and permissions. Published by BMJ.

Keywords:  descriptive study; indices of health and disease and standardisation of rates; public health

Year:  2019        PMID: 31908855      PMCID: PMC6936541          DOI: 10.1136/bmjgh-2019-001655

Source DB:  PubMed          Journal:  BMJ Glob Health        ISSN: 2059-7908


After Action Reviews (AARs) are a key part of improving Global Health Security. AARs in three countries closely tracked the strengths and weaknesses identified in each country's Joint External Evaluation. AARs can be used to monitor progress and gaps in health security.

Introduction

Structured evaluation of a country’s ability to respond to health security threats has garnered a great deal of attention and effort in the last 2 years with implementation of the Joint External Evaluation (JEE) system.1 At the time of this writing, 95 countries had engaged in the full JEE process, which involves a national self-study followed by a 5-day, on-the-ground review by international experts.2 JEEs are intended to provide a thorough review and evaluation of a country’s capacities in 19 key areas of public health.3 The score for each of 49 indicators in the 19 domains is measured on a five-point scale combining quantitative and qualitative characteristics. The accompanying narratives summarise major strengths and limitations in each country’s public health systems, and recommendations for improvements are made. The JEE scores are subsequently published as part of the JEE report and are publicly available on several websites. Much effort has gone into developing and carrying out the JEEs; little validation of the JEE scores and recommendations is available to date. We review a disease outbreak in each of three countries that had undergone the JEE process. We compare scores and recommendations from the JEEs with conditions, identified during postoutbreak reviews, that affected each country’s response to its outbreak. Such a comparison provides a field-based validation, in outbreak-related areas, of the scores and recommendations from the JEE.
Scores and recommendations from each country’s JEE were drawn from the JEE summary documents published on the WHO’s website.4 Information on each outbreak was collected using a combination of sources and methods: documented first and last reports from the US Centers for Disease Control and Prevention (CDC)’s Global Disease Detection Operations Center; online media reports, UN Situation Reports and journal articles; interviews with staff at the CDC Operations Center following each outbreak; and interviews with international and national responders during the outbreaks. These responders included staff from CDC, other international agencies and national Ministries of Health. Questions specific to each outbreak were developed for these interviews on the basis of the above sources. In some cases, follow-up questions were posed to these informants in an iterative process to probe prior responses further and to triangulate information from the various sources. Finally, preliminary conclusions were shared with field and headquarters staff to further refine an understanding of the collected responses. Correspondence between the strengths and limitations in national systems relevant to an outbreak was summarised from review of text in the JEE document. Summarised information on each outbreak was compared with the relevant country’s JEE scores and text in these topical areas. The topical areas of relevance included IHR Coordination, National Laboratory Systems, Surveillance, Public Health Workforce, Preparedness, Emergency Operations, Medical Countermeasures and Risk Communication. A subjective assessment of similarity and difference between these two sets of information was made independently by each of the three authors on a three-level scale. Each of the authors is professionally involved in global health security work and has taken part in JEEs and postoutbreak reviews, though not in the countries evaluated.
The three reviewers did not consult one another in creating their agreement scores. Correspondence was high if both the JEE and the description of an outbreak raised a common concern. For example, if both described highly effective systems for laboratory diagnosis, a ‘high’ level of correspondence was recorded. If, instead, an inadequate response from the laboratory system was reported during the outbreak, ‘low’ correspondence was recorded. Similarly, if the JEE reported poor surveillance capacity, and surveillance during the outbreak was considered poor, ‘high’ correspondence was recorded. A kappa statistic was generated to assess how likely it was that the observed level of agreement among raters could have occurred by chance. The MAGREE macro in SAS was used, as it is a multiple-rater kappa statistic that omits missing values.
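For complete data (every variable rated by all three reviewers), the multiple-rater kappa that MAGREE computes reduces to Fleiss' kappa, which can be sketched in a few lines. This is an illustration only, with a hypothetical set of ratings; the published analysis used the SAS macro, which also accommodates missing ratings:

```python
from collections import Counter

def fleiss_kappa(ratings, categories=("L", "M", "H")):
    """Fleiss' kappa for several raters assigning categorical labels.

    `ratings` holds one list of labels per subject; every subject must be
    rated by the same number of raters (unlike SAS's MAGREE macro, this
    sketch does not handle missing values).
    """
    n_subjects = len(ratings)
    n_raters = len(ratings[0])
    counts = [Counter(r) for r in ratings]  # per-subject category counts
    # Observed agreement: proportion of agreeing rater pairs, per subject.
    p_i = [(sum(c[k] ** 2 for k in categories) - n_raters)
           / (n_raters * (n_raters - 1)) for c in counts]
    p_bar = sum(p_i) / n_subjects
    # Expected agreement from the marginal category proportions.
    p_j = [sum(c[k] for c in counts) / (n_subjects * n_raters)
           for k in categories]
    p_e = sum(p ** 2 for p in p_j)
    return (p_bar - p_e) / (1 - p_e)

# Hypothetical ratings for six variables, three raters each:
example = [["H", "H", "H"], ["H", "H", "M"], ["M", "M", "H"],
           ["L", "M", "M"], ["H", "H", "H"], ["M", "L", "M"]]
print(round(fleiss_kappa(example), 3))  # 0.242
```

Note that Fleiss' kappa treats the three categories as nominal: agreement within one level (H versus M) earns no partial credit.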

Background

The earliest JEE in this set was carried out in Ethiopia, during March/April 2016. The other two JEEs were carried out in June and July of 2017. The outbreak in Ethiopia was identified as beginning a little over a year after the JEE. In Madagascar and Nigeria, outbreaks began 2 and 6 months after the JEE, respectively. See table 1 and figure 1.
Table 1

Summary of reviewed outbreaks

Country | Disease | Outbreak dates | Cases | Deaths | Regions or states affected
Ethiopia | Acute watery diarrhoea | 1/1/2017–23/7/2017 | 39 344 | 801 | 7 of 11
Nigeria | Lassa | 24/3/17–15/12/17 | 376 | 86 | 19 of 36
Madagascar | Pneumonic plague | 19/8/2017–27/11/2017 | 1293 | 209 | 55 of 114
Figure 1

Time line from JEE to outbreak, 2016–2018. JEE, joint external evaluation.

The outbreak of acute watery diarrhoea (AWD) in Ethiopia is identified as starting on 1/1/2017, as an index case was not identified. The Global Disease Detection programme of CDC stopped following the outbreak as it wound down after 23/7/2017.5 A total of 39 344 clinical cases, with 801 deaths, were attributed to AWD. Cases were reported from seven regions of the country, with the majority of cases coming from the Somali region and believed to have originated from cases arriving from Somalia. Outbreaks in neighbouring countries occurred in 2014, 2015 and 2016. The 2016 outbreak resulted in registration of more than 20 000 cases in Ethiopia, including in the capital city. The 2016 and 2017 outbreaks were exacerbated by drought, a high number of displaced people in areas with inadequate sanitation, and poor food safety practices.6 The outbreak was named for its causative agent, Vibrio cholerae, in South Sudan and Yemen; it was referred to as AWD in Sudan, Ethiopia and Somalia. Leadership of the response in Ethiopia was provided by the Federal Ministry of Health, Regional Health Bureaus and the WHO. Major response activities included drilling boreholes, trucking water and providing emergency food rations to millions of people. Thousands of national staff were deployed for water treatment activities and to staff AWD treatment centres; they were supported by dozens of international staff. Water quality testing and chlorination was a major activity among both national and international partners. Treatment centres were set up in all affected states. Much activity focused on infection prevention and control in treatment centres, social mobilisation to identify cases and get them to treatment centres, and training by case management teams for treatment centres.
In Nigeria, the first cases of the outbreak were identified during mid-December 2017.7 As of 18/3/2018, a total of 376 confirmed cases and 86 deaths had been recorded. A further 1495 suspected cases were identified, 1084 of which were determined to be negative. Of note, 3675 contacts of confirmed or suspected cases were followed. Among these contacts, 59 were symptomatic but only 23 were confirmed as positive cases; 805 were still being followed at the time of publication, and a total of seven had been lost to follow-up.8 The number of new cases identified peaked in late February. By mid-March, the number of new cases was declining rapidly. Nine states had left the active phase of the outbreak, and 38 people were still receiving treatment in six of the remaining nine ‘active’ states. Lassa fever is endemic to Nigeria and other West African countries. Small, disseminated outbreaks are common, as animal reservoirs infect people in close contact. In 2017 there were two peaks of infection, indicating a potential for expanded transmission from animal reservoirs, as came to occur in 2018. There were 247 cases recorded in 2016, with 85 deaths. The death rate in 2018 appears to be about one-third lower, probably due to earlier treatment and better identification of cases.9 Three states accounted for 83% of all confirmed cases. Cases were identified in 56 local areas of 19 states across the country. The Nigeria Centre for Disease Control (NCDC) and WHO led response activities out of the NCDC Emergency Operations Centre in Abuja. Rapid Response Teams composed of NCDC staff, Ministry of Health staff and Field Epidemiology Training Programme residents led the response in affected states.10 Three laboratories in country confirmed infection using a PCR method. The laboratory system is supported by the Bernhard Nocht Institute for Tropical Medicine in Germany. Three hospitals provided all the inpatient care for Lassa fever cases. A total of 17 health workers became confirmed cases in six states.
No new infections occurred among healthcare workers in later weeks. Rapid Response Teams went to four states bordering Benin to improve disease surveillance, as nine suspected cases and several confirmed cases in Benin appear to have imported the infection from Nigeria.

In Madagascar, the index case had become symptomatic in mid-August 2017 and travelled by taxi from the central highlands through the capital city on 27/8/2017. The first case was diagnosed on 11/9/2017, and WHO was notified on 13/9/2017. Twenty-seven other cases were traced to the index case. Bubonic plague is endemic in Madagascar, with cases reported every year. The last outbreak, from 8/2016 to 1/2017, involved around 300 cases. Pneumonic plague was last reported in northern Madagascar in 2015, with 14 cases; seven of those were treated, and four of them survived. In 2017, a total of 402 confirmed cases and 209 deaths occurred due to plague through 27/11/2017. Some of these deaths occurred among unconfirmed cases and thus are probably not all pneumonic plague. A total of 2417 cases (including 700 with negative laboratory tests) were reported; 1293 of the total are considered confirmed, probable or suspected. Of the reported cases, 1854 were classified as pneumonic; the others were bubonic or unclassified.11 The Government of Madagascar then declared the epidemic contained, while WHO said more cases could be anticipated through the end of the plague transmission season in April. The Ministry of Public Health led the response, co-led by WHO, focusing mainly on case finding, diagnosis, treatment of cases and isolation. Preventive chemoprophylaxis was provided to 7318 identified contacts of cases.12 The Institut Pasteur de Madagascar provided all laboratory support for diagnosis and treatment. Awareness campaigns were led by the government throughout the country. Nine plague treatment centres and six mobile centres were established with the support of international organisations.13 Fifty-five of 114 districts reported cases; the capital city had the most.

Comparison of JEE reports and outbreak results

Tables 2–4 present the results of the JEE and outbreak reviews in Ethiopia, Nigeria and Madagascar, respectively. In the final column of each of these tables, scores from the three raters on the level of correspondence between the JEE and the outbreak are presented as low (L), medium (M), high (H) or no response (N/A). Thirty-seven variables were compared between JEE scores and field operation levels, by three raters, across these three outbreaks. This created a total of 111 scores representing a low, medium or high level of correspondence between the JEE and the outbreak response review. For 13 of the 37 variables, all three raters agreed that the correspondence was high. For an additional 13 variables, two raters rated the correspondence as high, while one rater considered it to be only moderate. Only eight times did a rater consider the correspondence to be low, and for none of the 37 variables did more than one rater consider it low. While 37 variables were evaluated, a reviewer occasionally chose not to respond with a ‘low, medium or high’ rating. In total, 107 scores were recorded among the three reviewers. The kappa coefficient of concordance produced via the MAGREE routine was 0.457, representing a chance probability of 0.037 by F test.
In simple terms, the level of agreement was high. (A SAS summary of various measures of concordance is available at https://support.sas.com/documentation/cdl/en/statug/63033/HTML/default/viewer.htm%23statug_freq_a0000000647.htm; the SAS routine used for MAGREE is described at http://support.sas.com/kb/25/006.html.)14 The comparison did not show any consistent or dramatic conflicts between JEE and outbreak information. Thus, though interpretations may vary regarding the degree of agreement between JEE and outbreak information, JEEs overall appear to provide a very good guide to strengths and weaknesses in actual outbreaks. The comparisons made here had two important limitations. First, response capacity at the end of an outbreak had often improved a great deal from the beginning; comparisons thus depend on when they are made. Second, much of the action in an outbreak is local, so laboratory, social mobilisation or treatment characteristics in one area may be very different from those in another. JEE scores seldom took such variable capacity within a country into account. In summary, it appears that after-action reports can provide a strong check on information in the JEE and a near real-time update on the capacities of the public health system.
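Two of the summary tallies above (eight ‘low’ ratings in total, and never more than one ‘low’ per variable) can be checked directly against the rating columns of tables 2–4. A minimal sketch in Python, with the rater triples transcribed from the ‘Degree of correspondence’ columns as published (N/A marks a missing response):

```python
# Rater triples transcribed from the final columns of tables 2-4.
ratings = {
    "Ethiopia": ["M H H", "L M H", "M L M", "H H H", "H H H",
                 "H H H", "H H H", "H H H", "M M L", "H H H"],
    "Nigeria": ["H L H", "H H H", "H H H", "H M H", "M H H",
                "H L H", "H H H", "H H H", "H M H", "M H H",
                "H M M", "H L M", "M N/A N/A", "N/A L M"],
    "Madagascar": ["H M H", "N/A H H", "N/A H H", "H L H", "M H M",
                   "H H H", "M H H", "H H H", "H M M", "H M M",
                   "H M H", "H H H", "H M H", "H H H"],
}

# Flatten to one list of rater triples, then tally the 'low' ratings.
triples = [r.split() for rows in ratings.values() for r in rows]
low_total = sum(t.count("L") for t in triples)
max_low_per_variable = max(t.count("L") for t in triples)
print(low_total, max_low_per_variable)  # 8 1
```

Both tallies match the text: eight ‘low’ ratings in total, with no variable receiving more than one.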

Lessons

Low JEE scores existed in critical areas across all three countries, with implications for the ability of those countries to detect and respond to the outbreaks. Those low scores, however, were not always critical limiting factors. In Ethiopia and Nigeria, even with limited skills and personnel, large country systems were able to mobilise an adequate number of skilled personnel. Some of the inconsistencies found between the JEE and outbreak review can be explained by particular details of the outbreak. For example, though Nigeria had a low JEE score for border health, the national EOC mobilised teams to border areas to coordinate the response; because of the needs in that outbreak, one strong area of the JEE made up for weakness in another. In each of the three countries, international staff strengthened the response in areas rated low during the JEE. The quality of the outbreak response, when inconsistent with JEE scores, was generally better than that predicted by the country’s JEE. Where internal skill, equipment, training and personnel were lacking, in each of the three outbreaks, national resources were supplemented by international resources that to a large extent made up for national limitations. The quality and variety of those international personnel and supplies, and the ability of national systems to absorb them, were described as important in strengthening the outbreak response; this could not be captured by the JEE. It appears to the authors that the JEE strongly sensitised national authorities to their areas of weakness and to opportunities to supplement national resources with international staff and equipment. The JEE assessment identified systems capacity at the time of evaluation. What cannot be assessed from a single outbreak event is the influence of the JEE experience over time. It appears to us that, for all three countries, the JEE created a framework for understanding the key roles and activities needed to respond to an outbreak more effectively.
Major outbreaks following a JEE may provide an opportunity for more rapid improvements in situational awareness of systematic weaknesses than existed prior to the JEE. A fuller evaluation of this would require comparing reviews of several outbreaks after a JEE with reviews of several outbreaks before it. In Nigeria, Lassa outbreaks occur every year. In the outbreak prior to the JEE, an observer noted that authorities would ‘send epidemiologists out to collect data, and as we identify problems in the response we will address them’. In the post-JEE Lassa outbreak, there was a much stronger focus on the technical activities needed in the areas of laboratory, surveillance, reporting, Emergency Operations Centres, medical countermeasures, points of entry, communication and biosafety. Specific weaknesses in human/animal surveillance and laboratory systems triggered discussion about involving the Ministries of Agriculture and Environment. The phrase ‘supply chain’ had become part of the vocabulary. There was discussion about weaknesses in newly established legislation. The JEE experience provided an intense context within which to focus on these issues, raising the level of understanding and discourse and creating a shared vision that would otherwise likely have been far weaker. Although the JEE raised the level of understanding of key roles and actions, that shared vision can be expected to deteriorate over time as staff rotate and people not involved in the country’s JEE assume relevant posts. The need to refresh the reflection that occurred during the JEE can be met, in part, by improving After Action Reviews (AARs). AARs should consider trends across several outbreaks over time. They should move from a focus on the specifics of the current outbreak to a more general reflection on the JEE indicators and levels. What would the response to the outbreak have looked like if the JEE had not occurred?
In Nigeria, weaknesses might not have been recognised as well or as quickly. Having recognised those weaknesses, Nigeria prioritised critical functions; without the JEE sensitising a large group of people to those functions, that prioritisation would have been far weaker and coordination among state and national authorities far less. It is less clear that this occurred in Madagascar, where leadership authority was not clearly established, or in Ethiopia, where political considerations limited the ability of health leaders to organise and mobilise. Several key qualities of these outbreak responses were not captured by each country’s JEE: (1) in countries with federal systems, such as Nigeria, the coordination of roles between national and state-level authorities and the assessment of variable levels of capacity across states; (2) the quality of AARs and their integration into International Health Regulations reporting and interim internal country JEE assessments; and (3) the level and timing of intersectoral participation in public health activities during an outbreak. Early large-scale mobilisation can greatly reduce transmission and obviate the need for later panic-level participation. This goes beyond coordination with security authorities or risk communication to affected communities. It most closely tracks the PREVENT indicator of ‘IHR Coordination, Communication, and Advocacy’, but it is a key part of response that is easy to see yet difficult to measure. Opportunities to focus on these issues in JEEs, After Action Reviews, Simulation Exercise evaluations and annual national IHR reporting can be used to improve post-JEE National Action Plans and outbreak responses in the future.
Table 2

Comparison of JEE and outbreak review in Ethiopia

JEE domain/indicator | JEE score | Major JEE recommendation | Outbreak response capacity | Degree of correspondence
IHR Coordination P.2.1 | 3 | Coordination mechanisms planned to be established but not yet in place across sectors | Coordination between health and water authorities was weak. | M H H
National Lab System D.1.1 | 4 | System strong at the national level but poor supply chain and staff turnover | Nearly all cases received a clinical diagnosis only; laboratory services were weak. More laboratory specialists were desired but funds to bring internationals were inadequate. | L M H
Surveillance D.2.3 | 3 | Plans to develop; national commitment and skill in place but not strong at state level | Case finding and active surveillance to identify clusters of cases was strong. | M L M
Workforce D.4.1 | 3 | Need more staff and FETP resident advisor; WHO role key | FETP staff key in regional-level coordination. | H H H
Preparedness R.1.2 | 2 | Risk assessments done but mapping of resources lacking | Equipment, supplies and training not ready for the outbreak. | H H H
Emergency Operations Activation R.2.1 | 2 | No manager and lack of permanent staff | Regional coordination good; national level was inadequate and frustrating. Emergency Operations Centre (EOC) not activated until August. | H H H
Emergency Operations Function R.2.3 | 2 | No specific training for staff | Regional coordination good; national level was inadequate and frustrating. EOC not activated until August. | H H H
Emergency Operations Case Management R.2.4 | 2 | Cholera guidelines exist | Training done rapidly for staff when outbreak hit a new area. Guidelines were essential. | H H H
Medical Countermeasures R.4.1 | 4 | No warehouse, weak logistics, no established international agreements | Regional Health Bureau led the response to 100 treatment centres, with good WHO support. UN Children’s Fund (UNICEF), WHO, Médecins Sans Frontières (MSF), Oxford Famine Relief (Oxfam), CDC and Islamic Relief among other non-governmental organisations very involved. Not adequate for Water, Sanitation and Hygiene activities. | M M L
Risk Communication R.5.4 | 3 | Dedicated local staff in place | Epi identification of cases was used to tailor social mobilisation and education activities in some areas. Weaker in nomadic areas. | H H H

CDC, Centers for Disease Control and prevention; FETP, Field Epi Training Programme; JEE, joint external evaluation.

Table 3

Comparison of JEE and outbreak review in Nigeria

JEE domain/indicator | JEE score | Major JEE comment/recommendation | Outbreak response observed | Degree of correspondence
IHR Coordination P.2.1 | 2 | SOPs exist | Coordination weak, especially in first month | H L H
Zoonotic Diseases P.4.3 | 1 | Better coordination for response needed | Spread of vectors went unrecognised | H H H
Biosafety P.6.1 | 1 | Funding and planning weaknesses | Use of Personal Protective Equipment and training inadequate; 17 health workers infected | H H H
National Lab System D.1.1 | 3 | Capable laboratories but need for standardisation | International supply of reagents and training was essential. EOC actions key to make this happen | H M H
Surveillance D.2.3 | 3 | Weak capacity in many states | Needed RRT staff from national level to take over from states | M H H
Reporting D.3.2 | 2 | Officers in each state | Variable response from states. Needed RRT teams from central level to make this happen | H L H
Workforce D.4.1 | 3 | Strong Field Epi Training Programme (FETP) | FETP trainees essential to response | H H H
Preparedness R.1.2 | 1 | Logistics system weak; risk mapping needed | Lack of preparedness or awareness of rising risk of outbreak | H H H
Emergency Operations Activation R.2.1 | 2 | SOPs not fully developed; state-level EOCs missing | EOC key for this response; developing while doing | H M H
Emergency Operations Function R.2.3 | 3 | Experience coordinating responses; procedures not standardised | EOC key for this response; developing while doing | M H H
Emergency Operations Case Management R.2.4 | 2 | Some case management guidelines available | Not useful until adapted mid-epidemic | H M M
Medical Countermeasures R.4.1 | 1 | Need for stockpile and logistics | Effective supply system created on the fly | H L M
Risk Communication R.5.4 | 3 | Coordination from Federal to States weak | No info (anecdotally, seemed to be weak) | M N/A N/A
Points of Entry PoE.2 | 1 | Contingency plans needed | Proactive response initiated on the fly | N/A L M

EOC, Emergency Operations Centre; IPC, Infection Prevention and Control; JEE, joint external evaluation; RRT, Rapid Response Teams; SOP, Standard Operating Procedures.

Table 4

Comparison of JEE and outbreak review in Madagascar

JEE domain/indicator | JEE score | Major JEE recommendation | Outbreak response capacity | Degree of correspondence
IHR Coordination P.2.1 | 2 | Intersectoral committee exists but plan of work and response plan needed | A high-level inter-ministerial coordination forum had to be established by the Prime Minister’s office to lead the response, as the Inter-sectoral Support Group for Plague Control (GIALP) was not operational. | H M H
Zoonotic Diseases P.4.3 | 2 | Need to elaborate and fund plan | Weak plan for seasonal vector control | N/A H H
Biosafety P.6.1 | 2 | Need intersectoral coordination and funding | Inadequate and insufficient IPC supplies | N/A H H
National Lab System D.1.1 | 4 | International accords existing; 13 national labs in place | Rapid diagnostic kits inadequate and insufficient; delays in PCR confirmation. Laboratory testing was led in-country by Institut Pasteur. | H L H
Surveillance D.2.3 | 3 | Need for training and experience | Surveillance could have been improved to better reflect burden of disease; case definitions weak; weak detection capacity at the community level | M H M
Reporting D.3.2 | 2 | Not in place in all parts of the country | No clear reporting channels led to delay in response | H H H
Workforce D.4.1 | 2 | Inadequate outside of national level | Seemed to have adequate personnel for contact tracing but needed to train community health workers on plague surveillance and control | M H H
Preparedness R.1.2 | 1 | Need to analyse and map | A national contingency plan existed but was not implemented or shared with all regions. No coordination of preparedness activities | H H H
Emergency Operations Activation R.2.1 | 2 | Able to establish RRTs | Insufficient trained and equipped multisectoral teams at the regional level | H M M
Emergency Operations Function R.2.3 | 2 | Experienced in plague response in prior years | Internal Incident Management System coordination was lacking | H M M
Emergency Operations Case Management R.2.4 | 2 | Existing guidelines | Revised treatment protocol developed, but not implemented; limited experience managing pneumonic plague cases | H M H
Medical Countermeasures R.4.1 | 1 | Need to establish procedures, stock control and logistics | Weak supply management; the logistics function not included in the national contingency plan | H H H
Risk Communication R.5.4 | 2 | Personnel exist but lack of feedback and local operationalisation | Mobilised 9000 people to assist with risk communication and community engagement, but there was still miscommunication regarding dignified safe burial and stigma of contacts | H M H
Points of Entry PoE.2 | 1 | Lack of personnel and plans | Lack of Standard Operating Procedures and trained personnel at points of entry; implemented by international partners during outbreak | H H H

JEE, joint external evaluation; RRT, Rapid Response Teams.


1.  Why is the Acute Watery Diarrhea in Ethiopia Attaining Extended Course?

Authors:  Abraham Haileamlak
Journal:  Ethiop J Health Sci       Date:  2016-09

2.  Deadly Lassa-fever outbreak tests Nigeria's revamped health agency.

Authors:  Amy Maxmen
Journal:  Nature       Date:  2018-03-22

3.  Nigeria hit by unprecedented Lassa fever outbreak.

Authors:  Leslie Roberts
Journal:  Science       Date:  2018-03-16

4.  Joint External Evaluation-Development and Scale-Up of Global Multisectoral Health Capacity Evaluation Process.

Authors:  Elizabeth Bell; Jordan W Tappero; Kashef Ijaz; Maureen Bartee; Jose Fernandez; Hannah Burris; Karen Sliter; Simo Nikkari; Stella Chungong; Guenael Rodier; Hamid Jafari
Journal:  Emerg Infect Dis       Date:  2017-12

