
Investigating the quality of HIV rapid testing practices in public antenatal health care facilities, South Africa.

Duduzile F Nsibande1,2, Selamawit A Woldesenbet3,4, Adrian Puren3, Peter Barron4, Vincent I Maduna5, Carl Lombard6,7, Mireille Cheyip8, Mary Mogashoa8, Yogan Pillay9, Vuyolwethu Magasana1,2, Trisha Ramraj1,2, Tendesayi Kufa3,4, Gurpreet Kindra8, Ameena Goga1,2,10, Witness Chirinda1.   

Abstract

Monitoring HIV prevalence through antenatal HIV sentinel surveillance is important for efficient epidemic tracking, programme planning and resource allocation. HIV sentinel surveillance usually employs unlinked anonymous HIV testing, which raises ethical, epidemiological and public health challenges in the current era of universal test and treat. The World Health Organization (WHO) recommends that countries consider using routine prevention of mother-to-child transmission of HIV (PMTCT) data for surveillance. We audited antenatal care clinics to assess the quality of HIV rapid testing practices as a first step in determining whether South Africa is ready to use PMTCT programme data for antenatal HIV surveillance. In 2017, we conducted a cross-sectional survey in 360 randomly sampled antenatal care clinics using an adapted WHO Stepwise-Process-for-Improving-the-Quality-of-HIV-Rapid-Testing (SPI-RT) checklist. We calculated median percentage scores within each domain (domain-specific median scores) and across all domains (overall median percentage scores). The latter were used to classify sites into five implementation levels (from 0: <40% to 4: 90% or higher). Of the 360 sampled facilities, 346 (96.1%) were assessed, yielding an overall median percentage score of 62.1% (inter-quartile range (IQR): 50.8-71.9%). The lowest domain-specific median percentage scores were obtained for training/certification (35.0%, IQR: 10.0-50.0%) and external quality assurance (12.5%, IQR: 0.0-50.0%). The majority (89%) of sites had an overall median score at level 2 or below; of these, 37% required improvement in specific areas and 6.4% in all areas. Facilities in districts implementing the HIV Rapid Test Quality Improvement Initiative with support from the President's Emergency Plan for AIDS Relief (PEPFAR) had significantly higher median overall scores (65.6%, IQR: 53.9-74.2%) than non-PEPFAR-supported facilities (56.6%, IQR: 47.7-66.0%) (rank sum test P <0.001).
We found sub-optimal implementation of HIV rapid testing practices. We recommend the expansion of the PEPFAR-funded Rapid Test Continuous Quality Improvement (RTCQI) support to all antenatal care testing sites.

Year:  2022        PMID: 36037237      PMCID: PMC9423613          DOI: 10.1371/journal.pone.0268687

Source DB:  PubMed          Journal:  PLoS One        ISSN: 1932-6203            Impact factor:   3.752


Introduction

HIV/AIDS remains a serious public health problem, especially in sub-Saharan Africa (SSA), which carries 53% of the world’s people living with HIV [1]. Periodic monitoring of HIV prevalence is critical for estimating the overall burden of disease, efficient programme planning and resource allocation. For over three decades, countries have used traditional unlinked anonymous testing (UAT)-based antenatal survey (ANSUR) methods to monitor HIV prevalence. These methods, however, raise ethical, epidemiological and public health challenges in low-income countries [2], especially in the current era of the universal test and treat (UTT) strategy, because clients cannot be linked to their test results for continuity of care. The World Health Organization (WHO) recommends that countries with near-universal prevention of mother-to-child transmission of HIV (PMTCT) coverage migrate from traditional UAT-based ANSUR to routine PMTCT programme-based surveillance methods to monitor trends in HIV prevalence [3]. This transition will facilitate ethically and methodologically sound surveillance, as all HIV sero-status data will come from routine HIV testing in which pregnant women receive their HIV test result and are linked to HIV treatment or prevention services [2]. It will also reduce the workload and financial cost associated with ANSUR, improving the sustainability of surveillance and strengthening routine data systems [4]. For routine HIV testing data to be used for surveillance, they must be of good quality and the uptake of routine HIV testing must be above 95% [3]. Inaccuracies and incompleteness of PMTCT facility-based data [5, 6], including misdiagnosis of HIV status, have been widely reported [7-10], especially in resource-limited settings.
Despite advances in HIV testing technologies, algorithms and policies, routine HIV testing carries the potential for incorrect diagnosis owing to technical and human factors that interact at the health system, provider, patient and test device levels [11, 12], particularly in resource-constrained settings [8, 13]. Countries that have explored the utility of routine PMTCT data for surveillance have found mixed results [5, 14–18]. Some countries delayed using PMTCT programme-based data for surveillance despite reporting high overall positive percentage agreements (PPAs) in prevalence estimates between ANSUR and routine PMTCT data [5, 17, 18]. This was due to in-country site-level differences in HIV prevalence [5, 18], low uptake of PMTCT HIV testing and limited PMTCT data quality [5, 16, 19]. Prior to 2017, South Africa (SA) used UAT-based ANSUR to monitor HIV prevalence in pregnancy and to assist in the overall modelling of HIV prevalence [20]. Evidence suggests that in SA there is substantial sub-population variation in the uptake of HIV testing services (HTS) [21], and the 2017 SA ANSUR findings show that over a third (39.2%) of pregnant women attending antenatal care (ANC) for the first time were not aware of their HIV-positive status [22]. This underscores the need to sustain universal uptake and good-quality antenatal HIV testing. While challenges in the quality of HTS have been recorded in SA [23, 24], health facility service user satisfaction has been reported to be high (89.8%) [25]. To ensure the fidelity of routine HIV rapid testing, WHO recommends periodic site audits, external quality assurance (EQA) and re-testing of clients to verify HIV status before antiretroviral therapy (ART) initiation [26]. As the country with the largest HIV epidemic in the world [27], SA adopted the Joint United Nations Programme on HIV/AIDS (UNAIDS) 90-90-90 testing and treatment targets with the objective of ending HIV/AIDS as a public health threat by 2030 [28, 29].
SA’s national HIV testing guidelines are modelled on the WHO consolidated guidelines for HIV rapid testing and inform testing practices, including quality assurance (QA) [30, 31]. In addition, the South African National Department of Health (SA NDoH) has a quality management system that addresses all aspects of testing in the country [30], and since 2016 the National Institute for Communicable Diseases (NICD), the SA National Reference Laboratory for HIV testing, has provided technical support for HTS in 27 PEPFAR-supported, high HIV burden districts under the HIV Rapid Test Quality Improvement Initiative (RTQII). The RTQII package involved: i) policy engagement, ii) human resources, iii) proficiency testing programmes, iv) standardized registers and v) post-market surveillance. In SA, there is no national accreditation certification programme for HTS. The impact of this PEPFAR-funded support was evaluated in HIV testing sites using two consecutive assessments [32]; ANC HIV testing sites were not included in the initiative. After the RTQII showed good results in many PEPFAR-supported districts, it was renamed the Rapid Test Continuous Quality Improvement (RTCQI) initiative and is now run by the SA NDoH. For simplicity, we refer to the initiative by its new name, RTCQI. We set out to investigate HIV rapid testing practices within the PMTCT programme as a first step in assessing whether PMTCT programme data can be used for surveillance. Our survey was limited to testing sites within ANC clinics, which were not part of the PEPFAR-funded RTCQI support. Following our survey, Woldesenbet et al. [33] conducted a secondary analysis of routine HIV test results captured as part of ANSUR 2017 to assess the feasibility of using routine HIV data for surveillance; their work did not assess HIV rapid testing practices.

Methods

This paper reports on the third objective (activity) of a broader study. The objectives of the study were (i) to assess the validity of routine versus survey data (planned to piggy-back onto the annual South African national antenatal survey), (ii) to review the quality and completeness of routine data abstracted from facility-based registers, and (iii) to evaluate whether facilities adhered to the standardized protocols for rapid HIV testing (quality assurance of rapid HIV test results).

Sampling strategy

Sampling was based on the second objective of the study (to ensure that adequate numbers of records were abstracted from facility-based ANC records). The sampling frame was developed from the list of facilities in the 2014/15 SA NDoH District Health Information System dataset [34] and comprised all public primary health care (PHC) facilities providing ANC (both ANSUR sentinel sites and non-antenatal survey sentinel sites (non-ANSUR)) from all nine provinces, classified by locality type (rural and urban). Initially, 36 strata were defined from the sampling frame. Each stratum needed to contribute at least 1% of the clinic data abstracted from facility-based records to provide enough data, so smaller strata were merged, resulting in 26 strata in the final sampling frame; these defined the main inclusion criteria. Mobile and referral clinics were excluded. Facilities were randomly selected without replacement within each stratum. For the HIV rapid testing QA, the minimum feasible sample size, determined at a 95% confidence level with a 5% margin of error, was 360 facilities. The sample size and sampling approach were designed to compare proportions of outcomes between rural and urban localities, between ANSUR and non-ANSUR sites, and between provinces, rather than for external validity at the sub-regional level.
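The stratified selection described above can be sketched as follows (a minimal illustration, not the study's actual code; the stratum names, facility IDs and quotas are hypothetical, and the study used 26 merged strata):

```python
import random

def stratified_sample(frame, quota, seed=2017):
    """Randomly select facilities without replacement within each stratum.

    `frame` maps stratum name -> list of facility IDs (the sampling frame);
    `quota` maps stratum name -> number of facilities to draw.
    """
    rng = random.Random(seed)  # fixed seed for a reproducible draw
    return {
        stratum: rng.sample(facilities, min(quota[stratum], len(facilities)))
        for stratum, facilities in frame.items()
    }

# Illustrative frame: two hypothetical strata (urban/rural within one province)
frame = {"GP-urban": [f"fac{i}" for i in range(50)],
         "GP-rural": [f"fac{i}" for i in range(50, 80)]}
sample = stratified_sample(frame, {"GP-urban": 10, "GP-rural": 5})
```

`random.sample` draws without replacement, which mirrors the selection rule stated above; sampling each stratum independently preserves the stratification.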

Data collection and instruments

Between February and May 2017, trained field workers conducted an audit of selected ANC sites offering routine HTS. We used a paper-based SPI-RT checklist for data collection, adapted to the local context from the WHO SPI-RT checklist Version 3.0. Data collection involved i) interviewing and observing one HIV tester (selected by the facility manager) in every testing site performing one simulated HIV rapid testing procedure and ii) auditing the site to assess compliance with the SA NDoH HTS Guideline [30]. The SPI-RT checklist had seven domains totalling 64 points, with individual domain maximums ranging from 5 to 12 points. The domains (maximum scores in brackets) were: (i) training/certification (10), (ii) physical facility (5), (iii) safety (11), (iv) pre-testing phase (12), (v) testing phase (9), (vi) post-testing phase (9), and (vii) EQA (8) (Table 1). Each domain had quality indicator sub-elements rated 1, 0.5 or 0 where sites were ‘compliant’, ‘partially compliant’ or ‘non-compliant’, respectively [35]. The SPI-RT checklist contains a built-in analysis function: the points were totalled and divided by the total number of possible points to produce an overall percentage score per site. Sites were assigned to one of five levels of meeting the QA requirements, from level 0 to level 4, as follows:
Table 1

Median scores and percentages by domain: Quality of HIV rapid testing practices, South Africa.

Domain | Median score (IQR)* | Median score (IQR) as percentage of highest possible score*
Personnel training/certification | 3.5 (1.0–5.0) | 35.0% (10.0–50.0%)
Physical facility | 4.5 (4.0–5.0) | 90.0% (80.0–100.0%)
Safety | 8.5 (7.0–10.0) | 77.3% (63.6–90.9%)
Pre-testing phase | 10.0 (9.0–11.0) | 83.3% (75.0–91.7%)
Testing phase | 5.5 (3.0–7.0) | 61.1% (33.3–77.8%)
Post-testing phase | 7.0 (5.5–8.0) | 77.8% (61.1–88.9%)
EQA | 1.0 (0.0–4.0) | 12.5% (0.0–50.0%)
Overall score | 39.8 (32.5–46.0) | 62.1% (50.8–71.9%)

* Note: the highest possible scores for each domain are: personnel training/certification = 10; physical facility = 5; safety = 11; pre-testing phase = 12; testing phase = 9; post-testing phase = 9; and EQA = 8. IQR = interquartile range.

Level 0 site: a score of less than 40%; needs improvement in all areas and immediate remediation.
Level 1 site: a score between 40–59%; needs improvement in specific areas.
Level 2 site: a score between 60–79%; partially ready for national site certification.
Level 3 site: a score between 80–89%; close to national site certification.
Level 4 site: a score of 90% or higher; eligible for national site certification.
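The checklist arithmetic and level assignment described above can be sketched as follows (a minimal illustration; the dictionary keys are our shorthand for the seven domains, not the checklist's exact labels):

```python
# Maximum points per SPI-RT domain, as described in the Methods (totals 64).
DOMAIN_MAX = {
    "training/certification": 10, "physical facility": 5, "safety": 11,
    "pre-testing": 12, "testing": 9, "post-testing": 9, "EQA": 8,
}

def overall_percentage(domain_scores):
    """Sum a site's domain scores and express them as % of the 64 possible points."""
    return 100.0 * sum(domain_scores.values()) / sum(DOMAIN_MAX.values())

def implementation_level(pct):
    """Map an overall percentage score to the 0-4 implementation levels above."""
    if pct >= 90: return 4      # eligible for national site certification
    if pct >= 80: return 3      # close to certification
    if pct >= 60: return 2      # partially ready
    if pct >= 40: return 1      # needs improvement in specific areas
    return 0                    # needs improvement in all areas, immediate remediation
```

For example, a site scoring the domain medians from Table 1 (summing to 40.0 of 64 points) has an overall percentage of 62.5% and falls at level 2.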

Data analysis

Data were captured using Open Data Kit software (University of Washington, Seattle, WA, USA; https://opendatakit.org/) and exported to Excel (Microsoft Corporation, USA). STATA/SE 14.0 (STATA Corporation, College Station, Texas, USA) was used for analyses. We used descriptive statistics to calculate frequencies, medians and interquartile ranges (IQRs), including median percentage scores within each domain (domain-specific median scores) and across all domains (overall median scores); rank sum tests to compare differences; and logistic regression with 95% confidence intervals (CIs) to model associations and assess statistically significant differences between site types. The overall median scores were used to classify testing sites according to the five implementation levels for national site certification (see section above).
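The summary statistics and comparison test can be illustrated with standard-library Python (a sketch, not the study's Stata code; the normal approximation below ignores tie corrections and assumes reasonably large groups):

```python
import statistics
from math import sqrt, erf

def median_iqr(scores):
    """Median and interquartile range (Q1, Q3) for a list of site scores."""
    q1, med, q3 = statistics.quantiles(scores, n=4)
    return med, (q1, q3)

def rank_sum_p(a, b):
    """Two-sided Wilcoxon rank sum test via the normal approximation,
    a stand-in for Stata's ranksum command."""
    pooled = sorted(a + b)
    rank = {}
    i = 0
    while i < len(pooled):           # assign average ranks to tied values
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        rank[pooled[i]] = (i + 1 + j) / 2
        i = j
    n1, n2 = len(a), len(b)
    w = sum(rank[x] for x in a)      # rank sum of the first group
    mu = n1 * (n1 + n2 + 1) / 2      # mean of W under the null hypothesis
    sigma = sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
    z = (w - mu) / sigma
    phi = 0.5 * (1 + erf(abs(z) / sqrt(2)))   # standard normal CDF
    return 2 * (1 - phi)
```

Here `statistics.quantiles(..., n=4)` returns the three quartile cut points using the default exclusive method; two clearly separated groups of percentage scores yield a small p value, matching the role of the rank sum tests in Tables 2 and 3.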

Ethical considerations

The study protocol was approved by the South African Medical Research Council (SAMRC) Ethics Committee, (EC029-9/2015). “The protocol was also reviewed in accordance with the Centers for Disease Control and Prevention (CDC) human research protection procedures and was determined to be research, but CDC investigators did not interact with human subjects or have access to identifiable data or specimens for research purposes”. All participants (facility manager and one designated HIV tester per site) provided written informed consent before participation.

Results

Facilities visited by province and sample size realization

We conducted QA in 346 of the 360 selected facilities (96.1%) across nine provinces (S1 Table). Twelve facilities were excluded because they were referral or mobile facilities, and two could not be assessed owing to logistical challenges. Sample size realization per province ranged between 87.9% and 100%. The majority (71.1%) of facilities assessed were in PEPFAR-supported districts, and 63.9% were not antenatal sentinel sites (non-ANSUR).

Median score and median percentage score by domain

Facilities obtained a median overall score of 39.8 (IQR 32.5–46.0; out of a total of 64), corresponding to a median overall percentage score of 62.1% (IQR 50.8–71.9%). The lowest median percentage scores were obtained in the training/certification (35.0%, IQR 10.0–50.0%) and EQA (12.5%, IQR 0.0–50.0%) domains (Table 1). There were statistically significant inter-provincial differences in overall median percentage scores, ranging from 43.4% (IQR 37.5–48.0%) in Northern Cape (NC) and 46.9% (IQR 43.0–59.4%) in Free State (FS) to 70.3% (IQR 63.3–78.1%) in Mpumalanga (MP) and 71.1% (IQR 67.2–78.1%) in Limpopo (LP) (P value <0.001) (Table 2).
Table 2

Distribution of median overall score and percentage by province: Quality of HIV rapid testing practices, South Africa.

Province | Planned sample size, number (%)* | Number (%) assessed** | Median overall score (IQR)*** | Median overall score (IQR) as percentage of highest possible score***
Eastern Cape (EC) | 43 (11.9) | 43 (100.0) | 37.0 (31.0–45.0) | 57.8% (48.4–70.3%)
Free State (FS) | 17 (4.7) | 17 (100.0) | 30.0 (27.5–38.0) | 46.9% (43.0–59.4%)
Gauteng (GP) | 86 (23.9) | 80 (93.0) | 42.5 (36.3–47.3) | 66.4% (56.6–73.8%)
KwaZulu-Natal (KZN) | 78 (21.7) | 77 (98.7) | 38.0 (33.0–42.5) | 59.4% (51.6–66.4%)
Limpopo (LP) | 45 (12.5) | 43 (95.6) | 45.5 (43.0–50.0) | 71.1% (67.2–78.1%)
Mpumalanga (MP) | 30 (8.3) | 29 (96.7) | 45.0 (40.5–50.0) | 70.3% (63.3–78.1%)
North West (NW) | 20 (5.6) | 20 (100.0) | 39.3 (34.8–52.3) | 61.3% (54.3–81.6%)
Northern Cape (NC) | 8 (2.2) | 8 (100.0) | 27.8 (24.0–30.75) | 43.4% (37.5–48.0%)
Western Cape (WC) | 33 (9.2) | 29 (87.9) | 33.0 (29.5–37.5) | 51.6% (46.1–58.6%)
Total | 360 (100.0) | 346 (96.1) | 39.8 (32.5–46.0) | 62.1% (50.8–71.9%)

*As proportion of total number of facilities per province;

** As proportion of planned sample size;

***includes all seven domains; IQR = Interquartile Range

† test for equality of median percentage scores by province: p value <0.001.


Overall scores by geographical type, PEPFAR support, and participation in 2015 antenatal survey

Facilities in PEPFAR-supported priority districts had significantly higher median overall percentage scores (65.6%, IQR 53.9–74.2%) than non-PEPFAR-supported facilities (56.6%, IQR 47.7–66.0%) (Table 3).
Table 3

Score by locality, site type and PEPFAR support: Quality of HIV rapid testing practices, South Africa.

  | Number (%) of facilities visited | Median score (IQR)* | Median overall score (IQR) as percentage of highest possible score* | P value**
Locality
Urban | 163 (47.1) | 39.5 (32.5–45.0) | 61.7% (50.8–70.3%) | 0.3
Rural | 183 (52.9) | 40.0 (33.0–47.0) | 62.5% (51.6–73.4%) |
Site type
ANSUR facilities╫ | 125 (36.1) | 39.5 (32.0–46.0) | 61.7% (50.0–71.1%) | 0.5
Non-ANSUR facilities | 221 (63.9) | 40.5 (33.0–46.5) | 63.3% (51.6–72.7%) |
PEPFAR support
Facilities in PEPFAR-supported districts | 246 (71.1) | 42.0 (35.0–47.5) | 65.6% (53.9–74.2%) | <0.001
Non-PEPFAR facilities | 100 (28.9) | 36.3 (30.5–42.3) | 56.6% (47.7–66.0%) |
Overall | 346 (100.0) | 39.8 (32.5–46.0) | 62.1% (50.8–71.9%) |

*includes all seven domains

** Wilcoxon rank sum tests; IQR = Interquartile Range

╫ ANSUR indicates that the facility participated in the 2015 antenatal survey


Implementation levels by province

Overall, most facilities (98.8%) did not meet the standard required for national site certification in HIV rapid testing practices. Eighty-nine percent of sites were at level 2 or below, with 37% (128/346) requiring improvement in specific areas (level 1) and 6.4% (22/346) requiring improvement in all areas with immediate remediation (level 0). In three provinces (FS, NC and WC), no facilities were at implementation levels 3 or 4, and only 2.6% of facilities in KwaZulu-Natal (KZN) were at levels 3 or 4 (Table 4).
Table 4

Distribution of implementation levels by province: Quality of HIV rapid testing practices, South Africa.

Province | Number of facilities | Level 0 (<40%), n (%) | Level 1 (40–59%), n (%) | Level 2 (60–79%), n (%) | Level 3 (80–89%), n (%) | Level 4 (90% or higher), n (%)
EC | 43 | 3 (7.0%) | 19 (44.2%) | 15 (34.9%) | 5 (11.6%) | 1 (2.3%)
FS | 17 | 3 (17.7%) | 11 (64.7%) | 3 (17.7%) | 0 (0%) | 0 (0%)
GP | 80 | 3 (3.8%) | 23 (28.8%) | 43 (53.8%) | 9 (11.3%) | 2 (2.5%)
KZN | 77 | 5 (6.5%) | 34 (44.2%) | 36 (46.8%) | 1 (1.3%) | 1 (1.3%)
LP | 43 | 0 (0%) | 2 (4.7%) | 33 (76.7%) | 8 (18.6%) | 0 (0%)
MP | 29 | 0 (0%) | 7 (24.1%) | 16 (55.2%) | 6 (20.7%) | 0 (0%)
NW | 20 | 1 (5.0%) | 8 (40.0%) | 6 (30.0%) | 5 (25.0%) | 0 (0%)
NC | 8 | 3 (37.5%) | 5 (62.5%) | 0 (0%) | 0 (0%) | 0 (0%)
WC | 29 | 4 (13.8%) | 19 (65.5%) | 6 (20.7%) | 0 (0%) | 0 (0%)
All | 346 | 22 (6.4%) | 128 (37.0%) | 158 (45.7%) | 34 (9.8%) | 4 (1.2%)

Level 0 site [RED]: a score of less than 40%; needs improvement in all areas and immediate remediation.
Level 1 site [ORANGE]: a score between 40–59%; needs improvement in specific areas.
Level 2 site [YELLOW]: a score between 60–79%; partially ready for national site certification.
Level 3 site [LIGHT GREEN]: a score between 80–89%; close to national site certification.
Level 4 site [DARK GREEN]: a score of 90% or higher; eligible for national site certification.


Implementation levels by locality, site type and PEPFAR support

In univariate logistic regression, facilities in PEPFAR-supported priority districts had 5.4 times higher odds (95% CI: 1.6–17.8) of being at levels 3 or 4 than non-PEPFAR-supported facilities (Table 5).
Table 5

Stratified analysis of implementation levels by locality, site type and PEPFAR support: Quality of HIV rapid testing practices, South Africa.

Characteristic | Number of facilities | Level 0, n (%) | Level 1, n (%) | Level 2, n (%) | Level 3, n (%) | Level 4, n (%) | Odds ratio (95% CI)*
Locality
Urban | 163 | 13 (8.0%) | 58 (35.6%) | 75 (46.0%) | 15 (9.2%) | 2 (1.2%) | 1.1 (0.6–2.2)
Rural | 183 | 9 (4.9%) | 70 (38.3%) | 83 (45.4%) | 19 (10.4%) | 2 (1.1%) |
Site type
ANSUR | 125 | 7 (5.6%) | 50 (40.0%) | 58 (46.4%) | 8 (6.4%) | 2 (1.6%) | 0.6 (0.3–1.3)
Non-ANSUR | 221 | 15 (6.8%) | 78 (35.3%) | 100 (45.3%) | 26 (11.8%) | 2 (0.9%) |
Support
PEPFAR support | 246 | 14 (5.7%) | 75 (30.5%) | 122 (49.6%) | 32 (13.0%) | 3 (1.2%) | 5.4 (1.6–17.8)
Non-PEPFAR support | 100 | 8 (8.0%) | 53 (53.0%) | 36 (36.0%) | 2 (2.0%) | 1 (1.0%) |
All | 346 | 22 (6.4%) | 128 (37.0%) | 158 (45.7%) | 34 (9.8%) | 4 (1.2%) |

*univariate logistic regression assessed the odds of being at levels 3 and above (vs being at levels < 3) for facilities in PEPFAR districts, ANSUR facilities and rural facilities. CI- Confidence Interval

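With a single binary exposure, the univariate logistic regression reduces to the 2×2-table odds ratio, so the Table 5 counts can reproduce the reported PEPFAR estimate. A sketch using the Woolf confidence interval (which, for a saturated 2×2 model, coincides with the logistic-regression Wald standard error):

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with Woolf 95% CI from a 2x2 table:
    a/b = exposed with/without the outcome, c/d = unexposed with/without."""
    or_ = (a * d) / (b * c)
    se = sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo, hi = exp(log(or_) - z * se), exp(log(or_) + z * se)
    return or_, (lo, hi)

# Table 5: PEPFAR-supported sites at levels 3-4: 35 of 246; non-PEPFAR: 3 of 100
or_, (lo, hi) = odds_ratio_ci(35, 246 - 35, 3, 100 - 3)
# approx. 5.4 (95% CI approx. 1.6-17.9), matching the reported 5.4 (1.6-17.8) to rounding
```
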

Commonly identified gaps

Table 6 lists performance gaps for each domain where testing sites were rated 0, with the percentage of facilities that were fully non-compliant for each gap. Notable gaps included 87% of sites having no documentation of testers’ competency prior to HIV testing, quality control (QC) specimens not used routinely, and a lack of corrective action for unsatisfactory proficiency testing (PT) results (Table 6).
Table 6

Commonly identified deficiencies: Quality of HIV rapid testing practices, South Africa.

Domain | Common gaps (% of facilities)

Personnel training/certification
- Testers not trained on EQA/PT and QC (56.1%)
- No evidence of refresher training/no documentation (77.2%)
- No documents indicating testers’ competency prior to HIV testing (87.0%)
- No national certification programme in place (84.4%)

Physical facility
- Room temperature poorly monitored/test kits exposed to direct sunlight/no thermometers (19.9%)
- No designated area for testing/room used for multiple purposes (7.8%)

Safety
- Standard operating procedures (SOPs)/job aids not available in the testing room (35.0%)
- Incorrect use of gloves/absence of protective aprons (8.1%)

Pre-testing phase
- Job aids on client sample collection not available/posted (26.9%)
- Test kits not initialled, not dated (62.4%)

Testing phase
- Timers not in good working order/not available (39.9%)
- Procedure followed inaccurately (14.2%)
- Incorrect amount of buffer used (8.7%)
- QC specimens not used routinely/no samples available (43.5%)
- Unsure how to handle invalid QC results/not recorded (70.2%)

Post-testing phase
- Incomplete recording of QC key elements (lot number, expiry date) in HIV testing logbooks/registers (8.4%)
- Inaccurate/inconsistent capturing of total summaries in logbooks (19.9%)
- Invalid test results not recorded in register/logbook (55.8%)
- Registers not properly labelled and archived when full (24.3%)

EQA (PT, supervision and retesting)
- Testing point not enrolled in EQA/PT (62.7%)
- QC results not reviewed by the person in charge (69.9%)
- Corrective action for unsatisfactory results not implemented (86.1%)
- QC samples for PT not available/PT not done (46.0%)
- Periodic supervisory visits not done/not documented (72.3%)
- No feedback/re-training (80.4%)

Discussion

We sought to investigate the quality of HIV rapid testing practices in selected antenatal public health facilities across SA using an adapted SPI-RT checklist classification for national site certification. Our main finding is that in 2017 almost all (98.8%) ANC testing sites did not adequately meet HIV rapid testing standards for national site certification. Compared with the results of the 2015/17 RTCQI consecutive national site assessments at SA HTS sites, ANC clinics were at lower implementation levels. In the RTCQI programme, conducted in HTS sites at around the same time as this study, 38.8% of HTS sites were eligible for national certification, compared with 11% of antenatal sites in our study. This difference could be attributed to the implementation of PEPFAR-funded RTCQI support in HTS sites, which did not include ANC clinics. According to the WHO, a score of at least 80% in each of the three phases critical for the accuracy of diagnosis (pre-testing, testing and post-testing) may be acceptable [3]. Our survey results demonstrate that the overall median percentage scores for both the testing and post-testing phases were below 80%. This finding has important implications given the increasing number of clients requiring HTS [36, 37] and the high PMTCT uptake of 99.8% in SA, because high-quality performance is required in all three phases. It is encouraging that the overall median score for the pre-testing phase in our study (83.3%) was similar to those reported during the RTCQI programme assessments [32]. The RTCQI assessments and our survey differed in the testing-phase median scores, with our study scoring 16.7 percentage points below the PEPFAR-funded RTCQI programme [32].
Our study demonstrates that sites scored poorly in the training/certification and EQA/PT domains. Similarly, in the PEPFAR-funded RTCQI programme [32], the median percentage scores for training and certification in HTS site assessments remained the lowest in both consecutive assessments (30.0% (IQR 20.0–40.0%) and 50.0% (IQR 40.0–60.0%) in the first and second assessments, respectively). This lack of improvement could be attributed to the absence of a national certification programme and the fact that training may not always be properly cascaded down to staff in the respective clinics [32]. Stepping up staff competencies is critically important given clients’ rights and the evidence that pregnant women value screening procedures conducted by knowledgeable, supportive and respectful health care providers, as this allows for a positive pregnancy experience [31, 38, 39]. We identified two common domain-specific gaps where a high proportion of testing sites were rated fully non-compliant: the non-existence of a nationally accredited certification programme and non-adherence to the EQA/PT programme. Although there is a system for assessing user competency in SA, there is no nationally accredited certification programme, and the existing system does not work well. The HTS policy recommends quarterly EQA/PT supervisory support visits to testing sites [30]. It is apparent that current HIV rapid testing training and supervisory mechanisms are weak and need to be strengthened and monitored. Additionally, we found that some QC assay procedures were not adhered to, including the incorrect use of buffer drops, exposure of test kits to sunlight and incorrect timing when reading results. Engel et al. [40] found that frequent changes of HIV test kit brands were an important barrier contributing to the use of incorrect amounts of buffer and shortened time to reading results [41]. We found non-compliance with HIV testing standards and poor documentation, which suggest low staff competency levels. These findings are consistent with previous studies reporting a high prevalence of false-positive and false-negative HIV rapid testing results [8, 9, 21, 41] and poor data quality [6, 42–45]. In 2019, Woldesenbet et al. [33], using data from the 2017 ANSUR, reported a high PPA (97.6%) between third-generation point-of-care (POC) rapid testing and a laboratory-based fourth-generation immunoassay. Despite the sub-optimal testing practices we report here, it is encouraging that the risk of a false HIV-positive diagnosis was minimal in that study [33]. However, since that study was based on a cross-sectional survey, it will be important to monitor test agreement regularly (for example through PT), and caution is needed when using PMTCT data to monitor antenatal HIV prevalence [33]. Our further analyses show that ANC testing sites in PEPFAR-supported districts were more likely to be at levels 3 and 4 than those in non-PEPFAR-supported districts. This difference could be due to the additional technical and resource support provided by PEPFAR to these facilities [32, 46]. A spill-over effect from HTS testers who receive PEPFAR-funded RTCQI support could also have occurred within the same facility, as it is common for HIV testers to rotate between HTS and ANC services within a facility to address human resource shortages [47].
Between 2015 and 2016, an audit of rural antenatal PHC clinics in KZN reported average rating scores of 64.4% (CI: 44–84%) and 89.2% (CI: 74–100%), respectively, for compliance with the WHO guidelines for assuring the accuracy and reliability of HIV rapid tests [7]. The PEPFAR-funded RTCQI programme also reported variations in implementation levels across provinces [32]. Our study had some reassuring findings. Privacy is an important determinant of HIV testing acceptance, and facilities scored highest (90%) in the physical facility domain. In most sites, national HIV testing guidelines were available and good stock management systems were in place. Consistent with this finding, evaluations by Jones et al. [48] in 2019 of the implementation of PMTCT policies (2013–2016) found that the highest proportion (around 60%) of facilities in rural SA reported no stock-outs of HIV test kits in the year before the survey [48]. However, contrary findings were reported by two studies in SA: a qualitative study [41] conducted in Durban, Cape Town and EC (2012–2013) and a survey [24] in rural PHC facilities in KZN (2015–2016). These studies identified HIV test kit stock-outs as a major barrier resulting in poor compliance with testing guidelines, with some testers having to adapt HIV rapid testing algorithms [24, 41] or refer clients to nearby facilities [41]. Focused continuous quality improvement (CQI) interventions, including technical on-site mentoring, have been shown to remarkably improve PMTCT data quality [42, 43].

Study strengths and limitations

A strength of our study is that ANC facilities were selected across a range of national scenarios, considering locality, site type and PEPFAR support. Furthermore, field workers were carefully trained and supervised, and data collection occurred outside the period (October) when the regular ANSUR survey takes place, to minimize bias. Our study had several limitations. Firstly, only one tester per site was observed, on one occasion, performing a simulated HIV test; a single simulated observation may not reliably reflect tester competency, and nuances of actual real-life testing may have been missed. Secondly, smaller-volume, referral and mobile testing sites were not included. Thirdly, the sample size was not designed to provide provincially or nationally representative findings.

Conclusions

Our findings add to the growing literature on the utility of routine PMTCT data for monitoring antenatal HIV prevalence. They suggest that QA standards within the PMTCT programme have not kept pace with the growing demand for HIV rapid testing in the era of UTT, since almost all antenatal sites did not meet the criteria for national site certification. Substantial progress has been reported in HTS sites following the implementation of PEPFAR-funded RTCQI support in SA. Despite the implementation gaps observed in our study, the SA NDoH has introduced several focused CQI strategies as steps towards preparing for routine PMTCT data-based surveillance. These include improving the 2017 ANSUR sampling strategy; the Ideal Clinic Realization and Maintenance Programme (a comprehensive approach aimed at creating an enabling environment for sustainable implementation of QA processes); rationalization of registers (aimed at reducing the number of registers used in PHC facilities); and NurseConnect (a project that uses mobile technology to send targeted support messages and expert information to nurses on maternal and child health) [49-51]. Improving HIV rapid testing practices in ANC clinics would be invaluable in speeding up South Africa's readiness to transition to routine programme data-based surveillance. We recommend expanding PEPFAR-funded RTCQI support, using the WHO SPI-RT tool, to all HIV testing sites, including ANC sites. In future, a non-simulated, nationally representative survey should be conducted to evaluate the overall quality of HIV rapid testing practices in PHC settings in South Africa.

Distribution of facilities according to province, sentinel site and locality type in South Africa.

(DOCX)

3 Mar 2022
PONE-D-22-00472
Investigating the quality of HIV rapid testing practices in public antenatal health care facilities, South Africa
PLOS ONE

Dear Dr. Nsibande,
Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE's publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.

Please submit your revised manuscript by Apr 17 2022 11:59PM. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file.

ACADEMIC EDITOR: The manuscript submitted inspects/validates the quality of HIV rapid testing practices in public antenatal health care facilities. The authors have made an attempt to set out the gaps in practice through application of a valid tool, the WHO Stepwise-Process-for-Improving-the-Quality-of-HIV-Rapid-Testing (SPI-RT) checklist. Please modify/accept a more concise introduction as suggested by reviewer 1. The sampling section needs to be elaborated further as suggested by reviewer 1. The conclusion section may further be expanded with recommendations as marked by reviewer 2, apart from the other queries raised.

Specific feedback: Gap identification and gap closure is an important tool for health system strengthening. Application of the WHO tool to address that would strengthen the systems in place.

Please include the following items when submitting your revised manuscript:
- A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'.
- A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'.
- An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'.

If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter.

If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols.

We look forward to receiving your revised manuscript.

Kind regards,
Gopal Ashish Sharma, MBBS, MD
Academic Editor
PLOS ONE

Journal Requirements: When submitting your revision, we need you to address these additional requirements.

1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. The PLOS ONE style templates can be found at https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf

2.
Thank you for stating the following financial disclosure: "The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript." At this time, please address the following queries:

a) Please clarify the sources of funding (financial or material support) for your study. List the grants or organizations that supported your study, including funding received from your institution.
b) State what role the funders took in the study. If the funders had no role in your study, please state: "The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript."
c) If any authors received a salary from any of your funders, please state which authors and which funders.
d) If you did not receive any funding for this study, please state: "The authors received no specific funding for this work."

Please include your amended statements within your cover letter; we will change the online submission form on your behalf.

3. In your Data Availability statement, you have not specified where the minimal data set underlying the results described in your manuscript can be found. PLOS defines a study's minimal data set as the underlying data used to reach the conclusions drawn in the manuscript and any additional data required to replicate the reported study findings in their entirety. All PLOS journals require that the minimal data set be made fully available. For more information about our data policy, please see http://journals.plos.org/plosone/s/data-availability. Upon re-submitting your revised manuscript, please upload your study's minimal underlying data set as either Supporting Information files or to a stable, public repository and include the relevant URLs, DOIs, or accession numbers within your revised cover letter.
For a list of acceptable repositories, please see http://journals.plos.org/plosone/s/data-availability#loc-recommended-repositories. Any potentially identifying patient information must be fully anonymized.

Important: If there are ethical or legal restrictions to sharing your data publicly, please explain these restrictions in detail. Please see our guidelines for more information on what we consider unacceptable restrictions to publicly sharing data: http://journals.plos.org/plosone/s/data-availability#loc-unacceptable-data-access-restrictions. Note that it is not acceptable for the authors to be the sole named individuals responsible for ensuring data access. We will update your Data Availability statement to reflect the information you provide in your cover letter.

4. Please include captions for your Supporting Information files at the end of your manuscript, and update any in-text citations to match accordingly. Please see our Supporting Information guidelines for more information: http://journals.plos.org/plosone/s/supporting-information.

5. Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article's retracted status in the References list and also include a citation and full reference for the retraction notice.

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions.
Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #1: Yes
Reviewer #2: Yes

**********

2. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #1: Yes
Reviewer #2: Yes

**********

3. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data (e.g. participant privacy or use of data from a third party), those must be specified.

Reviewer #1: No
Reviewer #2: Yes

**********

4. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #1: Yes
Reviewer #2: Yes

**********

5. Review Comments to the Author. Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #1: The Introduction can be reduced; it is a bit lengthy. Although the content of the introduction is relevant, it can be made more concise.
The sampling strategy may be elaborated: what was the sampling unit, how was the sample size calculated, how were the strata defined and selected, and finally how were the facilities selected? Inclusion and exclusion criteria may be included.

Reviewer #2:
1. The study was conducted in 2017. Are there any changes in the national HIV strategy since then? Are the findings relevant in 2022?
2. Elaborate RTCQI in the Abstract section as it is not a common abbreviation (Line 24).
3. Is it "Countries" or "Districts"? (Line 93)
4. The statistical difference (P value) among the provinces can be stated for better understanding. (Table 2)
5. The Conclusion section should contain detailed recommendations on the basis of the study observations. Moreover, based on the study limitations, suggestions can be made for future research studies to generate more generalizable and valid findings.

**********

6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose "no", your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No
Reviewer #2: No

[NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.]

While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free.
Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.
Submitted filename: PONE-D-22-00472 Reviewer Comments.docx

13 Apr 2022

Reviewer's Comments PONE-D-22-00472

Specific feedback:

Reviewer #1:
1. The Introduction can be reduced; it is a bit lengthy. Although the content of the introduction is relevant, it can be made more concise.
Response: The introduction has been reduced as suggested, from 1 050 to 851 words (pages 5-8).
2. The sampling strategy may be elaborated: what was the sampling unit, how was the sample size calculated, how were the strata defined and selected, and finally how were the facilities selected? Inclusion and exclusion criteria may be included.
Response: The sampling strategy has been revised under Methods, page 9 (lines 114-140).

Reviewer #2:
1. The study was conducted in 2017. Are there any changes in the national HIV strategy since then? Are the findings relevant in 2022?
Response: The findings are still relevant in 2022. South Africa is still implementing the 2017-2022 National HIV Strategic Plan, which aims at an intensified focus on districts and locations with high burdens of HIV, STIs and/or TB; on adolescent girls and young women; and on tailoring interventions for key and vulnerable populations.
2. Elaborate RTCQI in the Abstract section as it is not a common abbreviation (Line 24).
Response: The full name for RTCQI is 'Rapid Test Continuous Quality Improvement'. It has been added in the abstract on page 5 (line 24).
3. Is it "Countries" or "Districts"? (Line 93)
Response: The word "countries" has been replaced with "districts" under Introduction, on page 8 (line 96).
4. The statistical difference (P value) among the provinces can be stated for better understanding. (Table 2)
Response: A statistical significance test (p-value) has been added under Results, page 13, Table 2 (lines 205-6).
5. The Conclusion section should contain detailed recommendations on the basis of the study observations. Moreover, based on the study limitations, suggestions can be made for future research studies to generate more generalizable and valid findings.
Response: The conclusion section has been revised. Detailed recommendations based on the study limitations have been made on page 26 (lines 365-378).

Submitted filename: RESPONSE To REVIEWERS _PLOSONE_07April2022 .docx

6 May 2022

Investigating the quality of HIV rapid testing practices in public antenatal health care facilities, South Africa
PONE-D-22-00472R1

Dear Dr. Duduzile Faith Nsibande,

We're pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements. Within one week, you'll receive an e-mail detailing the required amendments. When these have been addressed, you'll receive a formal acceptance letter and your manuscript will be scheduled for publication. An invoice for payment will follow shortly after the formal acceptance.

To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they'll be preparing press materials, please inform our press team as soon as possible, and no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.
Kind regards,
Gopal Ashish Sharma, MBBS, MD
Academic Editor
PLOS ONE

Additional Editor Comments (optional):

The manuscript submitted and further reviewed addresses gaps in the quality of HIV rapid testing practices in public antenatal health care facilities through use of a valid WHO tool. The manuscript highlights a key concern about quality, as only 11% of assessed health care facilities had scores close to 80% and above (levels 3 & 4). As the process itself is rapid and quintessential to the primary management of ANC patients, the quality concerns warrant sequential and regular evaluation to curtail HIV transmission at the grassroots level. The identified gaps, specifically in the documentation processes along with national certification/accreditation etc., as discussed in the manuscript, need prioritized intervention by policy makers. This would further contribute to HSS across ANC services in the country.

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the "Comments to the Author" section, enter your conflict of interest statement in the "Confidential to Editor" section, and submit your "Accept" recommendation.

Reviewer #1: All comments have been addressed
Reviewer #2: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions?

Reviewer #1: Yes
Reviewer #2: Yes

**********

3. Has the statistical analysis been performed appropriately and rigorously?
Reviewer #1: Yes
Reviewer #2: Yes

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

Reviewer #1: Yes
Reviewer #2: Yes

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English?

Reviewer #1: Yes
Reviewer #2: Yes

**********

6. Review Comments to the Author

Reviewer #1: The suggestions made have been taken into consideration by the author and have been addressed.
Reviewer #2: (No Response)

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose "no", your identity will remain anonymous but your review may still be made public.
Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #1: No
Reviewer #2: No

19 Aug 2022

PONE-D-22-00472R1
Investigating the quality of HIV rapid testing practices in public antenatal health care facilities, South Africa

Dear Dr. Nsibande:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org. Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,
PLOS ONE Editorial Office Staff
on behalf of Dr. Gopal Ashish Sharma
Academic Editor
PLOS ONE
Similar articles (30 in total)

1.  The quality of HIV testing services for adolescents in Cape Town, South Africa: do adolescent-friendly services make a difference?

Authors:  Catherine Mathews; Sally J Guttmacher; Alan J Flisher; Yolisa Y Mtshizana; Tobey Nelson; Jean McCarthy; Vanessa Daries
Journal:  J Adolesc Health       Date:  2008-09-27       Impact factor: 5.012

2.  Routine data from prevention of mother-to-child transmission (PMTCT) HIV testing not yet ready for HIV surveillance in Mozambique: a retrospective analysis of matched test results.

Authors:  Peter W Young; Mussagy Mahomed; Roberta Z Horth; Ray W Shiraishi; Ilesh V Jani
Journal:  BMC Infect Dis       Date:  2013-02-22       Impact factor: 3.090

3.  Clients' perceptions and satisfaction with HIV counselling and testing: A cross-sectional study in 56 HCT sites in South Africa.

Authors:  Gladys Matseke; Karl Peltzer; Neo Mohlabane
Journal:  Afr J Prim Health Care Fam Med       Date:  2016-08-31

4.  Conducting unlinked anonymous HIV surveillance in developing countries: ethical, epidemiological, and public health concerns.

Authors:  Stuart Rennie; Abigail Norris Turner; Bavon Mupenda; Frieda Behets
Journal:  PLoS Med       Date:  2009-01-20       Impact factor: 11.069

5.  Improving the coverage of the PMTCT programme through a participatory quality improvement intervention in South Africa.

Authors:  Tanya Doherty; Mickey Chopra; Duduzile Nsibande; Dudu Mngoma
Journal:  BMC Public Health       Date:  2009-11-05       Impact factor: 3.295

6. (Review) Supervised and unsupervised self-testing for HIV in high- and low-risk populations: a systematic review.

Authors:  Nitika Pant Pai; Jigyasa Sharma; Sushmita Shivkumar; Sabrina Pillay; Caroline Vadnais; Lawrence Joseph; Keertan Dheda; Rosanna W Peeling
Journal:  PLoS Med       Date:  2013-04-02       Impact factor: 11.069

7.  False positive HIV diagnoses in resource limited settings: operational lessons learned for HIV programmes.

Authors:  Leslie Shanks; Derryck Klarkowski; Daniel P O'Brien
Journal:  PLoS One       Date:  2013-03-20       Impact factor: 3.240

8.  Challenges for routine health system data management in a large public programme to prevent mother-to-child HIV transmission in South Africa.

Authors:  Kedar S Mate; Brandon Bennett; Wendy Mphatswe; Pierre Barker; Nigel Rollins
Journal:  PLoS One       Date:  2009-05-12       Impact factor: 3.240

9.  Improving the quality of health information: a qualitative assessment of data management and reporting systems in Botswana.

Authors:  Jenny H Ledikwe; Jessica Grignon; Refeletswe Lebelonyane; Steven Ludick; Ellah Matshediso; Baraedi W Sento; Anjali Sharma; Bazghina-werq Semo
Journal:  Health Res Policy Syst       Date:  2014-01-30

10.  Evaluation of Senegal's prevention of mother to child transmission of HIV (PMTCT) program data for HIV surveillance.

Authors:  Ousmane Diouf; Astou Gueye-Gaye; Moussa Sarr; Abdou Salam Mbengue; Christopher S Murrill; Jacob Dee; Papa Ousmane Diaw; Ndeye Fatou Ngom-Faye; Pape Amadou Niang Diallo; Carlos Suarez; Massaer Gueye; Aminata Mboup; Coumba Toure-Kane; Souleymane Mboup
Journal:  BMC Infect Dis       Date:  2018-11-20       Impact factor: 3.090

