
Understanding quality systems in the South African prehospital emergency medical services: a multiple exploratory case study.

Ian Howard1,2, Peter Cameron3, Lee Wallis2,4, Maaret Castrén5, Veronica Lindström6.   

Abstract

INTRODUCTION: In South Africa (SA), prehospital emergency care is delivered by emergency medical services (EMS) across the country. Within these services, quality systems are in their infancy, and issues regarding transparency, reliability and contextual relevance have been cited as common concerns, exacerbated by poor communication and ineffective leadership. As a result, we undertook a study to assess the current state of quality systems in EMS in SA, in order to determine priorities for their initial development.
METHODS: A multiple exploratory case study design was used, employing the Institute for Healthcare Improvement's 18-point Quality Program Assessment Tool as both a formative assessment and a semistructured interview guide, applied to four provincial government EMS and one national private service.
RESULTS: Services generally scored higher for structure and planning. Measurement and improvement were found to be more dependent on utilisation and perceived mandate. There was a relatively strong focus on clinical quality assessment within the private service, whereas in the provincial systems, measures were exclusively restricted to call times with little focus on clinical care. Staff engagement and programme evaluation were generally among the lowest scores. A multitude of contextual factors were identified that affected the effectiveness of quality systems, centred around leadership, vision and mission, and quality system infrastructure and capacity, guided by the need for comprehensive yet pragmatic strategic policies and standards.
CONCLUSION: Understanding and accounting for these factors will be key to ensuring both successful implementation and ongoing utilisation of healthcare quality systems in emergency care. The result will not only provide a more efficient and effective service, but also positively impact patient safety and quality of care of the services delivered. © Author(s) (or their employer(s)) 2020. Re-use permitted under CC BY-NC. No commercial re-use. See rights and permissions. Published by BMJ.


Keywords:  ambulances; governance; prehospital care; qualitative research


Year:  2020        PMID: 32439739      PMCID: PMC7247383          DOI: 10.1136/bmjoq-2020-000946

Source DB:  PubMed          Journal:  BMJ Open Qual        ISSN: 2399-6641


Introduction

The importance of quality systems in the prehospital emergency care (PEC) setting is increasingly recognised, given that PEC services are frequently delivered against the backdrop of demanding environments, often with limited resources, and for patients of varying and unpredictable acuity.1–4 As PEC-focused tools for measuring and understanding patient safety and quality of care have been developed and implemented, so too has recognition of the importance of formal systems for governing such activities.4–9 In South Africa (SA), a mix of government-funded and private emergency medical services (EMS) deliver PEC across the country.10 Within these services, quality systems are in their infancy.11 Among PEC clinicians, the general perception of EMS quality systems in the country is poor.11 Concerns regarding system transparency, reliability and contextual relevance have been cited as common reasons for this.11 These issues have been exacerbated by apparent poor communication, ineffective leadership and a historical association of quality systems with punitive mechanisms.11 Recent National Department of Health policy reviews have highlighted the importance of systems for developing, implementing and monitoring the quality of healthcare in the country.12 While significant advances have been made in improving the scope of practice, training and education of PEC clinicians, little has been done towards developing formal quality systems aimed at assessing and maintaining standards of quality of care and patient safety in the PEC setting in SA. A multitude of potential factors could affect these systems as a whole. Therefore, in order to determine priorities for their development and improvement, it is important to first understand the current state of EMS quality systems in the country. Given this need, we undertook a study to assess prehospital EMS quality systems in SA.

Methods

A multiple exploratory case study design was used in order to achieve the study aim.13 14 For the purposes of this study, a case was defined as the quality programme or system of performance measurement of a participating service. The definition of a case was purposely kept broad, given that quality measurement by EMS in SA is limited and the existence or scope of formal quality systems was likely to be equally limited.11 The quality systems of four provincial government EMS and one national private EMS organisation were included in the study.

Primary data collection

Multiple sources and data types were collected to achieve the study aim.14 The Institute for Healthcare Improvement’s Quality Program Assessment Tool was employed as the primary means of data collection (online supplementary file 1). The tool uses a categorical rating scale of 0–5 to answer 18 key questions across six broad criteria: quality structure; quality planning; quality measurement; quality improvement activities; staff involvement in the quality programme; and evaluation of the quality programme. The tool was used both as a formative assessment of each participating service’s quality programme and as a semistructured interview guide to further explore the results of that assessment. Data were collected via interviews with directors and leaders of the participating services who had in-depth knowledge of their respective service’s operations. To maintain anonymity, their specific titles have been omitted. All interviews were conducted in English and recorded for transcription and analysis. Reflective notes were maintained during, and immediately after, each interview for verification of the interview results during analysis.

Secondary data collection

Multiple sources of secondary data were collected to support the primary data, grouped into two categories. Category A secondary data were made up of the results of a targeted literature review to identify policy-focused guidance for EMS organisations in SA regarding the implementation of a quality programme; and/or the development, implementation and utilisation of methods to assess quality of care. A search of several key websites was conducted, including: The Health Professions Council of SA—the healthcare licensing body of the South African National Department of Health (SADoH); the SADoH; and Statistics South Africa—the statistical service of the South African national government. Category B secondary data were made up of publicly accessible quality and/or performance reports published by the participating services.

Setting and population

The delivery of prehospital emergency medical care in SA is based on a three-tiered system of basic, intermediate and advanced life support levels of qualification. Each level is licensed for independent practice and governed by a national registration board. Care is delivered primarily through provincial government-funded EMS, with several private EMS located in the larger cities across the country servicing medical insurance clients. Given the variations in geography and population distribution across SA, the four provincial prehospital emergency medical services of KwaZulu Natal (KZN), Western Cape (WC), Limpopo (LP) and North West (NW) provinces were purposively selected to be as inclusive of this variation as possible (figure 1). There is limited evidence suggesting that private EMS in SA are more advanced regarding the utilisation of quality assessment tools and frameworks.11 As a result, a national private EMS organisation was additionally included as part of the multiple case review.
Figure 1: Participating provincial emergency medical services.

Data analysis

For the primary data collection, descriptive statistics were used to describe and summarise the categorical formative assessment. Conventional content analysis, as described by Hsieh and Shannon, was used to sort and analyse the interview data.15 Prior to analysis, each interview transcript was reread for content familiarisation. First-level coding was conducted by extracting meaning units from each transcript and summarising them into codes using open coding. Once completed, similar codes were combined and organised into clustered subcategories. Throughout first-level coding and subcategory development, the reflective notes were referenced for verification. Interview transcriptions were analysed using MAXQDA 2016 (VERBI Software, Berlin, Germany). For the secondary data collection, document analysis as described by Bowen was used to sort and analyse the supporting data.16 Eligible documents were retrieved and scanned for relevance based on the inclusion criteria. A full-text review was conducted if the document remarked on quality systems, quality of care or quality indicators (QIs). Supporting excerpts, quotations or passages that referred to EMS in general or by case example were extracted and synthesised using a standardised data extraction form (Microsoft Excel 2010; Microsoft Corporation, Redmond, Washington, USA).
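The descriptive summary of the formative assessment reduces to simple arithmetic: each of the 18 questions is scored 0–5, questions are grouped into six criteria of three questions each, giving a subtotal out of 15 per criterion and a total out of 90 per service. A minimal sketch of this tallying (illustrative only, not the authors' analysis code; scores and variable names are transcribed from table 2):

```python
# Illustrative sketch of the formative-assessment scoring (scores from table 2).
# Each service answers 18 questions (A.1-F.3) on a 0-5 categorical scale.

CRITERIA = [
    "Quality structure", "Quality planning", "Quality measurement",
    "Quality improvement activities", "Staff involvement",
    "Evaluation of quality programme",
]

# 18 scores per service, in question order A.1 through F.3.
scores = {
    "WC":      [2, 4, 3, 2, 4, 4, 1, 1, 3, 3, 3, 3, 2, 2, 3, 3, 3, 2],
    "KZN":     [1, 2, 1, 3, 1, 1, 3, 3, 3, 1, 1, 3, 1, 3, 0, 3, 3, 3],
    "NW":      [1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0],
    "LP":      [3, 2, 3, 3, 1, 3, 1, 1, 2, 2, 2, 2, 2, 2, 2, 2, 2, 1],
    "Private": [5, 4, 5, 2, 4, 3, 2, 3, 3, 3, 4, 2, 1, 2, 3, 1, 3, 3],
}

def summarise(per_question):
    """Return (subtotals per criterion, max 15 each; overall total, max 90)."""
    subtotals = [sum(per_question[i:i + 3]) for i in range(0, 18, 3)]
    return subtotals, sum(subtotals)

for service, qs in scores.items():
    subtotals, total = summarise(qs)
    print(service, dict(zip(CRITERIA, subtotals)), total)
```

Running this reproduces the subtotals and totals reported in table 2 (eg, 48/90 for WC and 5/90 for NW).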

Triangulation

The utilisation and triangulation of multiple methods and data sources safeguard against the implication that findings are simply an artefact of a single method, a single source or a single investigator’s bias.16 Therefore, for the purposes of this study, multiple methods were used to ensure the internal validity and trustworthiness of the overall study, as described by Guba,17 and included: the quality programme formative assessment and supporting documentation; the in-depth qualitative exploration of the assessment via recorded interviews and transcripts; reflective notes; national and/or provincial legislation, policies and directives; and published organisational performance reports. Consent for participation was provided by each of the participating services and individuals prior to data collection.

Patient and public involvement

No patients were involved in the development of the research question, study design or data collection. The results of the study will be disseminated to participants in the form of a peer-reviewed publication, once complete.

Results

The services included for the case review covered a multitude of social and healthcare demographics found across the country (table 1). There was equal variation in the outcomes of the formative assessment, where services generally scored higher for structure and planning (table 2). Measurement and improvement, however, were found to be more dependent on the services’ utilisation and perceived mandate. There was a relatively strong focus on clinical quality assessment and improvement within the private service, whereas in the provincial systems, QIs reported were exclusively restricted to call times and available vehicle resources, with little to no focus on clinical care. Given the limited scope of QIs measured and reported, it was somewhat predictable that staff engagement and programme evaluation were generally among the lowest scores for the participating services (see table 3 for subcategories and supporting quotes from the qualitative analysis of the quality programme assessment).
Table 1

Selected social and health demographics of participating provinces

| Metric (values are %) | South Africa | Western Cape | KwaZulu Natal | Limpopo | North West |
| --- | --- | --- | --- | --- | --- |
| Wealth quintiles | | | | | |
| Lowest | – | 2.7 | 25.5 | 27.6 | 14.7 |
| Second | – | 7.5 | 22.6 | 40.7 | 29.1 |
| Middle | – | 11.8 | 20.6 | 17.8 | 30.3 |
| Fourth | – | 32.1 | 15.8 | 7.7 | 18.7 |
| Highest | – | 45.8 | 15.4 | 6.2 | 7.3 |
| Primary source of income | | | | | |
| Salary | 58.6 | 72.9 | 54.6 | 42.8 | 53.2 |
| Remittances | 9.4 | 2.7 | 10.7 | 16.3 | 12.2 |
| Pensions | 2.2 | 4.3 | 1.7 | 1.2 | 1.8 |
| Grants | 19.9 | 10.3 | 24.6 | 30.4 | 24.2 |
| Other sources | 9.9 | 9.9 | 8.4 | 9.3 | 8.5 |
| Household type | | | | | |
| Other | 0.8 | 1.4 | 0.2 | 0 | 0 |
| Informal | 13.1 | 19 | 6.7 | 4.9 | 18.6 |
| Traditional | 5 | 0 | 12.6 | 2.2 | 0.5 |
| Formal | 81.1 | 79.6 | 80.5 | 93 | 80.9 |
| Household services | | | | | |
| Household piped water | 89 | 98.7 | 86.6 | 74.1 | 85.2 |
| Household mains electricity | 84.7 | 87.9 | 83.5 | 92.7 | 83.7 |
| Household sanitation | 83 | 93.8 | 81.4 | 58.9 | 70.6 |
| Medical insurance coverage | | | | | |
| Male | – | 36.3 | 11.3 | 13.4 | 15.7 |
| Female | – | 30.1 | 12.7 | 10.5 | 14.9 |
| Healthcare facility consulted first | | | | | |
| Public clinic | 64.9 | 43.7 | 73.9 | 78.1 | 72.3 |
| Public hospital | 6.1 | 12.4 | 4.7 | 7.8 | 2.6 |
| Other public institution | 0.5 | 0.1 | 0.4 | 0.3 | 0.6 |
| Private clinic | 1.3 | 1.2 | 0.9 | 0.8 | 1 |
| Private hospital | 1.6 | 2.3 | 1.1 | 0.5 | 0.7 |
| Private doctor | 24.2 | 39.8 | 18.3 | 11.1 | 19.7 |
| Traditional healer | 0.7 | 0.2 | 0.4 | 1 | 0.3 |
| Pharmacy | 0.4 | 0.3 | 0.2 | 0.3 | 0.3 |
| Other | 0.4 | 0.1 | 0.1 | 0.3 | 2.6 |
| Problems in accessing healthcare | | | | | |
| Obtaining permission | – | 7.2 | 23.9 | 22.8 | 10.6 |
| Money for payment | – | 16 | 27.8 | 37.5 | 32.9 |
| Distance to travel | – | 11.3 | 29.7 | 33.1 | 31.8 |
| Not wanting to go alone | – | 8.6 | 24.6 | 18.8 | 17.4 |
| Satisfaction with public/government healthcare facilities | | | | | |
| Very satisfied | 53.8 | 47.9 | 50.8 | 72.1 | 40.3 |
| Somewhat satisfied | 26.5 | 21.6 | 31.7 | 15.7 | 26 |
| Neither satisfied nor dissatisfied | 9.5 | 11.1 | 11.1 | 5.1 | 15.1 |
| Somewhat dissatisfied | 5 | 8.9 | 3.8 | 4.2 | 5.3 |
| Very dissatisfied | 5.2 | 10.5 | 2.6 | 2.9 | 13.4 |
| Satisfaction with private healthcare facilities | | | | | |
| Very satisfied | 92.6 | 93.7 | 89.3 | 91.9 | 89 |
| Somewhat satisfied | 5 | 3.7 | 7.4 | 5.8 | 9.1 |
| Neither satisfied nor dissatisfied | 1.3 | 0.9 | 2.7 | 0 | 0.3 |
| Somewhat dissatisfied | 0.5 | 0.9 | 0.3 | 0.3 | 1.3 |
| Very dissatisfied | 0.6 | 0.8 | 0.4 | 2 | 0.4 |
| Distribution of death by age group | | | | | |
| 0 | – | 3.3 | 3.8 | 0.5 | 6.3 |
| 1–14 | – | 1.5 | 2.9 | 0.5 | 3.5 |
| 15–44 | – | 24.3 | 30.7 | 21.8 | 27.4 |
| 45–64 | – | 30.6 | 26.7 | 31 | 30 |
| 65+ | – | 40.1 | 35.6 | 46.3 | 32.7 |
| Unspecified | – | 0.2 | 0.2 | 0 | 0.1 |

Leading natural cause of death, all ages (rank, % of deaths). Entries are listed in the source’s column order (South Africa; Western Cape; KwaZulu Natal; Limpopo; North West); columns without an entry for a given cause are omitted:

 TB: 1st, 6.5; 5th, 5.1; 1st, 7.6; 4th, 5.5; 1st, 7.4
 Diabetes: 2nd, 5.5; 1st, 7.7; 2nd, 7.4; 2nd, 6.3; 6th, 4.7
 Other forms of heart disease: 3rd, 5.1; 10th, 3.1; 3rd, 6.6; 8th, 3.3; 3rd, 5.5
 Cerebrovascular diseases: 4th, 5.1; 5th, 6; 3rd, 5.8; 7th, 4.3
 HIV: 5th, 4.8; 2nd, 6.2; 4th, 6.2; 7th, 3.4; 8th, 3.4
 Hypertensive diseases: 6th, 4.4; 9th, 3.9; 7th, 3.8; 5th, 5.4; 2nd, 5.8
 Influenza and pneumonia: 7th, 4.3; 1st, 7.6; 5th, 5
 Other viral diseases: 8th, 3.6; 8th, 3.6; 6th, 5.2; 4th, 5
 Ischaemic heart diseases: 9th, 2.8; 3rd, 6; 9th, 2.8
 Chronic lower respiratory diseases: 10th, 2.8; 6th, 4.9; 10th, 2.7
 Malignant neoplasm, digestive: 7th, 4.6; 10th, 2.2
 Malignant neoplasm, intrathoracic: 8th, 4.6
 Intestinal infectious diseases: 9th, 2.9
 Renal failure: 10th, 2
 Other disorders involving immune mechanism: 9th, 3.2

| Non-natural causes of death, all ages (%) | South Africa | Western Cape | KwaZulu Natal | Limpopo | North West |
| --- | --- | --- | --- | --- | --- |
| Transport accidents | – | 7.5 | 13 | 31.8 | 16.1 |
| Other accidental injuries | – | 64 | 67.1 | 56.1 | 65.3 |
| Intentional self-harm | – | 0.4 | 2 | 0.4 | 0.2 |
| Assault | – | 24.4 | 13.7 | 8 | 12.7 |
| Complications of medical and surgical care | – | 2.1 | 1.8 | 1.2 | 1.4 |

–: not reported.

HIV, Human Immunodeficiency Virus; TB, tuberculosis.

Table 2

Quality programme formative assessment

| No. | Quality programme assessment tool question | WC | KZN | NW | LP | Private |
| --- | --- | --- | --- | --- | --- | --- |
| | Quality structure | | | | | |
| A.1 | Does the organisation have an organisational structure in place to plan, assess and improve the quality of care? | 2 | 1 | 1 | 3 | 5 |
| A.2 | Have adequate resources been committed to fully support the quality programme? | 4 | 2 | 0 | 2 | 4 |
| A.3 | Do the leadership support the quality programme? | 3 | 1 | 1 | 3 | 5 |
| | Subtotal (max=15) | 9 | 4 | 2 | 8 | 14 |
| | Quality planning | | | | | |
| B.1 | Does the organisation have a comprehensive quality improvement/management plan? | 2 | 3 | 1 | 3 | 2 |
| B.2 | Does the organisation have clearly described roles and responsibilities for the quality programme? | 4 | 1 | 0 | 1 | 4 |
| B.3 | Does the work plan specify timelines and accountabilities for the implementation of the quality programme? | 4 | 1 | 0 | 3 | 3 |
| | Subtotal (max=15) | 10 | 5 | 1 | 7 | 9 |
| | Quality measurement | | | | | |
| C.1 | Are appropriate outcome and process quality indicators selected in the quality programme? | 1 | 3 | 1 | 1 | 2 |
| C.2 | Does the organisation regularly measure the quality of care? | 1 | 3 | 0 | 1 | 3 |
| C.3 | Are processes established to evaluate, assess and follow-up on quality data? | 3 | 3 | 0 | 2 | 3 |
| | Subtotal (max=15) | 5 | 9 | 1 | 4 | 8 |
| | Quality improvement activities | | | | | |
| D.1 | Does the organisation conduct specific quality activities and projects to improve the quality of care? | 3 | 1 | 1 | 2 | 3 |
| D.2 | Are quality improvement teams formed for specific projects? | 3 | 1 | 0 | 2 | 4 |
| D.3 | Are systems in place to sustain quality improvements? | 3 | 3 | 0 | 2 | 2 |
| | Subtotal (max=15) | 9 | 5 | 1 | 6 | 9 |
| | Staff involvement | | | | | |
| E.1 | Are staff routinely educated about the programme’s quality programme? | 2 | 1 | 0 | 2 | 1 |
| E.2 | Does the organisation routinely engage all levels of staff in quality programme activities? | 2 | 3 | 0 | 2 | 2 |
| E.3 | Are patients involved in quality-related activities? | 3 | 0 | 0 | 2 | 3 |
| | Subtotal (max=15) | 7 | 4 | 0 | 6 | 6 |
| | Evaluation of quality programme | | | | | |
| F.1 | Is a process in place to evaluate the quality programme? | 3 | 3 | 0 | 2 | 1 |
| F.2 | Does the quality programme integrate findings into future planning? | 3 | 3 | 0 | 2 | 3 |
| F.3 | Does the programme have an information/data system in place to track patient care and measure quality indicators? | 2 | 3 | 0 | 1 | 3 |
| | Subtotal (max=15) | 8 | 9 | 0 | 5 | 7 |
| | Total (max=90) | 48 | 36 | 5 | 36 | 53 |

0—no plan/structure/process.

1—limited plan/structures/process in place.

2—early implementation.

3—full implementation.

4—developing systematic approach to quality.

5—full systematic approach to quality.

KZN, KwaZulu Natal; LP, Limpopo; NW, North West; WC, Western Cape.

Table 3

Qualitative exploration of the quality programme assessment

Each entry gives the text reference, subcategory and supporting quote for the participating service and interviewee.

1. Western Cape (Director-level participant)

 1.1 Leadership: “We’re at the disadvantage where [the director] who normally drives this [quality] has been away for probably almost two years now and as a consequence, much of these questions where we had answered reasonably well before, realistically speaking we are nowhere near that because the person responsible for coordinating that has not been here”
 1.2 Mandate: “I’m of the view that in the South African context, we are a logistics company, we are not a medical company…we are a transport system”
 1.3 Historical factors: “Because of the nature of the South African services, because of the socio-political aspects of the way cities are structured in South Africa, particularly in Cape Town, response time performance had to be prioritised, due to spatial divide… our cities are racially designed which means in a post-democratic country, in a way to break that up, you have to put a transport system in place, so that the racial divide, the inequity isn’t perpetuated, and where you don’t have a public transport system, when it comes to healthcare, that’s the primary purpose of ambulance service”
 1.4 Safety: “so, what has happened as a consequence of safety, as a consequence of all of these ambulance attacks, one of the things we’ve had to do, we’ve had to engage with the community more often, so what is happening relatively frequently, is we attend patient health forums. The district managers must attend or send a representative to every community health forum meeting or community safety forum meeting. So, at these sessions, a patient voice invariably comes through”

2. KwaZulu Natal (Deputy Director-level participant)

 2.1 Structure: “EMS in KwaZulu Natal has a provincial M&E (measurement and evaluation) manager and then one FIO (facility information officer) per district. We have eleven districts in total. Information and quality currently measured are focused on service delivery. The quality of medical care provided to patients is an area that is currently lacking. A set of indicators is reported on monthly by each district using an excel spreadsheet, this is a huge challenge as data is manually captured at each level from the source to final consolidation and reporting”
 2.2 “We do have a quality plan in place. This is reviewed annually. The plan takes into account available resources, available budget and timeframes. The plan contains mainly issues around service delivery and strategies to improve service delivery. The plan is reviewed by the EMS management team which includes the EMS provincial management team and EMS district managers.”
 2.3 Mandate: “When we measure quality of services, we look at the national norms currently available together with the demand for services. Firstly, we look at available resources and how we compare to the 1 ambulance per 10 000 population national norm. Then we look at the demand for services—what the available resources had to attend to. And then we look at the percentage P1 cases responded to within the national norms. These are all viewed as a piece of the complete puzzle and should not be measured or reported on independently as the picture will be incomplete. The assumption is that, if you have 1 ambulance per 10 000 population then you should be able to achieve the response time norms to P1 cases taking into account your case load has not spiked due to any unforeseen circumstance”
 2.4 “This is the focus of our performance measured on a continuous basis where trends are monitored on a monthly, quarterly and annual basis. Other quality indicators are measured as and when required, particularly if we have a special project or intervention in place.”
 2.5 Engagement: “performance results are presented at our EMS management team forum and distributed to districts by the provincial M&E manager. EMS district managers are encouraged to present their performance to staff at all levels within the districts, but this is not happening in all districts”
 2.6 “As EMS we do not have much public engagement regarding our performance however our performance reports are included in the departmental annual reports which are public documents. These are also discussed at public imbizo events where the public has an opportunity to pose questions, concerns, comments to the departments senior management where EMS is represented”

3. Limpopo (Director-level participant)

 3.1 Strategic planning: “The EMS plan fits into the broader department strategic plans, where we have a section that is focused on EMS… the strategic plans are updated and planned for over several years and then re-evaluated at the end of that period. Where we have failed to reach a target or goal, we re-incorporate those projects into future plans”
 3.2 Relationships: “We form part of the (health) departments system as a whole and filter into the departments committees… for me the most important thing is the relationship we have with them. I would rather we have someone with an understanding of quality and quality systems and improve their understanding of EMS, than have someone from EMS and need to bring their understanding up to understand quality. But either way, for me the most important thing is still about the relationship we have with them”
 3.3 “We measure quality through response times targets, through the number of complaints, and from feedback from the facilities we take patients to. Their feedback about the interaction with our staff is very important to me.”
 3.4 Attitude: “The attitude of the staff is very important to me, and that’s one of the biggest improvements we have planned for… It will be very difficult, but we want to involve organized labour, and invite them to be a part of the process… here they determine success or failure and that’s why I want to make sure they have buy-in to the process and provide feedback”
 3.5 Technology: “Having systems in place such as CAD systems will allow us to monitor everything involving staff, vehicles, how they are used, all of which will allow us to monitor our performance more closely and to make the sure the staff are held responsible and accountable, because this will also allow us to provide extra information to the public as a measure of our performance as well”

4. North West (Director-level participant)

 4.1 Structure: “We’re not a provincialized service, we’re a totally decentralised service, each EMS station reports to the subdistrict they are in, so there’s no provincial structure. Currently we are the only province that is like that… Basically we’ve got like 19 different EMS services in the North West.”
 4.2 Staff capacity: “we lost a lot of them to OSD (occupational specific dispensation) …the OSD has shot us in the foot. We’re losing a lot of staff because we can’t retain them, so we’re training, but we’re actually training for [other services]”
 4.3 Non-personnel resources: “I’m finding out from research that we don’t need such a high amount of ambulances, we need to be focusing more on planned patient transport, because 65% of our calls are actually P3, so we’re using a very expensive resource to transport something that we don’t need to transport”
 4.4 Technology: “the unfortunate thing is all our stuff is paper-based, and we don’t have a digital system. So, we are moving towards a digital communication system, but currently it’s very easy to lie to your statistics, so I cannot trust the information given to me”

5. Private Service (Senior manager-level participant)

 5.1 Leadership: “We’re probably as good as a 5 as you can get, in my opinion. [Representatives] From the CEO, to the operational crews sit on a clinical committee, there’s a quality assurance manager that sits at an executive level, and all of this works through, it’s all auditable through minutes and committee meetings that report into the executive committee”
 5.2 Representation: “we’ve got representatives from cross the organisation sitting on the clinical panel to discuss what the consumer wants, what training needs to be provided, what operations is currently doing and where the operations within operations is needed”
 5.3 Improvement focus: “If we’re doing a quality improvement project, if it gets written down as a quality improvement project, and not just an intervention, then we do put the assurances in place, putting in the checks to monitor it over and time and then look at whether there’s a consistent change in behaviour or not”
 5.4 Fit for purpose: “our biggest problems in terms of this are systems. We often review stuff, and we often see, and we might know what quality indicators to use, but the problem comes in that the system we currently have is, manual, and very hard to change any kind of quality indicators, because it’s an accounting system that we’re using for quality indicators essentially, and it’s still paper-based, and manually captured”
 5.5 Patient/community engagement: “In terms of a structured patient satisfaction assessment, we do have that. In terms of having a point of entry into the business for patients concerns to be brought up, we do have that, that’s very well developed at [parent company]. I think the problem comes in when you start talking about patient or community engagement when it comes to patient centred events, and I don’t think we’re there yet.”

EMS, emergency medical services.


Primary data

South Africa population: 57 458 000. No. of households: 16 671 000. Public transport use: 46.2%.

Western Cape

Population: 6 650 000 (11.6%). No. of households: 1 877 000 (11.3%). Public transport use: 44.7%. The provincial service’s higher points in the formative assessment were largely within structure and planning, where a hybrid centralised/decentralised system of subdistrict engagement, with two ‘centralised’ quality nodes (ie, one urban and one rural), was employed for the service’s quality system. Within this system were staff primarily dedicated to quality assessment and monitoring. Despite this strength, it was acknowledged that a lack of higher-level leadership had had an impact on the programme (1.1). Similarly, while a comprehensive quality plan existed, it was acknowledged to be outdated and inconsistently reviewed and/or updated. The most significant points to emerge regarding measurement and improvement related to the service’s understanding of its mandate, and the view that the service operated as a transport company more than a medical company, especially given the sociopolitical history of the region (1.2, 1.3). In light of this, it was felt that reporting on time-based measures of performance was wholly appropriate. Similarly, much of the focus on improvement activities was centred on transport, and on improving interfacility transport booking and operations in particular. The service acknowledged that improvements could be made in terms of staff engagement; however, it felt its public engagement had improved significantly in recent years. Unfortunately, the primary driver for this had been an exponential increase in attacks on ambulances in the community (1.4).

KwaZulu Natal

Population: 11 245 000 (19.5%). No. of households: 2 905 000 (17.4%). Public transport use: 40.9%. The service scored low for structure in the formative assessment compared with the other services. The decentralised approach adopted towards measurement and evaluation made coordination difficult, which was further exacerbated by the perceived rudimentary means with which data were captured and shared (2.1). While the service acknowledged the lack of described roles, responsibilities and accountabilities within its quality plan, the content of the plan was otherwise described as comprehensive and underwent regular evaluation and updating (2.2). The service scored highest in measurement, where a strong focus was placed on continuous monitoring for trend analysis. As with the WC, this focus was strongly associated with the service’s perceived mandate and utilisation (2.3, 2.4). The service scored low for staff and public engagement, where it was acknowledged that, while some effort was made towards this, there was still much to be improved on (2.5, 2.6).

Limpopo

Population: 5 854 000 (10.2%). No. of households: 1 579 000 (9.3%). Public transport use: 41.9%. The LP EMS quality system scored relatively highly within the structure and planning categories of the formative assessment. There was a strong focus on strategic planning, with the quality system and planning firmly entrenched in the broader provincial health structures (3.1). The importance of this relationship with the provincial health system was emphasised as a driver for potential improvements in service quality monitoring (3.2). It was acknowledged that much could be done to improve quality measurement and improvement within the service, which scored lower in the formative assessment. The service focused primarily on response time targets and complaints for the measurement and reporting of quality and performance (3.3). The notion of relationships was echoed in these sections, where feedback from the facilities the service interacted with was also seen as an important measure of quality. Despite the low scores for staff engagement and evaluation, these had been earmarked for attention in the service’s current strategic plan. Staff attitude was acknowledged, and planned for, as an important driver of general service success (3.4). Similarly, technology was earmarked as a driver of success, both for staff engagement and for community accountability (3.5).

North West

Population: 3 925 000 (6.8%). No. of households: 1 210 000 (7.3%). Public transport use: 41.3%. The NW scored low across all questions and categories in the formative assessment. This was unsurprising considering that (unbeknownst to the authors at the time of data collection) the provincial government, including the health system and EMS, had been placed under administration. On deeper examination, several key factors became apparent that highlighted the difficulties faced by EMS in the province. From a managerial perspective, the extreme decentralisation with which the service was structured made coordination and oversight complicated, and significantly hindered process and/or plan implementation (4.1). Coupled with this, the service found it difficult to retain high-level clinical staff, further hampering its ability to implement and sustain a clinically focused quality programme (4.2). From an operations point of view, a recent audit had shown that the province’s non-personnel resources were poorly matched to the needs of its daily activity (4.3, 4.4). The QIs reported by the service were limited to time-based measures and vehicle and staff counts. Furthermore, the service lacked its own standalone committees for complaints and patient safety, which were instead incorporated into broader general provincial health service committees and structures.

Private service

Based on the formative assessment and interview, several strengths were highlighted within the service, largely centred on structure. There was a strong clinical focus within the quality system of the service, with representation up to the executive level (5.1). Furthermore, while input was collected from across the service branches, much of the planning came from a centralised office, providing overall strategic direction (5.2). Similarly, there was a relatively strong focus on quality improvement activities within the service. While input and scope were somewhat limited, a robust and comprehensive process was consistently followed when a project was carried out (5.3). In contrast, the service acknowledged that there was room for improvement with regard to programme planning and evaluation. While a quality management plan existed, it was outdated and rarely reviewed, at least in any formal capacity. Likewise, while several clinically focused indicators were consistently reported and discussed at a high level, the underlying system was acknowledged to be outdated and rudimentary, largely manually captured, difficult to change, and no longer fit for purpose (5.4). This was perceived to have had an impact on both general quality monitoring and monitoring for sustained improvement. Of all the categories, staff and patient engagement were perceived to be the weakest, and an area for improvement within the service. The strengths the service enjoyed in this area were largely a result of the service's parent company, a private hospital group (5.5).

Secondary data

Nationally and provincially focused policy documents were included as part of the secondary data collection (table 4). Several concentrated on the development and implementation of quality and patient safety systems, yet were almost exclusively limited to health facilities. Despite this, they were in-depth and pragmatic in their approach towards outlining the steps required to implement effective quality systems. While these may not all be applicable to the EMS setting, several of the concepts outlined in these documents were considered useful towards the development of similar systems for EMS.
Table 4

Policy review

National: A Policy on Quality in Healthcare for South Africa (April 2007). Health facility focus: Yes. EMS focus: No. Supporting quote for EMS guidance: Nil. [12]

National: "Towards Quality Care for Patients": National Core Standards for Health Establishments in South Africa (2011). Health facility focus: Yes. EMS focus: No. Supporting quote for EMS guidance: Nil. [25]

National: South African Department of Health Strategic Plan 2015–2019 (2014). Health facility focus: Yes. EMS focus: Yes. Supporting quote for EMS guidance: "Strategic objectives: Ensure the effective and efficient delivery of Emergency Medical Services. Ensure access to effective and efficient delivery of quality Emergency Medical Services." [26]

National: National Policy to Manage Complaints, Compliments and Suggestions in the Public Health Sector of South Africa (July 2016). Health facility focus: Yes. EMS focus: No. Supporting quote for EMS guidance: Nil. [27]

National: National Policy for Patient Safety Incident Reporting and Learning in the Public Health Sector of South Africa (July 2016). Health facility focus: Yes. EMS focus: No. Supporting quote for EMS guidance: Nil. [28]

National: National Health Act, 2003 (Act no. 61 of 2003): National Health Insurance Policy (2017). Health facility focus: Yes. EMS focus: Yes. Supporting quote for EMS guidance: "Improving access to Emergency Medical Services: 156. A uniform level of quality for Emergency Medical Services (EMS) and Facility-based Emergency Care will be provided across the country according to nationally determined norms and standards in relation to the level of care, staffing requirements, prescribed equipment, suitability of response vehicles and ambulances and other relevant components based on the level of care." [29]

National: National Health Act, 2003 (Act no. 61 of 2003): Emergency Medical Service Regulation (December 2017). Health facility focus: No. EMS focus: Yes. Supporting quotes for EMS guidance: "Consideration of application for Licence: (c) the need to promote quality services which are accessible, affordable, cost-effective and safe; (h) where applicable, the quality of health services rendered by the applicant in the past"; "Management of Emergency Medical Service: (b) ensure that the Emergency Medical Service is operated in a way that provides quality care and does not compromise the safety of the public, patient or personnel; (t) ensure that there are mechanisms in place for the management of complaints, consultation, clinical governance and quality assurance". [30]

National: Professional Board for Emergency Care Clinical Practice Guidelines (2018). Health facility focus: No. EMS focus: Yes. Supporting quote for EMS guidance: "Important Additional notes: All interventions and medications are to be performed and administered within the Clinical Practice Guidelines and a locally relevant standard of care. Clinical governance structures shall support these guidelines." [31]

Western Cape: Western Cape Ambulance Services Act, 2003 (2003). Health facility focus: No. EMS focus: Yes. Supporting quote for EMS guidance: "Norms, standards and quality assurance 7. (1) The MEC shall prescribe minimum norms and standards for the delivery of ambulance services which will include— equitable access; the use of volunteers; personnel, vehicle and equipment requirements; communication and co-ordination procedures; and systems to receive, investigate and remedy complaints." [32]

Western Cape: Healthcare 2030 (2014). Supporting quote for EMS guidance: "Emergency Medical Services: EMS district managers will closely support district health managers by providing EMS-related data for monitoring and evaluation. International benchmarking and best practice establish that EMS is best delivered as a provincial service rather than a local service." [33]

Western Cape: Western Cape Government Health Annual Report (2018). Health facility focus: Yes. EMS focus: Yes. Supporting quote for EMS guidance: "Reported indicators: EMS P1 urban response under 15 min rate; EMS inter-facility transfer rate; Total number of EMS emergency cases." [34]

KwaZulu Natal: KwaZulu Natal Department of Health Strategic Plan 2015–2019 (2015). Health facility focus: Yes. EMS focus: Yes. Supporting quote for EMS guidance: "Priority 2: Improve the Efficiency of Emergency Medical Services: Governance structures will be strengthened, and training of managers will be prioritized to improve management and quality. Appropriate ICT infrastructure (including mobile data terminals) and computers will be installed at all ambulance bases to ensure access to on-line facilities to improve data accuracy and availability. An appropriate electronic patient booking system will be introduced to improve appropriate response to emergency calls." [35]

KwaZulu Natal: Quality improvement Intervention based on Patients Safety Incident (PSI) (2016). Health facility focus: Yes. EMS focus: No. Supporting quote for EMS guidance: Nil. [36]

KwaZulu Natal: KwaZulu Natal Department of Health Annual Report (2018). Health facility focus: Yes. EMS focus: Yes. Supporting quote for EMS guidance: "Reported indicators: Total number of EMS clients; Total number of interfacility transfers; Percentage of response times to red codes (P1) within 15 mins for urban areas; Percentage of response times to red codes (P1) within 40 mins for rural areas; Cases attended to by Air Ambulance Services; Aeromedical Services utilisation per district; Ambulances per 10 000 population." [37]

North West Province: North West Department of Health Strategic Plan 2015–2019 (2015). Health facility focus: Yes. EMS focus: Yes. Supporting quote for EMS guidance: "Strategic Goal 2: Improve the quality of care by setting and monitoring national norms and standards, improving systems for user feedback, increasing safety in health care, and by improving clinical governance." [38]

North West Province: North West Department of Health Annual Report (2018). Health facility focus: Yes. EMS focus: Yes. Supporting quote for EMS guidance: "Reported indicators: EMS operational ambulance coverage; EMS P1 urban response under 15 min rate; EMS P1 rural response under 40 min rate; EMS inter-facility transfer rate." [39]

Limpopo: Limpopo Department of Health Annual Report (2018). Health facility focus: Yes. EMS focus: Yes. Supporting quote for EMS guidance: "Reported indicators: Ratio of ambulances per population; Number of ambulances procured; EMS P1 urban response under 15 min rate; EMS P1 rural response under 40 min rate; EMS inter-facility transfer rate." [40]
Of the EMS-focused documents, all were limited to high-level/strategic 'statements' regarding quality or patient safety. None of the documents found reported any measures of clinical quality, with the focus restricted solely to call times and call volumes. Furthermore, no policy-related documents were found that outlined minimum standards or provided steps towards the development and/or implementation of a quality system or clinically focused QIs for EMS.

Discussion

Healthcare organisational case studies have been identified as an important methodological approach towards describing the factors facilitating and impeding quality systems.18 This was echoed in our study, where several broad observations were made regarding EMS quality systems in SA. From a system structure perspective, a centralised approach with appropriate and engaged senior/executive level management established responsibility for the system and facilitated greater control over its direction, whereas decentralisation hampered collection and reporting and, as a consequence, accountability. Leadership has previously been identified as an essential component of health quality systems, a factor present in this study as both a driver of success when incorporated and a barrier when inadequate or unaccounted for.11 19–21 The lack of a cohesive vision and/or mission regarding quality, and of leadership to develop and drive these concepts, has also been associated with organisations that consistently struggle to improve quality; both were similarly lacking or poorly developed within the services assessed in this study.21 Factors associated with infrastructure, support and capacity have also been identified as key drivers of success of quality systems in healthcare.19–21 While structure was among the highest scored attributes in the participating services' assessments, insufficient capacity was often identified as a weak link in this study. The combination of leadership and capacity has been described as a primary driver of a quality culture in healthcare quality systems, another component reported as both an enabler of high-quality systems when present and a barrier to their success when absent.19–21 Given the lack of each of these components in the participating services, it is unsurprising that culture did not feature as a common observation or discussion point within the assessments and interviews.
All participating services were limited in their measurement of adverse events, technical quality of care and patient-reported measures, with the primary focus largely centred on time-based measures. This contrasts with the increasing focus on non-time-based measures of quality evident in the literature.22 This limitation was widely acknowledged and partially justified by the perceived purpose of EMS and what was understood to be the mandate of these services in SA. Non-time-based measures of safety and quality have previously provided a strong base from which focused quality improvement programmes have led to meaningful improvements in patient outcomes in the PEC setting. The lack of such measures could in part explain the generally poor results observed regarding quality improvement in this study. Resources and technology were a common feature of the interviews as a potential driver of improvement in quality systems. Notably, there was limited discussion of the perceived benefits of technology during the evaluation of the WC, the only service using a computer-aided dispatch system and electronic patient records. Technology nonetheless remained the specific solution identified by the remaining services as the answer to many of the problems they faced regarding quality. These contrasting views are evident in the literature, where the importance of technological resources has often been debated, and where a lack of consensus regarding their influence and status has led to their role in quality systems being described as 'probationary'.19 20 There was little to no supporting documentation in the way of national policies and/or guidelines for EMS on implementing quality systems, measuring quality or reporting performance. Furthermore, there was a general lack of policy outlining minimum standards for EMS quality systems altogether.
This was evident in the variation in the results of the quality programme assessment and further highlights the need for such guidance. To be effective in both implementation and use, it is essential that appropriate high-level guidance and minimum standards regarding quality systems be outlined as a driver for change.23 24 To deliver safe, high-quality care, the system or mechanism responsible for monitoring and maintaining this process must be equally efficient and effective. Understanding the factors affecting this process is essential to identifying areas and priorities for improvement within the system. The outcomes of this study provide a base from which the factors affecting quality systems in EMS in SA can be addressed. However, as systems evolve and mature in their approach towards quality and safety, so will the factors that affect the success of the system. As such, quality system evaluation should become a regular, scheduled component of the system itself. To this end, our study has described one approach that can be used as an objective, repeatable measure of quality system development.

Limitations

The nature of the questions that case study research in general, and this article in particular, attempts to answer limits the overall extent to which the results are generalisable and/or reproducible. We attempted to address this through the previously described approach to enhancing the validity and trustworthiness of the methodology. Despite this, the results of this study need to be understood within the context in which they were studied, with an appreciation of the impact this context has on the observations and their broader potential implications. While the specific observations found in this study may not be generalisable, the outcomes are nonetheless consistent with what is known in the literature.

Conclusion

A multitude of factors were identified that affected the effectiveness of quality systems, centred on leadership, vision and mission, and quality system infrastructure and capacity, guided by the need for comprehensive yet pragmatic strategic policies and standards. Understanding and accounting for these factors will be key to ensuring both successful implementation and ongoing utilisation of healthcare quality systems in PEC in SA. The result will not only be a more efficient and effective service, but also improved patient safety and quality of the care delivered.
References

1. Harteloh PPM. Quality systems in health care: a sociotechnical approach. Health Policy 2003.

2. Harteloh PPM. The meaning of quality in health care: a conceptual analysis. Health Care Anal 2003.

3. Hsieh H-F, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res 2005.

4. Howard IL, Bowen JM, Al Shaikh LAH, Mate KS, Owen RC, Williams DM. Development of a trigger tool to identify adverse events and harm in Emergency Medical Services. Emerg Med J 2017.

5. Howard I, Cameron P, Wallis L, Castrén M, Lindström V. Quality indicators for evaluating prehospital emergency care: a scoping review. Prehosp Disaster Med 2017.

6. Baker GR. The contribution of case study research to knowledge of how to improve quality of care. BMJ Qual Saf 2011.

7. Stang AS, Wingert AS, Hartling L, Plint AC. Adverse events related to emergency department care: a systematic review. PLoS One 2013.

8. Hesselink G, Berben S, Beune T, Schoonhoven L. Improving the governance of patient safety in emergency care: a systematic review of interventions. BMJ Open 2016.

9. Vaughn VM, Saint S, Krein SL, Forman JH, Meddings J, Ameling J, Winter S, Townsend W, Chopra V. Characteristics of healthcare organisations struggling to improve quality: results from a systematic review of qualitative studies. BMJ Qual Saf 2018.

10. Howard I, Cameron P, Wallis L, Castrén M, Lindström V. Identifying quality indicators for prehospital emergency care services in the low to middle income setting: the South African perspective. Afr J Emerg Med 2019.