Literature DB >> 24671121

The use of on-site visits to assess compliance and implementation of quality management at hospital level.

C Wagner1, O Groene, M Dersarkissian, C A Thompson, N S Klazinga, O A Arah, R Suñol.   

Abstract

OBJECTIVE: Stakeholders of hospitals often lack standardized tools to assess compliance with quality management strategies and the implementation of clinical quality activities in hospitals. Such assessment tools, if easy to use, could be helpful to hospitals, health-care purchasers and health-care inspectorates. The aim of our study was to determine the psychometric properties of two newly developed tools for measuring compliance with process-oriented quality management strategies and the extent of implementation of clinical quality strategies at the hospital level.
DESIGN: We developed and tested two measurement instruments that could be used during on-site visits by trained external surveyors to calculate a Quality Management Compliance Index (QMCI) and a Clinical Quality Implementation Index (CQII). We used psychometric methods and the cross-sectional data to explore the factor structure, reliability and validity of each of these instruments.
SETTING AND PARTICIPANTS: The sample consisted of 74 acute care hospitals selected at random from each of 7 European countries.
MAIN OUTCOME MEASURES: The psychometric properties of the two indices (QMCI and CQII).
RESULTS: Overall, the indices demonstrated favourable psychometric performance based on factor analysis, item correlations, internal consistency and hypothesis testing. Cronbach's alpha was acceptable for the scales of the QMCI (α: 0.74-0.78) and the CQII (α: 0.82-0.93). Inter-scale correlations revealed that the scales were positively correlated, but distinct. All scales added sufficient new information to each main index to be retained.
CONCLUSION: This study has produced two reliable instruments that can be used during on-site visits to assess compliance with quality management strategies and implementation of quality management activities by hospitals in Europe and perhaps other jurisdictions.

Entities:  

Keywords:  audit; hospital; implementation; on-site visits; quality management

Mesh:

Year:  2014        PMID: 24671121      PMCID: PMC4001692          DOI: 10.1093/intqhc/mzu026

Source DB:  PubMed          Journal:  Int J Qual Health Care        ISSN: 1353-4505            Impact factor:   2.038


Introduction

In an increasingly market-oriented health-care delivery system it is ever more important to evaluate the quality of care delivered. To this end, health-care purchasers gather and use performance information on patient experiences as well as organizational and clinical performance indicators. Despite years of effort to improve the reliability and validity of performance indicators, there are still differences in outcomes related to registration and measurement error in the administrative databases used to calculate outcome indicators. Additional information about quality management and clinical quality strategies could be a valuable adjunct to hospital assessment, especially since accreditation/certification instruments that can be used to assess quality management systems already exist [1]. An alternative to administrative data and surveys is the on-site visit (or audit). Visits involving independent auditors can verify compliance with the activities, methods and procedures used to plan, control, monitor and improve the quality of care. On-site visits are mainly used by accreditation and certification organizations, which visit an organization for a few days, reviewing documents and meeting with front-line staff. To date, easy-to-use survey instruments have not been developed for use by health-care purchasers or health-care inspectorates. Such instruments could reveal whether a hospital has appropriate quality management strategies in place, whether they are used, and whether they stimulate continuous learning and improvement. The latter is based on the Deming or Nolan quality improvement cycle, which describes the steps Plan–Do–Check/Study–Act. On-site visits offer an opportunity to discuss achievement of the more complex steps like 'Check' and 'Act'. In addition, widely used quality indicators can rarely measure the improvement structures and culture of hospital units, which can be explored in conversation during an on-site visit.
In this article we describe the development of two novel quality management indices for purchasers and other stakeholders of European hospitals. Both are developed within the DUQuE project (Deepening our understanding of Quality Improvement in Europe). One (the Quality Management Compliance Index or QMCI) focuses on compliance with existing quality management procedures at the hospital level and the other (the Clinical Quality Implementation Index or CQII) on activities that support continuous improvement of clinical indicators. This paper describes testing of the psychometric properties of the two newly developed measurement instruments (QMCI and CQII).

Methods

Setting and participants

The study took place in the context of the DUQuE project which ran from 2009 to 2013 [2]. Hospitals were sampled at random from all acute care hospitals in seven European countries: Czech Republic, France, Germany, Poland, Portugal, Spain and Turkey. Data for the QMCI and CQII were collected in the hospitals that participated in the in-depth study of the DUQuE project. In total, 74 hospitals (response rate 88%) were visited by experienced surveyors and these responses were used for the psychometric analyses.

Development of the instruments

Quality Management Compliance Index

The aim of the QMCI was to identify and verify compliance with a set of closely related methods and procedures used to plan, monitor and improve the quality of care. By design, three scales of the index were defined a priori: quality planning, monitoring the opinions of professionals and patients, and improvement of the quality of care. The rationale was that a hospital basing its quality management on a limited number of activities would need a plan informed by the opinions of front-line staff and of the users of hospital services, the patients; furthermore, strategies are needed to resolve shortcomings reported by professionals and patients. The choice of questionnaire items was based on the opinion of experts with years of experience in hospital performance evaluation during accreditation and certification audits. The main criteria for including an item were its assumed influence on the quality and safety of care, and the feasibility of verifying an answer to it. Face validity was established through review by 10 experts of the DUQuE project and a pilot test in two hospitals. All items of the QMCI (n = 15) were rated on a five-point Likert scale, from 'no or negligible compliance' (0) to 'full compliance' (4).

Clinical Quality Implementation Index

The purpose of the CQII is to test clinical quality systems and seek evidence of their implementation at the hospital level. The CQII was designed to measure the extent to which efforts regarding key clinical quality areas are implemented across the hospital. Following Bate and Mendel [3], each quality effort is assessed at three levels of development: (i) Do quality efforts regarding the key areas exist (i.e. is there a responsible group and a hospital protocol)? (ii) To what extent are these efforts monitored (i.e. with regard to compliance and improvement measurements)? (iii) To what extent is the sustainability of these efforts monitored? The key clinical areas stem from the quality functions described in most accreditation systems, as well as the recommendations of the WHO Patient Safety Alliance, covering most of the key hospital clinical and safety areas. In total, seven areas were selected: (i) preventing hospital infection, (ii) medication management, (iii) preventing patient falls, (iv) preventing pressure ulcers, (v) routine assessment and diagnostic testing of patients in elective surgery, (vi) safe surgery that includes an approved checklist and (vii) preventing deterioration and advanced life support (i.e. rapid response teams, resuscitation programmes). As a whole, the audit instrument comprised 35 items across the seven clinical areas, rated on a five-point Likert scale varying, for example, from 'no compliance' to 'full compliance'; when no information was available, 'not applicable' could be selected. To obtain more meaningful groupings, the answer categories were recoded to a scale of 1–3: no, negligible or low compliance was coded as 1, medium compliance as 2, and high, extensive or full compliance as 3. The seven clinical areas were chosen because evidence exists on how to prevent the associated unsafe practices and adverse outcomes; by following existing guidelines, patient harm might be prevented and patient safety improved.
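The recoding step described above can be sketched as follows. This is only an illustrative helper (`recode_cqii` is a hypothetical name, not from the study), assuming the five original ratings run from 0 (no/negligible compliance) to 4 (full compliance):

```python
# Illustrative sketch of the CQII recoding rule (hypothetical helper,
# not the authors' code): collapse the five-point compliance ratings
# into three more meaningful groups.
def recode_cqii(rating):
    """Map a 0-4 compliance rating to the 1-3 CQII scale."""
    if rating is None:          # 'not applicable' stays missing
        return None
    if rating <= 1:             # no, negligible or low compliance
        return 1
    if rating == 2:             # medium compliance
        return 2
    return 3                    # high, extensive or full compliance

assert [recode_cqii(r) for r in [0, 1, 2, 3, 4]] == [1, 1, 2, 3, 3]
```

The mapping of the five verbal anchors onto the 0–4 codes is an assumption; only the three target groups are stated in the text.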

Data collection

The QMCI and the CQII were designed as data collection tools for use during on-site visits by experienced surveyors who had previous experience in hospital accreditation but had no relationship with the hospitals. Data were collected in the 74 hospitals between May 2011 and February 2012. In total, 14 external surveyors (2 in each country) collected data. The surveyors were trained on the main aspects to be assessed and on the scoring system. A data collection manual was developed to provide guidance and ensure homogeneity of data collection. Data were first gathered on paper, then entered into an online database system and checked by the country coordinator for missing data. Every hospital was visited by two surveyors for 1 day. No hospital professionals were made aware of the contents of the visit beforehand.

Data analysis

We began the analysis by describing the sample of hospitals that provided data from external visits. We then investigated the factor structure, reliability and construct validity of the QMCI and CQII using standard psychometric methods, conducting principal component, confirmatory factor, reliability coefficient, corrected item-total correlation and inter-scale correlation analyses separately for each index. There were no missing values for any of the items, as data collected from hospitals participating in external visits were complete. Since we had external-visit data from only 74 hospitals and factor analysis requires 5–10 observations per variable, we did not split the data into two parts for factor analysis. We explored the factor structure using principal component analysis with oblique (promax) rotation, extracting factors with eigenvalues >1 and three or more item loadings. Items were assigned to the factor on which they loaded highest, and only items with loadings ≥0.3 were retained; the one exception was the single item assessing the 'quality planning' domain of the QMCI, which was considered theoretically important for assessing quality compliance. We then used confirmatory factor analysis to examine whether the data supported the final factor structure, where a root mean square residual <0.05 and a non-normed fit index >0.9 indicated good fit. We used Cronbach's alpha to assess the internal consistency reliability of each factor, with values of 0.7 or higher considered acceptable. Item-total correlations corrected for item overlap were used to examine the homogeneity of each scale; coefficients of 0.4 or greater suggested adequate scale homogeneity. Lastly, we assessed the degree of redundancy between scales using inter-scale correlation coefficients, where a Pearson's correlation coefficient <0.7 was indicative of non-redundancy.
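The reliability statistics used here can be illustrated in a few lines. The study itself used SAS; this NumPy sketch with invented toy data is only meant to show the computations behind Cronbach's alpha and the corrected item-total correlations:

```python
import numpy as np

def cronbach_alpha(X):
    """Cronbach's alpha for one scale; rows = hospitals, cols = items."""
    X = np.asarray(X, dtype=float)
    k = X.shape[1]
    return k / (k - 1) * (1 - X.var(axis=0, ddof=1).sum()
                          / X.sum(axis=1).var(ddof=1))

def corrected_item_total(X):
    """Item-total correlations corrected for overlap: each item is
    correlated with the sum of the *other* items in the scale."""
    X = np.asarray(X, dtype=float)
    total = X.sum(axis=1)
    return np.array([np.corrcoef(X[:, j], total - X[:, j])[0, 1]
                     for j in range(X.shape[1])])

# Toy data: 74 'hospitals', 4 items driven by one shared quality signal.
rng = np.random.default_rng(0)
items = rng.normal(size=(74, 1)) + rng.normal(scale=0.8, size=(74, 4))
print(round(cronbach_alpha(items), 2))       # comfortably above the 0.7 cut-off
print(corrected_item_total(items).round(2))  # each compared against 0.4
```

The 0.7 (alpha) and 0.4 (item-total) thresholds are the ones stated in the text; the data-generating model is purely illustrative.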
Once the psychometric evaluations of the QMCI and CQII were completed and a final factor structure was established, we computed a score for each scale comprising these indices by taking the mean of the items retained for that scale in the factor analysis. These sub-scale scores were then summed to build each final index. For the CQII we subtracted the number of scales from the sum in order to bring the lower bound of the index down to 0. To assess construct validity, we used Pearson's correlation coefficient to examine the relationship between the CQII and QMCI. We also provide descriptive statistics on the final indices, the sub-scales aggregated to build them and the items that comprise the sub-scales. All statistical analyses were carried out in SAS (version 9.3, SAS Institute, Inc., NC, 2012).
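The index scoring can be sketched with toy data (hypothetical variable names; the actual analysis was done in SAS): each sub-scale score is the mean of its items, the index is the sum of sub-scale scores, and for the CQII the number of scales is subtracted so the index starts at 0.

```python
import numpy as np

rng = np.random.default_rng(1)

# QMCI: 4 sub-scales with 1, 6, 4 and 4 items, each rated 0-4.
# The sum of the four sub-scale means lies in 0-16, matching the
# reported QMCI range.
qmci_scales = [rng.integers(0, 5, size=(74, k)) for k in (1, 6, 4, 4)]
qmci = sum(s.mean(axis=1) for s in qmci_scales)

# CQII: 7 sub-scales of 5 items each, recoded to 1-3. The sum of the
# seven means lies in 7-21; subtracting the 7 scales shifts it to 0-14.
cqii_scales = [rng.integers(1, 4, size=(74, 5)) for _ in range(7)]
cqii = sum(s.mean(axis=1) for s in cqii_scales) - 7

print(qmci.min() >= 0 and qmci.max() <= 16)   # True
print(cqii.min() >= 0 and cqii.max() <= 14)   # True
```

The item counts per sub-scale match the final factor structures reported in Tables 2 and 5; the random scores are placeholders for the surveyors' ratings.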

Results

Hospital characteristics

Across the 7 countries, 74 hospitals participated in this in-depth part of the DUQuE project. The teaching status was evenly balanced between non-teaching (55%) and teaching (45%). Most hospitals were publicly owned (80%) and comprised 501–1000 beds (42%) (Table 1).
Table 1

Characteristics of hospitals participating in the analysis

Characteristic                            n     %
Number of hospitals                       74    100
Teaching status, n (%)
  Teaching                                33    45
  Non-teaching                            41    55
Ownership, n (%)
  Public hospitals                        59    80
  Private (or mixed ownership)            15    20
Approximate number of beds in hospital
  <200                                    7     9
  200–500                                 22    30
  501–1000                                31    42
  >1000                                   14    19

Quality Management Compliance Index

The QMCI was designed to measure compliance in three domains: quality planning (1 item), quality control and monitoring (12 items) and improving quality by staff development (5 items), but factor analysis revealed four factors instead of the proposed three, as can be seen in Table 2. The quality planning factor comprised one item, as this domain was assessed with only one question in our questionnaire. The other three QMCI sub-scales were monitoring of patient/professional opinions, monitoring of quality systems and improving quality by staff development. Factor analysis revealed that the items initially included in the quality control and monitoring domain actually clustered on two distinct factors: monitoring of the opinions of patients and professionals, and monitoring of quality systems. This distinction is meaningful and was retained. Three items from the questionnaire did not load on any of the factors and were excluded.
Table 2

Factor loadings, Cronbach's alpha and corrected item-total correlations of QMCI (n = 74)

Scale and items of QMCI | Factor loading on primary scale | Corrected item-total correlation (Cronbach's α given per scale)
Quality planning
 Q1 The hospital (management) board approved an annual programme for quality improvement in 2010
Monitoring of patient/professional opinions (α = 0.742)
 Q2 The results of patient satisfaction surveys were formally reported to the hospital (management) board in 2010 | 0.534 | 0.450
 Q3 The hospital (management) board received results of surveys of staff satisfaction in 2010 | 0.522 | 0.411
 Q4 Patient incidents and adverse events are analysed and evaluated | 0.585 | 0.491
 Q5 Patients' opinion/perception is measured and evaluated | 0.553 | 0.474
 Q6 Patient complaint system is available and/or evaluated | 0.534 | 0.440
 Q7 Professional opinion/perception is measured and evaluated | 0.692 | 0.606
Monitoring of quality systems (α = 0.783)
 Q8 The hospital (management) board received regular, formal reports on quality and safety in 2010 | 0.720 | 0.593
 Q9 Medical leaders received regular, formal reports on quality and safety in 2010 | 0.763 | 0.651
 Q10 There is an active clinical guidelines register | 0.671 | 0.607
 Q11 Guideline application is measured and evaluated | 0.577 | 0.505
Improving quality by staff development (α = 0.756)
 Q12 The hospital maintains a record for each member of the medical staff that contains a copy of documents related to license, education, experience and certification | 0.876 | 0.674
 Q13 The hospital maintains a record for each member of the nursing staff that contains a copy of documents related to license, education, experience and certification | 0.847 | 0.623
 Q14 The performance of all individual medical staff members is formally reviewed to determine continued competence to provide patient care services | 0.553 | 0.545
 Q15 The performance of all nursing staff members is formally reviewed to determine continued competence to provide patient care services | 0.402 | 0.387
The factors of the QMCI yielded acceptable internal consistency (Cronbach's alpha between 0.74 and 0.78). None of the corrected item-total correlations were below 0.4, except for one item in the improving quality by staff development sub-scale, indicating that the items contribute to distinguishing high from low scores on their factors. The inter-scale correlations, presented in Table 3, had a maximum of 0.52, which is below the threshold of 0.70. This indicates that the QMCI is indeed a multi-dimensional construct with sub-scales addressing distinct aspects of quality management. All sub-scales correlated notably with the overall index, meaning that each contributes to the QMCI.
Table 3

Inter-scale correlation coefficients of QMCI and scales with overall construct (n = 74)

Scale | 1. | 2. | 3. | 4. | QMCI
1. Quality planning | 1 | | | | 0.78
2. Quality control and monitoring of patient/professional opinions | 0.317 | 1 | | | 0.69
3. Quality control and monitoring of quality systems | 0.520 | 0.475 | 1 | | 0.78
4. Improving quality by staff development | 0.142 | 0.303 | 0.145 | 1 | 0.52

Note. The numbers in the first row correspond to the scales in the first column.

Descriptive statistics for the QMCI, its sub-scales and the items that comprise them are presented in Table 4. The QMCI has a final scale range of 0–16. Six of the 15 items had a median score of 4 (range 0–4), which is also reflected in the high ceiling ratio of these items. In contrast to most other items, the third item of the improving quality by staff development sub-scale was an exception: it had a low median (zero) and a high floor ratio.
Table 4

Distribution of item and scale scores of QMCI (n = 74)

Scale and items of QMCI (range 0–4) | Median (IQR)a | Floor (% with lowest score) | Ceiling (% with highest score)
Quality Management Compliance Index (QMCI)b (range 0–16) | 10 (3.2)
Quality planning; mean (SD) | 2.9 (1.4)
 Q1 The hospital (management) board approved an annual programme for quality improvement in 2010 | 4 (2) | 14 | 58
Monitoring of patient/professional opinions; mean (SD) | 2.7 (0.8)
 Q2 The results of patient satisfaction surveys were formally reported to the hospital (management) board in 2010 | 4 (2) | 7 | 57
 Q3 The hospital (management) board received results of surveys of staff satisfaction in 2010 | 2 (4) | 34 | 28
 Q4 Patient incidents and adverse events are analysed and evaluated | 3 (3) | 19 | 34
 Q5 Patients' opinion/perception is measured and evaluated | 4 (1) | 1 | 65
 Q6 Patient complaint system is available and/or evaluated | 4 (1) | 1 | 69
 Q7 Professional opinion/perception is measured and evaluated | 2 (4) | 35 | 34
Monitoring of quality systems; mean (SD) | 2.1 (1.1)
 Q8 The hospital (management) board received regular, formal reports on quality and safety in 2010 | 3 (2) | 14 | 39
 Q9 Medical leaders received regular, formal reports on quality and safety in 2010 | 3 (2) | 15 | 39
 Q10 There is an active clinical guidelines register | 2 (4) | 27 | 27
 Q11 Guideline application is measured and evaluated | 1 (2) | 32 | 18
Improving quality by staff development; mean (SD) | 2.4 (1.0)
 Q12 The hospital maintains a record for each member of the medical staff that contains a copy of documents related to license, education, experience and certification | 4 (2) | 3 | 61
 Q13 The hospital maintains a record for each member of the nursing staff that contains a copy of documents related to license, education, experience and certification | 4 (2) | 3 | 61
 Q14 The performance of all individual medical staff members is formally reviewed to determine continued competence to provide patient care services | 0 (2) | 61 | 18
 Q15 The performance of all nursing staff members is formally reviewed to determine continued competence to provide patient care services | 2 (4) | 34 | 38

aMedian (IQR) presented for individual question items.

bQMCI is the sum of all 4 sub-scales, range: 0–16.


Clinical Quality Implementation Index

The CQII aimed to assess three levels of implementation: existence of a protocol, monitoring of compliance, and sustainability through measuring and using indicators to keep an improvement focus. Factor analysis revealed, however, that the items did not group into these dimensions. Instead, the factors grouped according to the different clinical areas (Table 5), suggesting that the levels of clinical implementation are not at the same stage across clinical areas; rather, the levels of development coexist and together reflect the implementation of a given area. We therefore used the items to describe clinical implementation as a single score for each area. The seven sub-scales retained by factor analysis were preventing hospital infection, medication management, preventing patient falls, preventing patient ulcers, routine testing of elective surgery patients, safe surgery practices and preventing deterioration. The resulting seven-factor structure showed high factor loadings, Cronbach's alphas ranging from 0.82 to 0.93, and high corrected item-total correlations. The inter-scale correlations, presented in Table 6, had a maximum of 0.59, which is below the threshold of 0.70. This indicates that the CQII is a multi-dimensional construct.
Table 5

Factor loadings, Cronbach's alpha and corrected item-total correlations of CQII (n = 74)

Scale and items of CQII | Factor loading on primary scale | Corrected item-total correlation (Cronbach's α given per scale)
Preventing hospital infection (α = 0.817)
 C1 Responsible group exists | 0.574 | 0.522
 C2 Hospital protocol exists | 0.548 | 0.491
 C3 Extent of compliance monitoring | 0.833 | 0.789
 C4 Sustainability of the system | 0.808 | 0.693
 C5 Improvement focus | 0.719 | 0.558
Medication management (α = 0.903)
 C6 Responsible group exists | 0.671 | 0.655
 C7 Hospital protocol exists | 0.567 | 0.554
 C8 Extent of compliance monitoring | 0.954 | 0.899
 C9 Sustainability of the system | 0.938 | 0.855
 C10 Improvement focus | 0.917 | 0.845
Preventing patient falls (α = 0.898)
 C11 Responsible group exists | 0.610 | 0.590
 C12 Hospital protocol exists | 0.681 | 0.648
 C13 Extent of compliance monitoring | 0.907 | 0.859
 C14 Sustainability of the system | 0.952 | 0.890
 C15 Improvement focus | 0.850 | 0.772
Preventing patient ulcers (α = 0.879)
 C16 Responsible group exists | 0.644 | 0.612
 C17 Hospital protocol exists | 0.631 | 0.600
 C18 Extent of compliance monitoring | 0.867 | 0.810
 C19 Sustainability of the system | 0.889 | 0.807
 C20 Improvement focus | 0.804 | 0.733
Routine testing of elective surgery patients (α = 0.923)
 C21 Responsible group exists | 0.581 | 0.571
 C22 Hospital protocol exists | 0.702 | 0.679
 C23 Extent of compliance monitoring | 0.984 | 0.937
 C24 Sustainability of the system | 0.983 | 0.929
 C25 Improvement focus | 0.970 | 0.910
Safe surgery practices (α = 0.881)
 C26 Responsible group exists | 0.647 | 0.616
 C27 Hospital protocol exists | 0.537 | 0.513
 C28 Extent of compliance monitoring | 0.918 | 0.850
 C29 Sustainability of the system | 0.887 | 0.812
 C30 Improvement focus | 0.867 | 0.809
Preventing deterioration (α = 0.932)
 C31 Responsible group exists | 0.804 | 0.774
 C32 Hospital protocol exists | 0.787 | 0.757
 C33 Extent of compliance monitoring | 0.905 | 0.868
 C34 Sustainability of the system | 0.891 | 0.850
 C35 Improvement focus | 0.895 | 0.855
Table 6

Inter-scale correlation coefficients (n = 74)

Scale | 1 | 2 | 3 | 4 | 5 | 6 | 7
1. Preventing hospital infection | 1.000
2. Medication management | 0.585 | 1.000
3. Preventing patient falls | 0.130 | −0.019 | 1.000
4. Preventing patient ulcers | 0.249 | 0.285 | 0.391 | 1.000
5. Routine testing of elective surgery patients | 0.170 | 0.204 | 0.168 | 0.371 | 1.000
6. Safe surgery practices | 0.364 | 0.426 | 0.227 | 0.182 | 0.276 | 1.000
7. Preventing deterioration | 0.371 | 0.258 | 0.285 | 0.053 | 0.160 | 0.452 | 1.000
Overall construct CQII | 0.59 | 0.60 | 0.55 | 0.60 | 0.57 | 0.70 | 0.63
The CQII has a final scale range of 0–14. The distribution of the scores (Table 7) shows that prevention of hospital infection stands out, with a very high average score and a ceiling ratio of over 80% for all of its items. For other items, quite a number of hospitals have the highest or lowest score (ceiling or floor effects). Around two-thirds of the hospitals have a low score on the routine testing of elective surgery patients sub-scale.
Table 7

Distribution of item, scale and index scores (n = 74)

Scale and items (range 1–3) | Median (IQR)a | Floor (% with lowest score) | Ceiling (% with highest score)
Clinical Quality Implementation Index (CQII)b (range 0–14) | 8.3 (2.9)
Preventing hospital infection; mean (SD) | 2.8 (0.3)
 C1 Responsible group exists | 3 (0.0) | 5 | 88
 C2 Hospital protocol exists | 3 (0.0) | 1 | 80
 C3 Extent of compliance monitoring | 3 (0.0) | 1 | 93
 C4 Sustainability of the system | 3 (0.0) | 3 | 89
 C5 Improvement focus | 3 (0.0) | 5 | 86
Medication management; mean (SD) | 2.4 (0.6)
 C6 Responsible group exists | 3 (1.0) | 16 | 72
 C7 Hospital protocol exists | 3 (1.0) | 14 | 61
 C8 Extent of compliance monitoring | 3 (1.0) | 20 | 61
 C9 Sustainability of the system | 3 (1.0) | 20 | 64
 C10 Improvement focus | 3 (1.0) | 23 | 55
Preventing patient falls; mean (SD) | 2.1 (0.7)
 C11 Responsible group exists | 2 (2.0) | 50 | 39
 C12 Hospital protocol exists | 3 (1.0) | 23 | 57
 C13 Extent of compliance monitoring | 3 (2.0) | 30 | 55
 C14 Sustainability of the system | 2 (2.0) | 38 | 49
 C15 Improvement focus | 2 (2.0) | 46 | 45
Preventing patient ulcers; mean (SD) | 2.3 (0.7)
 C16 Responsible group exists | 3 (2.0) | 30 | 59
 C17 Hospital protocol exists | 3 (1.0) | 16 | 58
 C18 Extent of compliance monitoring | 3 (2.0) | 26 | 62
 C19 Sustainability of the system | 3 (2.0) | 28 | 59
 C20 Improvement focus | 3 (2.0) | 35 | 57
Routine testing of elective surgery patients; mean (SD) | 1.5 (0.7)
 C21 Responsible group exists | 1 (1.0) | 66 | 24
 C22 Hospital protocol exists | 1 (1.0) | 59 | 22
 C23 Extent of compliance monitoring | 1 (1.0) | 72 | 20
 C24 Sustainability of the system | 1 (1.0) | 72 | 19
 C25 Improvement focus | 1 (1.0) | 74 | 20
Safe surgery practices; mean (SD) | 2.1 (0.7)
 C26 Responsible group exists | 3 (2.0) | 43 | 50
 C27 Hospital protocol exists | 3 (1.0) | 20 | 59
 C28 Extent of compliance monitoring | 2 (2.0) | 42 | 46
 C29 Sustainability of the system | 2 (2.0) | 45 | 45
 C30 Improvement focus | 1 (2.0) | 51 | 35
Preventing deterioration; mean (SD) | 2.0 (0.8)
 C31 Responsible group exists | 3 (2.0) | 36 | 55
 C32 Hospital protocol exists | 2 (2.0) | 32 | 45
 C33 Extent of compliance monitoring | 2 (2.0) | 46 | 43
 C34 Sustainability of the system | 2 (2.0) | 49 | 43
 C35 Improvement focus | 2 (2.0) | 50 | 36

aMedian (IQR) presented for individual question items

bCQII is the sum of all 7 sub-scales (minus 7), range: 0–14.


Construct validity: hypothesis testing

The inter-index correlation between the QMCI and the CQII was 0.565, in line with our expectation that the two are distinct but related constructs.

Discussion

Main findings

The results suggest that at the hospital level the QMCI and CQII are reliable and valid instruments for assessing compliance with quality management procedures and the extent of several activities related to continuous improvement of clinical quality. The latter activities included having a group of professionals responsible for the clinical area, as well as a formally approved protocol and performance indicators. The initially proposed factor structure of both indices had to be adjusted based on the results of the factor analysis, but these minor adjustments did not change the theoretical constructs; instead they refined the fit of the sub-scales to the concepts of interest. Both the QMCI and CQII showed high internal consistency and appeared to be multi-dimensional constructs. The descriptive results showed that some items included in the indices may be subject to ceiling effects, with a large proportion of respondents having a positive score. Despite that, we kept these items and clinical areas in the instruments in case subsequent testing across a broader range of hospitals in other European countries reveals more variation than observed in our study.

Strength and limitations

On-site audits have the advantage of providing more objective and independent outcomes based on factual information derived, for instance, from an annual report. In contrast to self-administered questionnaires, audits can avoid potential social desirability bias in responses. The time burden for the audited organization is relatively low compared with an all-staff survey. The downside is that the documents on which the audit is based have to be reliable. Furthermore, trained surveyors are needed to conduct the audit in order to minimize variation due to inter-observer differences.

Relation with other studies

Using audit as a measurement strategy is relatively rare, although its use may grow as it proves useful in other study designs [4, 5]. More often, audit instruments are used as a tool to improve quality, especially in the form of an accreditation programme. As a recent review suggests, audits seem to lead to improved structure and process of care, and even to improved clinical outcomes [6].

Conclusion

The two indices we developed and evaluated have the potential for use in research and in routine practice to help hospitals focus on quality and safety issues and follow the Plan–Do–Check/Study–Act quality improvement cycle. The instruments can also be used by purchasers, policy-makers or health-care inspectorates that want to assess the implementation of quality management at hospital level in a more standardized way. The QMCI focuses on the core elements of a quality system, while the CQII focuses on clinical areas directly related to patient care at ward level. Future research is needed to investigate the relationship between these novel quality measurement tools and other indicators, including patient outcomes.

Funding

Deepening our Understanding of Quality Improvement in Europe (DUQuE) has received funding from the European Community's Seventh Framework Programme (FP7/2007–2013) under grant agreement number 241822. Funding to pay the Open Access publication charges for this article was provided by European Community's Seventh Framework Programme (FP7/2007–2013) under grant agreement no. 241822.

Review 1.  A systematic review of instruments that assess the implementation of hospital quality management systems.

Authors:  Oliver Groene; Daan Botje; Rosa Suñol; Maria Andrée Lopez; Cordula Wagner
Journal:  Int J Qual Health Care       Date:  2013-08-22       Impact factor: 2.038

2.  The Treatment of cardiovascular Risk in Primary care using Electronic Decision supOrt (TORPEDO) study-intervention development and protocol for a cluster randomised, controlled trial of an electronic decision support and quality improvement intervention in Australian primary healthcare.

Authors:  David Peiris; Tim Usherwood; Katie Panaretto; Mark Harris; Jenny Hunt; Bindu Patel; Nicholas Zwar; Julie Redfern; Stephen Macmahon; Stephen Colagiuri; Noel Hayman; Anushka Patel
Journal:  BMJ Open       Date:  2012-11-19       Impact factor: 2.692

Review 3.  Impact of accreditation on the quality of healthcare services: a systematic review of the literature.

Authors:  Abdullah Alkhenizan; Charles Shaw
Journal:  Ann Saudi Med       Date:  2011 Jul-Aug       Impact factor: 1.526

4.  Deepening our understanding of quality improvement in Europe (DUQuE): overview of a study of hospital quality management in seven countries.

Authors:  Mariona Secanell; Oliver Groene; Onyebuchi A Arah; Maria Andrée Lopez; Basia Kutryba; Holger Pfaff; Niek Klazinga; Cordula Wagner; Solvejg Kristensen; Paul Daniel Bartels; Pascal Garel; Charles Bruneau; Ana Escoval; Margarida França; Nuria Mora; Rosa Suñol
Journal:  Int J Qual Health Care       Date:  2014-03-25       Impact factor: 2.038

5.  Economic evaluation of Australian acute care accreditation (ACCREDIT-CBA (Acute)): study protocol for a mixed-method research project.

Authors:  Virginia Mumford; David Greenfield; Reece Hinchcliff; Max Moldovan; Kevin Forde; Johanna I Westbrook; Jeffrey Braithwaite
Journal:  BMJ Open       Date:  2013-02-08       Impact factor: 2.692

  10 in total

1.  DUQuE quality management measures: associations between quality management at hospital and pathway levels.

Authors:  Cordula Wagner; Oliver Groene; Caroline A Thompson; Maral Dersarkissian; Niek S Klazinga; Onyebuchi A Arah; Rosa Suñol
Journal:  Int J Qual Health Care       Date:  2014-03-09       Impact factor: 2.038

2.  The associations between organizational culture, organizational structure and quality management in European hospitals.

Authors:  C Wagner; R Mannion; A Hammer; O Groene; O A Arah; M Dersarkissian; R Suñol
Journal:  Int J Qual Health Care       Date:  2014-03-25       Impact factor: 2.038

3.  Deepening our understanding of quality improvement in Europe (DUQuE): overview of a study of hospital quality management in seven countries.

Authors:  Mariona Secanell; Oliver Groene; Onyebuchi A Arah; Maria Andrée Lopez; Basia Kutryba; Holger Pfaff; Niek Klazinga; Cordula Wagner; Solvejg Kristensen; Paul Daniel Bartels; Pascal Garel; Charles Bruneau; Ana Escoval; Margarida França; Nuria Mora; Rosa Suñol
Journal:  Int J Qual Health Care       Date:  2014-03-25       Impact factor: 2.038

4.  A checklist for patient safety rounds at the care pathway level.

Authors:  Cordula Wagner; Caroline A Thompson; Onyebuchi A Arah; Oliver Groene; Niek S Klazinga; Maral Dersarkissian; Rosa Suñol
Journal:  Int J Qual Health Care       Date:  2014-03-09       Impact factor: 2.038

5.  Development and validation of an index to assess hospital quality management systems.

Authors:  C Wagner; O Groene; C A Thompson; N S Klazinga; M Dersarkissian; O A Arah; R Suñol
Journal:  Int J Qual Health Care       Date:  2014-03-11       Impact factor: 2.038

6.  [Assessment model for evaluating the preparedness plan for COVID-19 in a tertiary care hospital].

Authors:  C Llorente-Parrado; R Mejon-Berges; Y Cossio-Gil; M S Romea-Lecumberri; A Roman-Broto; M A Barba-Flores; A Salazar-Soler
Journal:  J Healthc Qual Res       Date:  2020-10-10

7.  Implementation of Departmental Quality Strategies Is Positively Associated with Clinical Practice: Results of a Multicenter Study in 73 Hospitals in 7 European Countries.

Authors:  Rosa Sunol; Cordula Wagner; Onyebuchi A Arah; Solvejg Kristensen; Holger Pfaff; Niek Klazinga; Caroline A Thompson; Aolin Wang; Maral DerSarkissian; Paul Bartels; Philippe Michel; Oliver Groene
Journal:  PLoS One       Date:  2015-11-20       Impact factor: 3.240

8.  Deepening our Understanding of Quality in Australia (DUQuA): a study protocol for a nationwide, multilevel analysis of relationships between hospital quality management systems and patient factors.

Authors:  Natalie Taylor; Robyn Clay-Williams; Emily Hogden; Victoria Pye; Zhicheng Li; Oliver Groene; Rosa Suñol; Jeffrey Braithwaite
Journal:  BMJ Open       Date:  2015-12-07       Impact factor: 2.692

9.  Is having quality as an item on the executive board agenda associated with the implementation of quality management systems in European hospitals: a quantitative analysis.

Authors:  Daan Botje; N S Klazinga; R Suñol; O Groene; H Pfaff; R Mannion; A Depaigne-Loth; O A Arah; M Dersarkissian; C Wagner
Journal:  Int J Qual Health Care       Date:  2014-02-17       Impact factor: 2.038

10.  Evidence-based organization and patient safety strategies in European hospitals.

Authors:  Rosa Sunol; Cordula Wagner; Onyebuchi A Arah; Charles D Shaw; Solvejg Kristensen; Caroline A Thompson; Maral Dersarkissian; Paul D Bartels; Holger Pfaff; Mariona Secanell; Nuria Mora; Frantisek Vlcek; Halina Kutaj-Wasikowska; Basia Kutryba; Philippe Michel; Oliver Groene
Journal:  Int J Qual Health Care       Date:  2014-02-26       Impact factor: 2.038

