Regione Lombardia: a tool for improving quality in hospitals.

M Nobile1, E Luconi2, R Sfogliarini3, M Bersani4, E Brivio4, S Castaldi1,2.   

Abstract

INTRODUCTION: The regional healthcare system of the Lombardy Region pays great attention to monitoring the effectiveness and the quality level with which its services are provided. The aim of this paper is to describe the method adopted by the Lombardy Region to create a governance tool for the healthcare system that would be applied within hospitals to create value at the financial-economic level, to achieve continuous quality improvement and to increase patient/customer satisfaction levels. It was called Piano Integrato del Miglioramento dell'Organizzazione (PIMO), i.e. Integrated Plan for Hospital Improvement. METHODS: The approach to the definition of the PIMO was based on: the Plan-Do-Check-Act methodology; the management requirements introduced by the UNI EN ISO 9001:2008 and UNI EN ISO 9004:2005 standards; the regulations and indications issued for the Public Administration; and the guidelines for planning and monitoring improvement proposed by the CAF (Common Assessment Framework).
RESULTS: The evaluation of the scores for all the health structures shows a good level of quality and qualifies PIMO as a strategic tool for hospitals.
CONCLUSIONS: It will be necessary to allow this tool to operate for some time in order to make an overall assessment of the results achieved. ©2019 Pacini Editore SRL, Pisa, Italy.

Keywords:  Health structures; Quality improvement; Quality tool

Year:  2019        PMID: 31967098      PMCID: PMC6953444          DOI: 10.15167/2421-4248/jpmh2019.60.4.1298

Source DB:  PubMed          Journal:  J Prev Med Hyg        ISSN: 1121-2233


Introduction

A solution that has proved effective in providing quality health care efficiently, especially at higher decision-making levels, is performance management [1]. The cardinal element of performance management is quality improvement, whose aim is to promote the best possible quality level [2]. One of the most widespread strategies for pursuing quality improvement is the creation of accreditation and performance monitoring systems. Examples of performance monitoring systems in hospitals are those of the Joint Commission on Accreditation of Healthcare Organizations (JCAHO) [3] and the Australian Council on Healthcare Standards (ACHS) [4]. Since 1997, with Recommendation No. R (97) 17 “On the development and activation of systems for the improvement of the quality of healthcare”, adopted on 30 September 1997, Europe has recommended a higher quality of the health system for all Member States and the presence of an improvement system that is known and understood by all [5]. Despite this, in Europe hospital performance monitoring seems a relatively new sector in the field of healthcare sciences and hospital management [6]. In 2005, a project named PATH (Performance Assessment Tool for quality improvement in Hospitals) was promoted by the WHO Regional Office for Europe. Its goal was to develop a framework for performance monitoring and quality improvement in hospitals by creating and developing a set of monitoring indicators [7, 8]. The project involved several States and aimed to improve the quality of services in their public healthcare departments. A network system was developed to facilitate the exchange of information, to identify best practices and to pinpoint key elements in the monitoring/accreditation programmes [9].
With the entry into force of Directive 2011/24/EU on cross-border healthcare, it has become essential to monitor the quality of individual structures so that patients have guarantees concerning the quality of the facilities present in the different Member States [10]. Since 1978, Italy has had a public national healthcare service based on universal availability. Various subsequent laws have introduced reforms that gradually established the concept of company-style management [11, 12]. Starting from the end of the last century, a strong trend towards decentralization has led to an increasingly regional basis for the provision of healthcare services; as a result, healthcare services have been organized and managed in very different ways from one region to another, although the principle of a national health service was never denied. The regional healthcare system of the Lombardy Region, in Northern Italy, was approved in 1997. From its inception, it has paid great attention to monitoring the effectiveness and the quality level with which services are provided [13]. The Lombardy Region aimed to adopt a governance tool for the healthcare system that would be applied within hospitals to create value at the financial-economic level and to achieve quality improvement in internal processes. The aim of this paper is to describe the results of the self-assessment checklists completed twice a year by all the healthcare structures of the Lombardy Region in 2016 and 2017 to monitor their quality improvement.

Methods

This tool was created by the clinical and research healthcare facilities themselves through the organization of work groups, applying a bottom-up logic within a system that had acquired, over the years, solid experience in Quality Improvement. The structures themselves named the project Piano Integrato del Miglioramento Ospedaliero (PIMO), i.e. Integrated Plan for Hospital Improvement. The PIMO project is addressed to all healthcare facilities, public and private, accredited and contracted within the Regional Health Service; its main objective is to highlight the priorities for improving the quality of each organization, fixing medium- and long-term strategic objectives [14]. The self-assessment checklist of the PIMO of each single health structure is collected in a regional database (after approval by the health structure’s Strategic Direction), so it is an important monitoring tool for the Region as well as for the single structure and for the territorial level. The self-assessment checklist has 17 areas with 85 standards and 370 items (Tab. I).
Tab. I.

The 17 areas of the self-assessment checklist

Acronym | Area | Standards
AAC | Anaesthesiological and surgical assistance | sedation; anesthesia; surgical planning; surgery; post-operative care
AAS | Acquisition of equipment and supervision of contracts | appropriate use of equipment, devices and medications recommended by professional associations or, alternatively, by other authoritative sources; contracts for services entrusted to external persons
ACA | Access to care and assistance services | screening and reception; patient acceptance, hospitalization process and management of ambulatory patients; evaluation of patients with urgent needs; linguistic, cultural and structural barriers; access criteria and transfer to intensive care units
CCC | Coordination and continuity of care | coordination of care for clinical-assistance continuity; sharing of clinical and assistance information
DCR | Clinical and rehabilitative documentation | patient’s medical record; contents of the patient’s medical record; health documentation checks; symbols, codes and definitions
DIM | Discharge area | appropriate discharge of the patient; territorial network; discharge letter; follow-up instructions
EPF | Education of patients and family | assessment and registration of each patient’s educational needs; essential areas of the educational process
IDP | Information and rights of the patient | protected categories; patient information and informed consent; privacy and confidentiality
OBI | International goals for patient safety | identification of the patient; telephone and verbal communications; management of high-risk drugs; safe surgery; prevention of infections related to care practices; prevention and management of damage resulting from falls
PDC | Care process | planning of care and assistance; planning rehabilitation treatment; care for high-risk patients; high-risk processes; pain management
PGF | Drug management process | prescription and transcription of drugs: policies and procedures; requirements and criteria for acceptability of drug therapy prescriptions; identification of the qualified professionals who are authorized to prescribe or order drugs; registration of prescription and administration of drugs; drug preparation management; authorization to administer drugs; management of drug administration; regulation of self-administration of drugs and samples of medicinal specialities; monitoring and measurement of the effects of drugs on the patient; LASA (look-alike/sound-alike) drugs
PGM | Management process for improving the organization | development and dissemination of documentation; plan for the improvement of the organization and its realization; communication and feedback to the staff about information on improvement; monitoring and control activities and data analysis; guidelines for clinical practice and clinical pathways to guide clinical care; key indicators to monitor structures, processes, and clinical and managerial outcomes; management of sentinel events; reporting and management of near misses and adverse events; analysis of trends and unwanted variations; planning of information requirements
PVP | Patient assessment process | initial evaluation of the patient; timeliness of the initial evaluation process; personalized evaluations; presurgical evaluations; discharge planning; patient re-evaluation
QDP | Qualification of the staff | staffing plan; the responsibilities of each member of staff are defined in an updated document (job description); induction of newly-hired or newly-assigned staff and their evaluation; evaluation of managerial staff; evaluation of the operators belonging to the health professions and the technical-administrative area; personal file; credentials: degree of study and qualifications; training in the techniques of emergency cardiopulmonary resuscitation; training, updating and development of skills
SDI | Diagnostic imaging services | pre-diagnostic phase; diagnostic phase and reporting
SML | Laboratory medicine services | pre-analytical phase; quality controls; analytical phase; post-analytical phase
TDP | Patient transfer | patient transfer; suitability of the receiving structure; transfer letter; monitoring during the transfer; documentation of the transfer process; transport service for patients
For each item there are six possible scores: 1 (systematic application); 0.75 (applied everywhere); 0.5 (applied in part); 0.25 (applied in an experimental or initial phase); 0 (not applied); NA (Not Applicable). For documental items there are four scores: 1 (document prepared according to the content of the item); 0.5 (document partially respects the expected contents); 0 (no document); NA (Not Applicable). The Regional Health Authority of Lombardy provided the data of 4 self-assessment checklists (first semester of 2016, second semester of 2016, first semester of 2017 and second semester of 2017) for all the healthcare structures that took part in the project in the considered period: ASST (Aziende Socio Sanitarie Territoriali, i.e. hospital and community trusts), IRCCS (Istituto di Ricovero e Cura a Carattere Scientifico, i.e. Scientific Institute for Research and Healthcare) and private structures. We did not have access to the dataset of each structure collected by the Lombardy Region; we could only use the information provided in the Lombardy Region reports. The available reports contained only graphical representations of the data and, in particular, we decided to use histograms with the following information: the average self-assessment value for each type of structure (IRCCS, ASST and private structures) in the two semesters of 2016 and 2017; the percentage of zero responses for each area for the first and the second semesters of 2016 and 2017 (in this case it was not possible to distinguish between types of structure, because the histogram made no such distinction and the data were aggregated by information of interest); and the average self-assessment checklist value for each type of structure (IRCCS, ASST and private structures) for each area in the two semesters of 2016 and 2017.
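
The scoring scheme above can be sketched as a small calculation. This is a purely illustrative example, not part of the original study: the paper does not specify the aggregation formula, so the sketch assumes the checklist average is the mean of the numeric item scores with NA items excluded.

```python
# Illustrative sketch of aggregating PIMO item scores (hypothetical data).
# Assumption (not stated in the paper): the checklist average is the mean
# of the numeric item scores, with NA (Not Applicable) items excluded.

def checklist_average(item_scores):
    """Mean of the numeric scores; 'NA' items are skipped."""
    numeric = [s for s in item_scores if s != "NA"]
    return sum(numeric) / len(numeric) if numeric else None

# Hypothetical mini-checklist: four scored items and one NA item.
scores = [1, 0.75, 0.5, "NA", 1]
print(round(checklist_average(scores), 3))  # 0.812
```
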
Data were synthesized by the average value of the self-assessment checklists for each type of healthcare structure, the percentage of zero responses for each area and each semester, and the number of areas in which each type of structure obtained a higher value than the other structures for the same area. Moreover, we computed the differences between the second and the first semester of the same year, and between the corresponding semesters of different years, in the percentage of zero responses for each area. We also computed the difference between the second and the first semester of the same year in the average self-assessment value for each type of structure. To obtain the numerical values needed for the descriptive statistics cited above, for each histogram the average self-assessment values for each type of structure, the percentages of zero responses per area of interest and the average self-assessment values for each institution were obtained through the program “PDF-XChange Viewer” (Tracker Software) [15]. In particular, the program’s measurement tool allowed us to obtain the height of the bars of the histogram, from which the value of the average or of the percentage was obtained through a proportion with the measurement of the tick marks. In this study, even considering the available data, only descriptive statistics were computed, as the population was taken to be the set of health structures involved in the PIMO in the considered period. The analyses of the numerical data (obtained from the histograms as described above) were carried out using R (ver. 3.5.1, R: A Language and Environment for Statistical Computing, R Foundation for Statistical Computing, Vienna, Austria) and KNIME (KNIME Analytics Platform, ver. 3.5.3, KNIME AG, Zurich, Switzerland).
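
The proportion used to read values off the histogram bars amounts to a rule of three between a measured tick-mark interval and each measured bar height. As an illustrative sketch (the measurements below are hypothetical, not taken from the actual reports):

```python
# Sketch of recovering a numeric value from a measured histogram bar.
# The paper used PDF-XChange Viewer's measurement tool; the specific
# heights here are invented for illustration.

def bar_value(bar_height, tick_height, tick_value):
    """Rule of three: value = bar_height / tick_height * tick_value."""
    return bar_height / tick_height * tick_value

# Suppose one tick interval (worth 0.1 on the y-axis) measures 12.0 mm
# and a bar measures 106.9 mm:
print(round(bar_value(106.9, 12.0, 0.1), 3))  # 0.891
```
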

Results

In the Lombardy Region there are 27 ASST, 4 IRCCS and 86 private hospitals. For all types of healthcare structure, it was possible to evaluate the results of the 4 self-assessment checklists for the years 2016 and 2017. Table II presents the average value for ASST, IRCCS and private structures for each semester of 2016 and 2017; the percentage of zero responses for each area in each semester of the considered period is presented in Table III.
Tab. II.

The average value for self-assessment checklists of each type of healthcare facility.

                     I sem. 2016   II sem. 2016   I sem. 2017   II sem. 2017
ASST                 0.891         0.878          0.858         0.855
IRCCS                0.910         0.881          0.883         0.884
Private structures   0.887         0.880          0.879         0.882

The average value for self-assessment checklists for each type of healthcare facility (ASST (hospital and community trusts), IRCCS (Scientific Institute for Research and Healthcare) and private structures) for each semester (I sem.2016 (first semester of 2016), II sem.2016 (second semester of 2016), I sem.2017 (first semester of 2017), II sem.2017 (second semester of 2017)).

Tab. III.

The zero percentage responses for each area.

Area   I sem. 2016 (%)   II sem. 2016 (%)   I sem. 2017 (%)   II sem. 2017 (%)
AAC    0.615             0.500              0.769             0.731
AAS    1.231             0.962              1.308             1.308
ACA    1.154             1.500              2.154             2.538
CCC    1.615             1.192              2.846             3.500
DCR    0.423             0.538              1.077             0.962
DIM    4.423             3.808              4.538             4.654
EPF    1.423             1.577              2.615             2.577
IDP    1.769             1.615              2.423             2.423
OBI    1.385             1.115              1.615             1.577
PDC    0.692             0.692              1.154             1.077
PGF    1.038             1.115              1.346             1.038
PGM    2.115             2.423              2.962             3.346
PVP    2.962             1.923              2.769             2.577
QDP    3.500             4.154              4.654             5.423
SDI    2.615             2.961              5.192             5.538
SML    0.462             0.538              1.000             1.192
TDP    3.423             3.577              5.577             6.192

The zero percentage responses for each area for each semester. For the explanation of the acronyms of the areas see Table I. I sem. 2016 (first semester of 2016), II sem. 2016 (second semester of 2016), I sem. 2017 (first semester of 2017), II sem. 2017 (second semester of 2017).

The biggest difference between the second and the first semester of 2016 was for the area PVP (patient assessment process), with -1.039% in the percentage of zero responses, so in the second semester of 2016 the quality of this area increased compared to the first semester of 2016 (Tab. III). The other areas that increased their quality in the second half of 2016 were: DIM (discharge area), CCC (coordination and continuity of care), OBI (international goals for patient safety), AAS (acquisition of equipment and supervision of contracts), IDP (information and rights of the patient) and AAC (anaesthesiological and surgical assistance) (Tab. III). For PDC (care process) the percentage of zero responses was the same in the first and the second semester of 2016. In the other areas the percentage of zero responses increased between the first and the second semester of 2016 (Tab. III). Between the second and the first semester of 2017 the biggest difference was for the area PGF (drug management process) (-0.308%), so in the second semester of 2017 the quality of this area increased compared to the first semester of 2017. The other areas that increased their quality in the second half of 2017 were: PVP (patient assessment process), DCR (clinical and rehabilitative documentation), PDC, EPF (education of patients and family), AAC and OBI. For AAS and IDP the percentage of zero responses was the same in the first and the second semester of 2017. In the other areas the quality decreased between the first and the second semester of 2017. The biggest difference between the first semester of 2017 and the first semester of 2016 was for the PVP (patient assessment process) area (-0.193%), so in the first semester of 2017 its quality was better. For all the other areas there was a decrease in quality in the first semester of 2017 compared to the first semester of 2016.
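
The semester-over-semester comparison described above reduces to simple differences over the Table III percentages. As an illustrative sketch (using the 2016 values from Tab. III):

```python
# Differences between the second and the first semester of 2016 in the
# percentage of zero responses per area (values from Tab. III).
zero_pct = {
    "AAC": (0.615, 0.500), "AAS": (1.231, 0.962), "ACA": (1.154, 1.500),
    "CCC": (1.615, 1.192), "DCR": (0.423, 0.538), "DIM": (4.423, 3.808),
    "EPF": (1.423, 1.577), "IDP": (1.769, 1.615), "OBI": (1.385, 1.115),
    "PDC": (0.692, 0.692), "PGF": (1.038, 1.115), "PGM": (2.115, 2.423),
    "PVP": (2.962, 1.923), "QDP": (3.500, 4.154), "SDI": (2.615, 2.961),
    "SML": (0.462, 0.538), "TDP": (3.423, 3.577),
}

# A negative difference means fewer zero responses, i.e. improved quality.
diffs = {area: round(s2 - s1, 3) for area, (s1, s2) in zero_pct.items()}
best = min(diffs, key=diffs.get)
print(best, diffs[best])  # PVP -1.039
```

The same subtraction, applied to the 2017 columns or across corresponding semesters of the two years, yields the other differences discussed in the text.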
The only area showing an improvement in quality from the second semester of 2016 to the second semester of 2017 was PGF. For the other areas there was a decrease in the level of hospital quality. In the first semester of 2016, for 9 areas out of 17 the highest score (compared to the scores of the other types of structure in the same period for the same area) was recorded for IRCCS, for 6 areas for private structures and for 2 areas for ASST. In the second semester of 2016, for 8 areas out of 17 the highest score was recorded for private structures, for 4 areas for IRCCS and for 3 areas for ASST; for 1 area the highest average score was the same for IRCCS and ASST, and for 1 area out of 17 the maximum score was the same for private structures and IRCCS. In the first semester of 2017, for 8 areas out of 17 the highest score was recorded for IRCCS, for 7 areas for private structures and for 1 area for ASST; for 1 area the highest average score was the same for private structures and IRCCS. In the second semester of 2017, for 9 areas out of 17 the highest score was recorded for private structures, for 7 areas for IRCCS and for 1 area for ASST. In the second half of 2016 all the scores decreased for all the structures compared to the first half of the same year, and the biggest difference was for IRCCS.
In 2017 the same was true for the ASST, but the average scores of the IRCCS and of the private structures increased from the first to the second semester; the increase was greater for the private structures (Tab. II).

Discussion

The checklist as a self-assessment tool makes it possible to identify most of the areas that need a quality improvement plan enabling the standards to be achieved. It is worth emphasizing that the checklist is not merely a set of standards to be monitored: it is a planning tool. It should be conceived and implemented to bridge the gap between medium/long-term strategic decisions and the implementation tools, which are, as of now, principally aimed at the short term. Thanks to its planning function, unlike other tools, it could overcome certain limits/risks linked to the introduction of performance management systems, such as tunnel vision and a compliance attitude, since it steers the unit’s attention towards finding answers to its needs [16]. It encourages healthcare facilities to identify long-term goals as well as short-term ones, so it could overcome the risk of short-sightedness. The score differences within the same year, or in the trend over the two years, might be explained by the meetings of the working groups and by their attempts to make their scores comparable. Some peculiarities of the national and regional health services might also explain some differences and trends; for example, IRCCS must be certified to obtain Ministry of Health funding, and private structures try to be appealing to patients and to increase the level of their services. In any case, the goal of this approach of the Lombardy Region has been achieved, as all the health structures involved have shown that they take the quality level of their services into great consideration. It is worth underlining some limitations of this study: the tool we describe has been in use for only two years, and for this reason we propose our results as preliminary; moreover, the tool we presented requires for its application a health service with a quality background, so it is not feasible as a starting method to implement quality improvement.
It is quite unusual in the international scenario for quality improvement to start from a single regional health service, but this should be read in the light of the organization of the Italian National Health Service, which in 1992 and 1993 allowed the single Regional Authorities to organize their health services in compliance with the national one [17, 18]. There are many types of quality improvement tools, because they must fit the specific health service organization, as happened in the Lombardy Regional Health Authority; in many other countries, quality improvement has used different strategies, and the most successful are those able to involve all the staff and the leadership [3-5].

Conclusions

In response to the increased attention towards quality, many countries have developed improvement programmes based both on an external auditing approach, such as accreditation systems, and on an internal assessment approach, such as self-assessments, with plans merging both strategies, such as indicator-based systems. Starting from the standards proposed by the accreditation systems, performance monitoring programmes based on specific indicators have often been created. The process presented is the result of the experience gained by the Lombardy Region from 2010 onwards in the performance monitoring of all healthcare facilities, both public and private. This tool was created by the clinical and research healthcare facilities themselves through the organization of work groups, applying a bottom-up logic within a system that had acquired, over the years, solid experience in Quality Improvement. The checklist as a self-assessment tool makes it possible to identify most of the areas that need a quality improvement plan enabling the standards to be achieved. The internal monitoring system enables control of all areas and processes and identification of areas for improvement. It is worth emphasizing that the checklist is not merely a set of standards to be monitored: it is a planning tool. It should be conceived and implemented to bridge the gap between medium/long-term strategic decisions and the implementation tools, which are, as of now, principally aimed at the short term. Thanks to its planning function, unlike other tools, it could overcome certain limits/risks linked to the introduction of performance management systems, such as tunnel vision and a compliance attitude, since it steers the unit’s attention towards finding answers to its needs. It encourages healthcare facilities to identify long-term goals as well as short-term ones, so it could overcome the risk of short-sightedness.
The methodology used to support these activities responds to the need to involve first and foremost the Regional Government in drawing up programmes for quality improvement, in order to ensure consistency between the activities undertaken and the needs of the system, involving the healthcare providers directly in proposing ways to improve the systems for which they are responsible. Sharing knowledge is a cardinal aspect at both the healthcare unit and the regional level. The latter has created a dedicated network that will continue to support the improvement process. Quality improvement, especially in the healthcare field, requires multiple approaches, often in apparent contradiction with one another: strong leadership combined with a sense of participation, orientation and control, but also flexibility in implementing actions on the basis of local needs, and a willingness to learn from feedback that is constructively critical of the services provided.
