Final Report Summary - DUQUE (Deepening our understanding of quality improvement in Europe)
1. An executive summary
• This document presents the conceptual framework, methodology and results of the “Deepening our Understanding of Quality Improvement in Europe (DUQuE) Project”. The overall aim of the project was to study the relationship between organizational quality improvement systems, organizational culture, professional involvement and patient involvement in quality management and their effect on the quality of hospital care (clinical effectiveness, patient safety and patient experience).
• DUQuE was designed as a cross-sectional, multi-method study with measurements at hospital, departmental and patient level. A total of 192 hospitals from eight countries (the Czech Republic, France, Germany, Poland, Portugal, Spain, Turkey, the United Kingdom) participated in the data collection. Data from 188 hospitals were included in the analysis. A total of 25,812 measures were collected and analysed, including 9,712 professional questionnaires, 6,536 patient questionnaires, 9,021 chart reviews, 366 external visits to departments and hospitals and 177 sets of hospital administrative data. Response rates were exceptional, ranging from 74% to 99%, depending on the type of measure.
• Within the DUQuE project we developed and validated seven measures of quality management: at hospital level, (i) the quality management systems index, (ii) the quality management compliance index and (iii) the clinical quality implementation index; at departmental level, indices for (iv) specialized expertise and responsibility, (v) evidence-based organization of pathways, (vi) patient safety strategies and (vii) clinical review.
o These measures of quality management were not associated with dominant types of organizational culture (as measured by the Competing Values Framework); however, a higher degree of social capital was found in hospitals with more mature quality management systems.
o Few leading physicians and nurses reported being fully involved in the management of their hospital. More clinical leaders than frontline clinicians reported positive perceptions of teamwork and safety climate, and more physicians than nurses did so. Implementation of quality management systems was generally positively related to both teamwork and safety climate.
o Hospital-level quality management systems were only marginally associated with clinical outcome indicators; at department level, however, associations between quality management strategies and clinical process measures were substantial. For patient experience measures, we did not detect systematic trends and occasionally observed counterintuitive findings.
o A combination of accreditation and certification was a more powerful predictor of departmental organisation and clinical outcome than either assessment in isolation; however, the association varied between departments and between conditions.
o Hospital boards discussed quality performance more frequently when the CEO perceived more external pressure, and more frequent discussion of quality performance at board meetings was associated with a higher quality management system score.
• Recommendations of the project are synthesized in the document “Seven ways to improve quality and safety in your hospital”. For each of the seven strategies, we provide an overview on the underlying evidence base and suggest prompts to guide improvement efforts.
Project Context and Objectives:
2. A summary description of project context and objectives
2.1. Project Context
Research on quality in health care over the last 30 years has resulted in a considerable increase in knowledge. During this period we have seen major developments in approaches to measuring quality, assessing variations in health care delivery, implementing clinical practice guidelines based on best evidence, assessing patient satisfaction and experience and, more recently, estimating the incidence of adverse events, which led to the patient safety movement. Nevertheless, evidence on the effectiveness of organizational quality improvement systems has only started to emerge more recently [1-4].
In response to this debate, a new research line on the effectiveness of quality improvement emerged from the quality field in the last 10 to 15 years. This led to research questions such as "does quality improvement lead to better quality of care?", "which quality tools are most effective?", "how can various quality tools be integrated into a sensible quality and safety improvement programme?" or "what factors impact on the implementation of quality strategies at hospital level?" [5-6]. Previous EU projects, such as Methods of Assessing Responses to Quality Improvement Strategies (MARQuIS), added a new focus to the existing literature in that they aimed to assess the value of different quality strategies in European hospitals in order to provide the information needed by countries when contracting care for patients moving across borders. However, a major limitation of previous research is that it did not address the impact of these strategies on patients' clinical processes and outcomes [7-11]. We present here the conceptual framework, methodology, results and recommendations of the DUQuE project (Deepening our Understanding of Quality Improvement in Europe), funded by the European Commission 7th Framework Programme.
2.2. Research objectives
The overall aim of this project is to study to what extent organizational quality improvement systems, organizational culture, professional involvement and patient involvement in quality management are related to the quality of hospital care, assessed in terms of clinical effectiveness, patient safety and patient experience in a sample of European hospitals.
Specific objectives to be pursued are the following:
1. To develop and validate an index to assess the implementation of quality management systems across European hospitals
2. To investigate associations between the maturity of quality management systems and measures of organizational culture, professional involvement and patient involvement in quality management
3. To investigate associations between the maturity of quality management systems and patient level measures of clinical effectiveness, patient safety and patient experience.
4. To identify factors influencing the uptake of quality management activities by hospitals, including external pressure as enforced by accreditation, certification or external assessment programs.
References
1. Cohen AB, Restuccia JD, Shwartz M, Drake JE, Kang R, Kralovec P, Holmes SK, Margolin F, Bohr D: A survey of hospital quality improvement activities. Med Care Res Rev 2008, 65:571-95.
2. Schouten LM, Hulscher ME, van Everdingen JJ, Huijsman R, Grol RP: Evidence for the impact of quality improvement collaboratives: systematic review. BMJ. 2008 Jun 28;336(7659):1491-4.
3. Greenfield D, Braithwaite J: Health sector accreditation research: systematic review. Int J Qual Health Care 2008, 20:172-184.
4. Suñol R, Garel P, Jacquerye A: Cross-border care and healthcare quality improvement in Europe: the MARQuIS research project. Qual Saf Health Care 2009, 18:i3-i7.
5. Suñol R, Vallejo P, Thompson A, Lombarts MJMH, Shaw CD, Klazinga N: Impact of quality strategies on hospital outputs. Qual Saf Health Care 2009, 18:i62-i68.
6. Wagner C, De Bakker DH, Groenewegen PP: A measuring instrument for evaluation of quality systems. Int J Qual Health Care 1999, 11:119-30.
7. Weiner BJ, Alexander JA, Shortell SM, Baker LC, Becker M, Geppert JJ: Quality improvement implementation and hospital performance on quality indicators. Health Serv Res 2006, 41:307-34.
8. Suñol R, Vallejo P, Groene O, Escaramis G, Thompson A, Kutryba B, Garel P: Implementation of patient safety strategies in European hospitals. Qual Saf Health Care 2009, 18:i57-i61.
9. Groene O, Klazinga N, Walshe K, Cucic C, Shaw C, Suñol R: Learning from MARQUIS: the future direction of quality and safety in healthcare in the European Union. Qual Saf Health Care 2009, 18:i69-i74.
10. Lombarts MJMH, Rupp I, Vallejo P, Klazinga N, Suñol R: Differentiating between hospitals according to the "maturity" of quality improvement systems: a new classification scheme in a sample of European hospitals. Qual Saf Health Care 2009, 18:i38-i43.
11. Shaw C, Kutryba B, Crisp H, Vallejo P, Suñol R: Do European hospitals have quality and safety governance systems and structures in place? Qual Saf Health Care 2009, 18:i51-i56.
Project Results:
3. A description of the main S&T results/foregrounds
3.1. Methodology
Design
DUQuE was designed as a cross-sectional, multi-method and multi-level study in which patient-level measurements based on clinical practice indicators and questionnaires were nested in hospital departments, which were in turn nested in hospitals in eight countries. In each participating country (the Czech Republic, England (United Kingdom), France, Germany, Poland, Portugal, Spain and Turkey), 30 hospitals with more than 120 beds were randomly selected from a country list for the overall project. Additionally, data at departmental and patient level were collected in 12 of these 30 hospitals.
Inclusion criteria were: 1) countries had to have a sufficient number of hospitals in line with our sampling criteria, different approaches to financing and organizing health care, experience in conducting field tests, and a geographical distribution across the EU; and 2) candidate hospitals were general hospitals with more than 120 beds, either public or private, university or non-university, attending patients with acute myocardial infarction (AMI), stroke, hip fracture and deliveries, with a sufficient volume of cases (>30 cases every 3 months) to guarantee data collection.
When a hospital refused to participate, it was replaced with a hospital of similar characteristics in terms of size, ownership and teaching status.
Ethics and confidentiality: DUQuE fulfils all the requirements for research projects in the 7th Framework Programme of EU DG Research [12]. Ethical approval was obtained by the project coordinator from the Bioethics Committee of the Health Department of the Government of Catalonia (Spain). Each country complied with confidentiality requirements in accordance with the national legislation or standards of practice available in that country. All data were anonymous for patients, and codes were used for hospitals and countries.
Measure development: We developed a theoretical framework (Figure 1) based on existing evidence, background knowledge and methodology from relevant disciplines. The partners and experts of the project further developed the analytical framework iteratively through meetings, teleconferences and e-mails to simplify the original model and arrive at the final 13 main constructs and their expected relationships. DUQuE's conceptual model outlines the expected relationships between constructs at different levels and outcome measures in four care pathways (acute myocardial infarction, deliveries, hip fracture and stroke).
In order to select measures for each construct, literature reviews were performed by each group of DUQuE experts to identify existing measures. Explicit criteria were used to assess and select measures, including psychometric properties, level of evidence, and appropriateness in multinational studies. Whenever a validated measure existed, permission was requested from the lead authors and the measure was adopted for the DUQuE questionnaires. Where this was not the case, a new measure was developed. The final DUQuE measures and data collection methods are shown in Table 1.
Fig.1 DUQuE Conceptual model
Table 1 DUQuE Measures and data collection method
Construct | Measure assessed | Measure definition | Data collection method | Administration system
1. External pressure | Perceived External Pressure | Influence on hospital management of external factors (accreditation, contracts, press, etc.) | Questionnaire to Chief Executive Officer (CEO) | Electronically administered questionnaire
1. External pressure | External Assessment | Whether the hospital has undergone external assessment (accreditation, ISO) | Visit at hospital level performed by an external visitor | Both paper and electronically administered audit forms
2. Hospital governance | Quality orientation of the management board | Including background in quality, time allocated for quality in the meetings, etc. | Questionnaire to Chief Executive Officer (CEO) | Electronically administered questionnaire
3. Hospital-level quality management systems (QMS) | Quality Management Systems Index (QMSI) | Index to assess the implementation of the quality management system at hospital level | Questionnaire to hospital Quality Manager (QM) | Electronically administered questionnaire
3. Hospital-level quality management systems (QMS) | Quality Management Compliance Index (QMCI) | Measures compliance with quality management strategies to plan, monitor and improve the quality of care | Assessment at hospital level performed by an external visitor | Both paper and electronically administered audit forms
3. Hospital-level quality management systems (QMS) | Clinical Quality Implementation Index (CQII) | Index measuring to what extent efforts regarding key clinical quality areas are implemented across the hospital | Assessment at hospital level performed by an external visitor | Both paper and electronically administered audit forms
4. Hospital culture | Organizational Culture (CVF) | Competing Values Framework; the CVF has two dimensions: the structure of internal processes within the hospital and the orientation of the hospital to the outside world | Questionnaires to the Chair of the Board of Trustees, Chief Executive Officer, Medical Director and the highest-ranking nurse | Electronically administered questionnaire
4. Hospital culture | Social Capital | Measures common values and perceived mutual trust within the management board | Questionnaire to Chief Executive Officer | Electronically administered questionnaire
4. Hospital culture | Hospital Patient Safety Culture | Safety Attitudes Questionnaire (SAQ): measures perceptions of patient safety culture in terms of teamwork and safety climate | Questionnaires to leading physicians and nurses | Electronically administered questionnaire
5. Hospital professional involvement | Professional involvement in management | Measures leading doctors' and nurses' involvement in management, administration and budgeting, and managing medical and nursing practice | Questionnaires to leading physicians and nurses | Electronically administered questionnaire
6. Patient involvement in quality management | Patient involvement in quality at hospital level | Assesses patients' involvement in setting standards, protocols and quality improvement projects; constructs used in previous research (Groene, ENQUAL) | Questionnaire to hospital Quality Manager | Electronically administered questionnaire
7. Department quality strategies | Specialized Expertise and Responsibility (SER) | Measures whether specialized expertise and clear responsibilities are in place at pathway level | Assessment at pathway or department level performed by an external visitor | Both paper and electronically administered audit forms
7. Department quality strategies | Evidence-Based Organization of Pathways (EBOP) | Measures whether pathways are organized with regard to existing evidence | Assessment at pathway or department level performed by an external visitor | Both paper and electronically administered audit forms
7. Department quality strategies | Patient Safety Strategies (PSS) | Measures whether the most recommended safety strategies are in place at ward level | Assessment at pathway or department level performed by an external visitor | Both paper and electronically administered audit forms
7. Department quality strategies | Clinical Review (CR) | Measures whether clinical reviews are performed systematically | Assessment at pathway or department level performed by an external visitor | Both paper and electronically administered audit forms
8. Department pathway culture | Pathway Patient Safety Culture | Safety Attitudes Questionnaire (SAQ): measures perceptions of patient safety culture in terms of teamwork and safety climate | Questionnaires to physicians and nurses at pathway level | Electronically administered questionnaire
9. Professionalism | Professionalism | Measures professionals' attitudes towards professionalism and behaviour in their clinical area | Questionnaires to professionals at pathway level | Electronically administered questionnaire
10. Patient involvement in quality management | Patient involvement in quality at departmental level | Assesses patients' involvement in setting standards, protocols and quality improvement projects; constructs used in previous research (Groene, ENQUAL) | Questionnaire to the manager of care pathways or head of department | Electronically administered questionnaire
10. Patient involvement in quality management | Patient information strategies in departments | Measures whether information literature, surveys and other activities are conducted at pathway or department level | Assessment at pathway or department level performed by an external visitor | Both paper and electronically administered audit forms
11. Patient experience | Generic Patient Experience | Generic measure of patient experience (NORPEQ) | Patient survey | Paper-based questionnaire
11. Patient experience | Perceived Patient Involvement | Measures perceived involvement in care (from the Commonwealth Fund sicker patients survey) | Patient survey | Paper-based questionnaire
11. Patient experience | Hospital Recommendation | Measure of hospital recommendation (from HCAHPS) | Patient survey | Paper-based questionnaire
11. Patient experience | Perceived Continuity of Care | Measures patient-perceived discharge preparation (Health Care Transition Measure) | Patient survey | Paper-based questionnaire
12. Perceived patient safety | Perceived Patient Safety | Measures patients' perception of possible harm and its management | Patient survey | Paper-based questionnaire
13. Clinical effectiveness | Clinical effectiveness indicators for AMI, stroke, hip fracture and deliveries | A set of clinical process composite indicators selected for their high level of evidence of impact on patient outcomes | Patient clinical charts; administrative hospital data | Electronic data collection sheet
Clinical indicator development process:
A number of criteria were applied to select a bundle of 4 to 6 process and outcome clinical indicators per condition. Each individual clinical indicator was expected to capture relevant, comparable and measurable quality or patient safety information. The final indicators were based on the highest possible level of evidence, balancing the need for maximum variance between patients and minimum variance between countries against data availability across countries. Indicators previously used in cross-national clinical comparisons were prioritized. Data for each clinical indicator were collected via medical record review.
The selection of clinical indicators followed a multi-step process, initiated by a literature review leading to an overview of existing indicators according to the criteria outlined above.
To aid the selection of clinical indicator bundles for the four conditions, we invited relevant European scientific societies, individual key experts with knowledge of European health care systems, the conditions investigated and/or patient-level measures, and the eight DUQuE country coordinators to take part in a hearing process to evaluate their relevance and feasibility. Finally, we selected a total of 7 clinical indicators and 2 aggregated measures for acute myocardial infarction (AMI), 6 clinical indicators and 3 aggregated measures for deliveries, 4 clinical indicators and 1 aggregated measure for hip fracture, and 4 clinical indicators and 1 aggregated measure for stroke (Table 2).
Table 2. Clinical indicators used in DUQuE
Condition | Clinical indicator (aggregated measures marked with *) | Source | Level of evidence
Acute Myocardial Infarction (AMI) | Fibrinolytic agent administered within 75 min of hospital arrival | AHRQ | A
Acute Myocardial Infarction (AMI) | Primary percutaneous coronary intervention within 90 min | AHRQ | A/B
Acute Myocardial Infarction (AMI) | Thrombolytic therapy OR primary percutaneous coronary intervention given | See 1a & 2a | See 1a & 2a
Acute Myocardial Infarction (AMI) | * Therapy given on time | |
Acute Myocardial Infarction (AMI) | Anti-platelet drug (aspirin) prescribed at discharge | AHRQ | A
Acute Myocardial Infarction (AMI) | Beta blocker prescribed at discharge | AHRQ | A
Acute Myocardial Infarction (AMI) | Statin prescribed at discharge | AHRQ | A
Acute Myocardial Infarction (AMI) | ACE inhibitor prescribed at discharge | AHRQ | A
Acute Myocardial Infarction (AMI) | * Appropriate medications (all 4 of anti-platelet, beta blocker, statin, ACE inhibitor) prescribed at discharge | |
Deliveries | Epidural anaesthesia applied within 1 hour after being ordered for vaginal births | The Danish Clinical Registries | D
Deliveries | Exclusive breastfeeding at discharge | WHO | D
Deliveries | Blood transfusion during intended or realized vaginal birth | The Danish Clinical Registries | B
Deliveries | Acute caesarean section | |
Deliveries | Obstetric trauma (with instrumentation) | OECD | B
Deliveries | Obstetric trauma (without instrumentation) | OECD | B
Deliveries | * Mother complication (unplanned C-section, blood transfusion, laceration, and instrumentation) | The Danish Clinical Registries | B
Deliveries | * Adverse birth outcome (child) | The Danish Clinical Registries | B
Deliveries | * Birth with complications | The Danish Clinical Registries | B
Hip Fracture | Prophylactic antibiotic treatment given within 1 hour prior to surgical incision | RAND | A
Hip Fracture | Prophylactic thromboembolic treatment received on the same day as admission (within 24 hours, or on the same date when one or more times are not provided) | RAND | A
Hip Fracture | Early mobilization (within 24 hours, or before the next day when one or more times are not provided) | The Danish Clinical Registries | B
Hip Fracture | In-hospital surgical waiting time < 48 hours (or 1 day when one or more times are not provided) | OECD | C
Hip Fracture | * Percentage of recommended care per case (indicators 1a, 2a, 3a, 4 = YES) | |
Stroke | Admitted to a specialized stroke unit within 1 day after admission | The Danish Clinical Registries | A
Stroke | Platelet inhibitor treatment within 2 days after admission | The Danish Clinical Registries | A
Stroke | Diagnostic examination using CT or MRI scan within the first 24 hours / same day after admission | The Danish Clinical Registries | D
Stroke | Mobilized within 48 hours (or 2 days when times are missing) after admission | The Danish Clinical Registries | C/D
Stroke | * Appropriate stroke management (all three of 2a, 3a and 4bi applied) | |
The clinical indicators were retrieved through medical record abstraction using a standardized data collection sheet and a manual, translated into the local languages. The review of the medical charts was organized through the hospital coordinators and performed by hospital staff with clinical background knowledge and insight into local clinical and documentation practice.
Testing and translation of the measures:
A cognitive test to assess the understanding of the questions was conducted for the professional and patient questionnaires. Instruments were piloted in different settings to assess their feasibility and reliability, and adaptations were made where required. All questionnaires were originally designed in English and underwent forward-backward translation to ensure linguistic equivalence. All final questionnaires used are available on the DUQuE website: www.duque.eu. The remaining measures (chart reviews, audit forms and routine data specifications) are available upon request.
Measures were grouped in a variety of questionnaires to professionals (management and frontline), a patient survey, clinical records review, administrative data and on-site visits. We assessed clinical effectiveness, perceived patient safety and patient experience in relation to four conditions: acute myocardial infarction (AMI), deliveries, hip fracture and stroke.
Field test process
The field test process involved a preparation phase for the identification of respondents and sampling, training sessions, distribution of material and access codes, data collection, follow-up, data gathering and cleaning. Country and hospital coordinators played a crucial role in this phase. Non-in-depth hospitals (N=115) were scheduled to retrieve 25 questionnaires each. The in-depth hospitals (N=74) provided the same information and additionally retrieved complete data at pathway level for the four studied conditions (a total of 372 questionnaires per hospital), including clinical record review, patient surveys and data from audit visits (one and a half days). At each hospital, specific sampling approaches were applied for questionnaires that required more than one respondent. All hospital coordinators in each participating country received specific training on their responsibilities and tasks, and external surveyors were also trained to fulfil their responsibilities. An IT platform was created for professional questionnaires and external visits to facilitate data collection and its submission to the project coordination. The patient surveys were distributed in paper version, filled in anonymously, and centrally entered into the DUQuE database by the project coordination team.
Data collection and quality control
The data collection in hospitals lasted 10 months, starting in May 2011 and being completed by February 2012. Once the data collection period was finalized, all data underwent centralized cross-checking by the central project coordination team (CPCT) to ensure that hospital and respondent codes matched per country. Data, especially chart review files, were also quality controlled for discrepancies and errors.
3.2 Participation and hospitals characteristics
A total of 8 countries participated in the data collection process of the project. [Table 3]
Table 3. Hospitals accepting to participate
Country | Czech Republic | England | France | Germany | Poland | Portugal | Spain | Turkey | Total
Total approached | 39 | 48 | 51 | 194 | 77 | 31 | 66 | 42 | 548
Total accepting | 30 | 4 | 25 | 13 | 30 | 30 | 30 | 30 | 192
Non in-depth | 18 | 4 | 14 | 9 | 18 | 19 | 18 | 18 | 118
In-depth hospitals | 12 | 0 | 11 | 4 | 12 | 11 | 12 | 12 | 74
The total number of hospitals approached was 548. Hospital participation rates varied between countries from 7% to 97%. The reasons for non-participation reported by the countries with the lowest acceptance rates related mainly to research fatigue, burnout with regard to quality management issues, time constraints, and competing demands regarding efficiency and productivity.
Ultimately, data from 188 hospitals and 294 departments in seven participating countries were analysed (Table 4). Due to extensive difficulties in recruiting hospitals and delays in obtaining ethical approval, data from England were excluded from the analysis.
Table 4. Country and hospitals included in the analysis. Descriptive characteristics
Country | Czech Republic | France | Germany | Poland | Portugal | Spain | Turkey | Total
Total final participants | 30 | 25 | 13 | 30 | 30 | 30 | 30 | 188
Non in-depth | 18 | 14 | 9 | 18 | 19 | 18 | 18 | 114
In-depth hospitals | 12 | 11 | 4 | 12 | 11 | 12 | 12 | 74
Total final departments | 47 | 44 | 16 | 48 | 43 | 48 | 48 | 294
- Acute myocardial infarction | 11 | 11 | 4 | 12 | 11 | 12 | 12 | 73
- Stroke | 12 | 11 | 4 | 12 | 11 | 12 | 12 | 74
- Hip fracture | 12 | 11 | 4 | 12 | 11 | 12 | 12 | 74
- Deliveries | 12 | 11 | 4 | 12 | 10 | 12 | 12 | 73
Hospital beds: <200 | 0 | 0 | 0 | 1 | 0 | 8 | 9 | 18
Hospital beds: 200-500 | 13 | 7 | 6 | 8 | 17 | 11 | 17 | 79
Hospital beds: 501-1000 | 9 | 9 | 6 | 21 | 9 | 6 | 2 | 62
Hospital beds: >1000 | 8 | 9 | 1 | 0 | 4 | 5 | 2 | 29
Ownership: Public | 18 | 24 | 6 | 27 | 30 | 25 | 26 | 156
Ownership: Private not-for-profit | 12 | 1 | 5 | 0 | 0 | 2 | 0 | 20
Ownership: Private for-profit | 0 | 0 | 2 | 3 | 0 | 3 | 4 | 12
Teaching hospital: No | 22 | 15 | 13 | 0 | 30 | 3 | 24 | 107
Teaching hospital: Yes | 8 | 10 | 0 | 30 | 0 | 27 | 6 | 81
A total of 25,812 measures were analysed. The response rates for the different measures ranged from 74% to 99%.
Type of measure | Participants | Total expected | Total collected | Response rate (%)
Professional questionnaires | All hospitals (in-depth and non-in-depth) | 10,916 | 9,712 | 89
Patient questionnaires | In-depth hospitals | 8,880 | 6,536 | 74
Clinical chart reviews | In-depth hospitals | 10,360 | 9,021 | 87
External visits (departments and overall hospitals) | In-depth hospitals | 370 | 366 | 99
Administrative routine data | All hospitals (in-depth and non-in-depth) | 188 | 177 | 94
3.3 Main results
We describe here the main results of the project for each specific objective:
Objective 1. To develop and validate an index to assess the implementation of quality management systems (QMS) across European hospitals
The assessment of integral quality management (QM) in a hospital calls for detailed measurement and monitoring from different perspectives and at various levels of care delivery. Within the DUQuE project we developed and validated 7 measures of QM: 3 at hospital level and 4 at department level. The reason for developing different quality management measures is that, according to our theoretical model, they cover different, complementary aspects. The measures included:
At hospital level
- Quality Management Systems Index (QMSI) — an overall measure of the extent of implementation of quality management systems, with subscales on quality policy documents, quality monitoring by the board, training of professionals, formal protocols for infection control, formal protocols for medication and patient handling, analysing performance of care processes, analysing performance of professionals, analysing feedback and patient experiences, and evaluating results.
- Quality Management Compliance Index (QMCI) developed from the perspective of how the hospital management oversees hospital quality program initiatives
- Clinical Quality Implementation Index (CQII) measuring the spread of quality efforts and continuous improvement in clinical areas.
At department level
- Specialized expertise and responsibility (SER) — covering how clinical responsibilities are assigned for each of the four conditions
- Evidence-based organization of pathways (EBOP) — measuring if department organisation processes for admission, acute care, rehabilitation (if appropriate) and discharge reflect evidence-based care
- Patient safety strategies (PSS) — based on patient safety recommendations of international agencies
- Clinical review (CR) — evaluating if audit and systematic monitoring are embedded in departmental quality management mechanisms
At hospital level:
I. Quality Management Systems Index (QMSI) — an overall measure to assess the quality management systems.
Development of instrument:
Questionnaire development was informed by expert opinion, literature review and earlier empirical research. The questionnaire focuses on the managerial aspects of quality management. Based on the literature review and our own earlier experience, we expected that a quality management system can be functionally described through themes such as quality policy documents, quality monitoring, professional training, evidence-based guidelines, internal quality methods, and evaluating results. Two methods were applied to develop the items for each of the theoretical themes. First, the expert opinions of other DUQuE project members were considered (n=10). Secondly, items were selected based on the theoretical framework of the EFQM model (European Foundation for Quality Management), accreditation manuals, and the management literature. Respondents could rate each item on a four-point Likert-type scale, with answer categories ranging from 'Not available' to 'Fully implemented' and from 'Disagree' to 'Agree'. The quality managers of the hospitals were the main respondents. We used psychometric methods to explore the factor structure, reliability and validity of the instrument.
Results: Structure, reliability and validity
Of the 188 hospitals, 183 quality managers returned completed questionnaires. The content validity of the final questionnaire used in the DUQuE project was reviewed and judged complete by 10 experts from different quality research areas involved in the project who had not been involved in developing the QMSI.
The factor analysis yielded nine scales: quality policy documents, quality monitoring by the board, training of professionals, formal protocols for infection control, formal protocols for medication and patient handling, analysing performance of care processes, analysing performance of professionals, analysing feedback and patient experiences, and evaluating results; these were combined to build the Quality Management Systems Index. Cronbach's reliability coefficients were satisfactory for eight scales (ranging from 0.72 to 0.82) and low for one scale (0.48). Corrected item-total correlations generally provided adequate evidence of factor homogeneity. Inter-scale correlations showed that every factor was related to, but also distinct from, the others and added to the index. Construct validity testing showed that the index was related to recent measures of quality. Participating hospitals attained a mean value of 19.7 (standard deviation 4.7) on the index, which theoretically ranges from 0 to 27.
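The scale reliability statistics reported above (Cronbach's alpha and corrected item-total correlations) can be reproduced with a few lines of code. The following is a minimal illustrative sketch, not the project's actual analysis script; the data frame layout and the fabricated example responses are assumptions.

```python
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for one scale; rows = hospitals, columns = items."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

def corrected_item_total(items: pd.DataFrame) -> pd.Series:
    """Correlation of each item with the sum of the remaining items."""
    return pd.Series(
        {col: items[col].corr(items.drop(columns=col).sum(axis=1)) for col in items},
        name="corrected_item_total",
    )

# Fabricated example: 183 'hospitals' answering one 4-item scale
rng = np.random.default_rng(0)
latent = rng.normal(size=183)
scale = pd.DataFrame(
    {f"item_{i}": latent + rng.normal(scale=0.8, size=183) for i in range(1, 5)}
)
print(round(cronbach_alpha(scale), 2))
print(corrected_item_total(scale).round(2))
```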
II. Quality Management Compliance Index (QMCI) — developed from the perspective of hospital management overseeing hospital quality program initiatives
Development of the QMCI: The aim of the QMCI was to identify and externally verify compliance with a set of interacting activities, methods and procedures used to plan, monitor and improve the quality of care. All items of the QMCI (N=15) were rated on a five-point Likert scale, varying from 'no or negligible compliance' (0) to 'full compliance' (4). The choice of questionnaire items was based on expert opinion, drawing on years of experience in hospital performance evaluation during accreditation and certification audits. The main criteria for including an item were its assumed influence on quality and safety of care and the possibility of verifying the answer.
Results QMCI: Across the seven countries, 74/74 hospitals participated in this in-depth part of the DUQuE project. The items from the visit had good face validity, as tested by experts of the DUQuE project. Factor analysis excluded 3 items that did not load on any of the factors, and a final set of 15 items was accepted. The QMCI items grouped into four dimensions: Quality Planning (1 item), Monitoring of Patient/Professional Opinions (6 items), Monitoring Quality Systems (4 items), and Improving Quality by Staff Development (4 items).
The factors of the QMCI yielded acceptable results, with internal consistency (Cronbach's alpha) ranging from 0.74 to 0.78. None of the corrected item-total correlations were below 0.4, except for one item in the Improving Quality by Staff Development subscale, indicating that all items contribute to the distinction between high and low scores on the factor. The inter-scale correlations had a maximum of 0.52, which is below the threshold of 0.70. This indicates that the QMCI is indeed a multidimensional construct, with subscales addressing independent aspects of quality management. All subscales had notable correlations with the overall index, meaning that they contribute to the QMCI. The QMCI has a final scale range of 0-16.
III. Clinical Quality Implementation Index (CQII) — measuring the spread of quality efforts and continuous improvement in clinical areas.
Development of the instrument: The purpose of the CQII is to test clinical quality systems and seek evidence of their implementation at hospital level. The CQII was designed to measure to what extent efforts regarding key clinical quality areas are implemented across the hospital. Following Bate and Mendel (2008) [3], each quality effort was assessed regarding three levels of development: 1) do quality efforts regarding the key areas exist (i.e. is there a responsible group and a hospital protocol)?; 2) to what extent are these efforts monitored (i.e. with regard to compliance and improvement measurements)?; and 3) to what extent is the sustainability of these efforts monitored?
The key clinical areas to be analysed stem from the different quality functions described in accreditation systems, as well as the recommendations of the WHO World Alliance for Patient Safety, which cover most of the key hospital clinical and safety areas (infection, medication management, surgical safety, falls, etc.). In total, seven areas were selected: 1) preventing hospital infection, 2) medication management, 3) preventing patient falls, 4) preventing pressure ulcers, 5) routine assessment and diagnostic testing of patients in elective surgery, 6) safe surgery including an approved checklist, and 7) preventing deterioration and advanced life support (i.e. rapid response teams, resuscitation programmes). The audit instrument comprised items from these 7 clinical areas, which could be rated on a 5-point Likert scale varying from 'No compliance' to 'Full compliance'; additionally, 'not applicable' could be selected when no information was available. These items were then re-coded to a scale of 1-3, where responses of no, negligible or low compliance were coded as 1, medium compliance was coded as 2, and high, extensive or full compliance was coded as 3.
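The re-coding rule described above can be expressed compactly. The sketch below is illustrative only; the exact text labels used on the DUQuE audit forms are assumptions.

```python
# Mapping from the 5-point audit rating to the 1-3 analysis scale
RECODE = {
    "no compliance": 1, "negligible compliance": 1, "low compliance": 1,
    "medium compliance": 2,
    "high compliance": 3, "extensive compliance": 3, "full compliance": 3,
}

def recode_cqii(rating: str) -> int | None:
    """Re-code one audit rating; 'not applicable' is kept out of the score."""
    rating = rating.strip().lower()
    return None if rating == "not applicable" else RECODE[rating]

assert recode_cqii("Medium compliance") == 2
assert recode_cqii("not applicable") is None
```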
Results CQII: Across the seven countries, 74/74 hospitals participated in this in-depth part of the DUQuE project. The CQII was intended to assess three levels of implementation: existence of a protocol, monitoring of compliance, and sustainability through measuring and using indicators to keep an improvement focus. Factor analysis revealed, however, that the items did not group into these dimensions. Instead, the factors were grouped according to the different clinical areas: Preventing Hospital Infection, Medication Management, Preventing Patient Falls, Preventing Pressure Ulcers, Routine Testing of Elective Surgery Patients, Safe Surgery Practices, and Preventing Deterioration. This indicates that the levels of development of clinical implementation are not consistent across different clinical areas; rather, the levels of development coexist and reflect the implementation of a given area. We therefore used the items to describe clinical implementation as a single score for each area. The resulting seven-factor structure showed high factor loadings, Cronbach's alphas ranging from 0.82 to 0.93, and adequate corrected item-total correlations. The inter-scale correlations had a maximum of 0.59, which is below the threshold of 0.70, indicating that the CQII is a multidimensional construct. The CQII has a final scale range of 0-14.
IV. Complementarities of the three quality management indices at hospital level. Inter-index correlations at hospital level.
The inter-index correlation between the QMCI and the CQII is 0.565, in line with our expectation that the two are distinct but related constructs. All scales add sufficiently to the overall index. The results suggest two reliable instruments that can be used during on-site visits to assess quality management activities at hospital level in Europe: the QMCI and CQII appear reliable and valid for assessing, respectively, compliance with existing procedures at hospital level and the extent of continuous improvement of clinical quality (having a responsible group for the clinical area, a formally approved protocol and performance indicators). The two indices have the potential to be used in routine practice, in external or internal audits, to help hospitals focus on quality and safety issues and follow the Plan-Do-Check/Study-Act improvement cycle. The QMCI focuses on the core elements of a quality system, while the CQII focuses on clinical areas directly related to patient care at ward level. Associational analysis between the three hospital-level QM measures showed significant positive associations between all three: hospitals with a higher score on, for example, the QMSI also had higher scores on the QMCI and CQII. Thus, our QM measures were related to each other but sufficiently distinct to add to an overall concept of QM in multi-level, complex health care organizations.
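Computing these inter-index correlations amounts to a pairwise correlation matrix over the hospital-level index scores. A minimal sketch follows; the column names and placeholder values are hypothetical, not study data.

```python
import pandas as pd

# One row per hospital; placeholder values, not study data
indices = pd.DataFrame({
    "QMSI": [19, 22, 15, 25, 18],
    "QMCI": [10, 12, 8, 14, 9],
    "CQII": [9, 11, 7, 13, 10],
})
# Symmetric inter-index correlation matrix (Pearson)
print(indices.corr(method="pearson"))
```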
Conclusions:
We set out to develop and validate indices to measure quality management systems in European hospitals. We found that our indices are reliable and valid for the assessment of quality management systems in European hospitals. The answers to the questionnaire and audit items could be summarized in a set of indices to express the extent of implementation of quality management activities, such as quality policies, methods for continuous improvement, and procedures for patient complaint handling or staff education. These could be assessed across seven European countries.
At departmental level:
Development of the instruments: At department level we developed and tested a checklist for the assessment of quality management activities using on-site visits by trained external surveyors. The purpose was to define a checklist that can be used to assess the performance of a department and evaluate the implementation of quality management activities across departments or pathways in acute care hospitals.
Four measures of quality management activities at care pathway level were developed, focusing on specialized expertise and responsibility (SER), Evidence-based organization of pathways (EBOP), Patient safety strategies (PSS) and Clinical review.
• Specialized expertise and responsibility (SER) covering how clinical responsibilities are assigned for that condition
• Evidence-based organization of pathways (EBOP) measuring if department organization processes (admission, acute care, rehabilitation (if appropriate), and discharge) are organized to facilitate evidence based care recommendations
• Patient safety strategies (PSS) based on patients’ safety recommendations of international agencies
• Clinical review (CR) evaluating if audit and systematic monitoring are embedded in departmental quality management mechanisms
Three (SER, PSS and CR) of the four quality management measures are identical for the four conditions. Evidence-based organization of pathways (EBOP) has the same structure for each department but the content follows the evidence recommendation for each type of condition.
Results: For this purpose, a sample of 292 hospital departments from 74 acute care hospitals across seven European countries was analysed. In every hospital, four departments participated, covering AMI, stroke, hip fracture and deliveries. Participating departments attained mean values on the various scales between 1.2 and 3.7 (theoretical range 0-4). The highest scores on the four scales were found for deliveries. In general, scores on evidence-based organization of pathways were higher than those for clinical review; this pattern was consistent across the four types of departments.
Complementarity of the four quality management indices at departmental level. Inter-index correlations at departmental level.
Correlations among the four QM measures showed that every factor was related to, but also distinct from, the others and added to the overall picture of QM at pathway level. Thus, the newly developed checklist can be used across various types of departments and pathways, such as AMI, deliveries, stroke and hip fracture.
Associations between quality management measures at hospital and pathway/departmental levels
One key question that emerged from this study was whether there was any association between the development of quality management at hospital level and at department level.
Design: We conducted a multilevel linear regression analysis of the association between three measures of quality management at hospital level and four scales of quality management activities at department level, while controlling for covariates, such as hospital ownership, teaching status, number of beds, number of board members, organizational culture, and country clustering of hospitals.
Results: The relationships between hospital-level and department-level QM measures were mixed, with the QMCI showing the most associations with department-level QM measures across all four types of departments. In general, associations between hospital and department quality measures were small, and the beta coefficients were small for all investigated relationships.
By using the seven measures of QM it is possible to get a more comprehensive picture of the maturity of quality management in hospitals, with regard to the different levels and across various types of hospital departments.
Objective 2. To investigate associations between the maturity of quality management systems and measures of organizational culture, professional engagement and patient involvement in quality management.
To investigate associations between the maturity of quality management systems and measures of organizational culture, we used two measures of organizational culture: the Competing Values Framework and social capital. Design and results are presented below.
Ia. Associations between the maturity of QM systems and Organizational culture: Competing Values Framework (CVF)
Design/Methods: We explored the role of organizational culture in influencing the implementation of quality management systems in European hospitals. The hypothesis we tested was that the implementation of quality management systems would be higher in hospitals whose cultures are more focused on clients and the external environment.
Hospital organizational culture was measured using the Competing Values Framework (CVF). The CVF uses two main dimensions: the first describes how internal processes are structured within the hospital, and the second describes the orientation of the hospital to the outside world. This gives rise to four distinct organizational culture 'types': clan culture, developmental culture (also known as open culture), hierarchical culture and rational culture. The CVF questionnaire offers respondents a series of descriptions of a hospital, arranged in five aspects of organizational culture with four answer categories representing the different culture types. Within each group of four descriptions, the respondent is asked to 'distribute 100 points' between them 'according to which description best fits your current organization'. The five groups represent descriptions of hospital characteristics, leadership, emphasis, cohesion and rewards. Collating these points allocations provides a score (in the range 0-100) for each individual on each of the four culture types. Since the CVF is valid with three or more respondents, the CVF items were incorporated into the questionnaires of the chair of the board of trustees, the chief executive officer, the chief medical officer and the highest-ranking nurse (the four top-level managers) to assess organizational culture in their hospital. To determine the culture types per hospital, the average of three respondents was calculated; the answer of the chair of the board of trustees was used only if one of the other respondents had not answered the questions.
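The hospital-level CVF scoring rule described above (average the points allocations of the three core respondents, fall back to the board chair when one of them is missing, and take the highest-scoring type as dominant) could be implemented along these lines. This is an illustrative sketch; the respondent keys and data layout are assumptions, not the project's scoring code.

```python
import pandas as pd

CULTURE_TYPES = ["clan", "developmental", "hierarchical", "rational"]

def hospital_cvf(scores: pd.DataFrame) -> pd.Series:
    """Average the CVF points of the three core respondents for one hospital.

    scores: rows indexed by respondent role ('ceo', 'medical_director',
    'nurse_director', 'board_chair'); columns = the four culture types,
    with each respondent's 100 points already averaged over the five
    question groups (a simplification for this sketch).
    """
    core = scores.reindex(["ceo", "medical_director", "nurse_director"]).dropna()
    if len(core) < 3:
        # The board chair's answers are used only to replace a missing core respondent.
        core = pd.concat([core, scores.reindex(["board_chair"]).dropna()])
    return core[CULTURE_TYPES].mean()

# Hypothetical respondent scores for one hospital
example = pd.DataFrame(
    {"clan": [40, 35, 30], "developmental": [20, 25, 30],
     "hierarchical": [25, 20, 20], "rational": [15, 20, 20]},
    index=["ceo", "medical_director", "nurse_director"],
)
scores = hospital_cvf(example)
print(scores)           # mean points per culture type
print(scores.idxmax())  # dominant culture type, here 'clan'
```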
We used as outcomes the three hospital quality management system indices previously described: the Quality Management System Index (QMSI), the Quality Management Compliance Index (QMCI) and the Clinical Quality Implementation Index (CQII). For the multivariable adjusted analyses aimed at our core research objective, we ran separate multilevel linear regression models for each outcome, guided by a directed acyclic graph (DAG) depicting the hypothesized and background relationships.
Results and conclusions: Of the participating hospitals, 33% had a clan culture as their dominant culture type, 26% an open and developmental culture, 16% a hierarchical culture and 25% a rational culture. None of the culture types had a significant relationship with the three outcome measures: the associational analysis showed no relationship between CVF culture type and the implementation of quality management strategies as measured by the QMSI, QMCI and CQII. It appears that, in our study, no single dominant organizational culture type is more positively associated with quality management in European hospitals than the others.
This does not mean that organizational culture is unimportant for quality management, as the values and beliefs that underpin such activity will shape how programmes are implemented and evaluated. The finding that no single culture type stood out suggests that there is no one best culture to support quality improvement and that hospitals can develop a range of supportive cultures depending on local contexts and contingencies. We did find a positive relationship between organizational structures and different types of quality management, with the implication that hospitals need to develop appropriate organizational structures to support the types of quality management they intend to implement.
Ib. Associations between the maturity of QM systems and organizational culture: Social Capital (SC)
Design and methods: Successful cooperation and coordination within groups depends crucially on their social capital, defined here as a common set of shared values and relationships of mutual trust among hospital management board members. We therefore hypothesized that the degree of social capital within the hospital management board is associated with the effectiveness and maturity of the quality management system in European hospitals. The exposure variable, social capital, was measured with a six-item scale incorporated into the chief executive officer (CEO) questionnaire; CEOs were asked about their perceptions of social capital within the hospital management board. The outcome for the quality management system at hospital level was the Quality Management System Index (QMSI) described previously.
To test the hypothesis, we conducted a multilevel linear regression analysis, controlling for hospital ownership, number of beds, organizational culture and the number of hospital board members. The country where the hospital is based was treated as a confounder at the country level.
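As an illustration of this kind of model, the following sketch fits a linear mixed model with a country-level random intercept using statsmodels. The variable and file names are assumptions for illustration, not the project's dataset or analysis code.

```python
import pandas as pd
import statsmodels.formula.api as smf

# One row per hospital: QMSI outcome, social capital exposure, covariates,
# and the country used as the grouping (random-intercept) factor.
df = pd.read_csv("hospital_level_data.csv")  # hypothetical file

model = smf.mixedlm(
    "qmsi ~ social_capital + beds + ownership + teaching + board_members",
    data=df,
    groups=df["country"],  # country-level clustering
)
result = model.fit()
print(result.summary())
```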
Results and conclusions: The average social capital score within a hospital was 3.3 (standard deviation 0.6; range 1-4). Higher social capital was associated with higher quality management system index scores. The results suggest that a higher degree of social capital exists in hospitals that exhibit higher maturity of their quality management systems. Although uncontrolled confounding and reverse causation cannot be completely ruled out, these new findings, along with the results of previous research, could have important implications for the work of hospital managers and for the design and evaluation of hospital quality management systems.
II. To investigate associations between the maturity of quality management systems and professional engagement in quality management
Design and methods: To measure the level of clinical management by leading physicians and nurses in European hospitals, we developed a scale of professional involvement in management. The 21-item questionnaire, with a four-point scale, was developed on the basis of a previous study and expert opinion. The maturity of the quality management system was measured by the Quality Management System Index (QMSI).
Results and conclusions: The sample consisted of 3,386 leading physicians and nurses working in 188 hospitals from the 7 participating countries. Psychometric analysis yielded four subscales for leading physicians: (i) administration and budgeting, (ii) managing medical practice, (iii) strategic management, and (iv) managing nursing practice. Only the first three factors applied well to the leading nurses. Cronbach's alpha for internal consistency ranged from 0.74 to 0.86 for the physicians and from 0.61 to 0.81 for the nurses. Except for the 0.74 correlation between "Administration and budgeting" and "Managing medical practice" among physicians, all inter-scale correlations were below 0.70 (range 0.43-0.61).
Less than 10% of our respondent sample (9.8% of physicians and 5.3% of nurses) held no formal management role, more than half (51.7% of physicians and 54.6% of nurses) held a formal management role at the department level, and the rest held formal management roles at the hospital level only (5.3% of physicians and 9.6% of nurses) or at both the hospital and department levels (17.5% of physicians, 15.3% of nurses).
The average scores on the subscales for physicians ranged from 1.7/4 for "Strategic management" to 2.8/4 for "Managing medical practice". The average scores were lower for nurses, ranging from 1.5/4 for "Strategic management" to 1.9/4 for "Managing medical practice". Few respondents acted as shared or final decision makers in these hospital management areas, although there was variation at the item level. For instance, the majority of leading physicians were shared or final decision makers when it came to the organization of medical education, the content of medical protocols, and the introduction of new multidisciplinary consultations.
Overall, few leading physicians and nurses reported being fully involved in the management of their hospital, indicating mostly modest levels of clinical management; leading physicians reported being shared decision makers in some specific hospital decision-making areas. Leading physicians displayed higher levels of professional involvement than leading nurses.
It seems that European hospitals have designed organisational structures to incorporate clinicians in hospital management, and European doctors and nurses fulfil some managerial roles, but they bear little actual decision-making responsibility in hospital management. In our study, drawing doctors and nurses into hospital management was not positively associated with the level of implementation of quality management systems.
III. Associations between involvement of patients or their representatives in quality management functions and patient-centred care strategies
Design and methods: To describe the involvement of patients or their representatives in quality management functions at hospital and department level, we used a five-item scale validated and used in previous research [12, 13]. The items assess whether patients are involved in a) the development of quality criteria/standards/protocols, b) the design/organization of processes, c) quality committees, d) quality improvement projects, and e) the discussion of results of quality improvement projects. Each item is scored on a 4-point scale with the categories never, sometimes, usually and always. As outcome measures, we used a scale reflecting patient-centred care strategies at pathway level, based on items collected via organisational audit: (1) existence of a formal survey seeking the views of patients and carers, (2) written policies on patients' rights, (3) patient information literature, including guidelines, and (4) fact sheets for post-discharge care. Each item was scored on a five-point scale (no or negligible compliance, low compliance, medium compliance, high/extensive compliance, full compliance).
Results and conclusions: Current levels of involving patients and their representatives in quality management functions in European hospitals are low at hospital level (mean score 1.6 on a scale of 0 to 5, standard deviation (SD) 0.7) but even lower at departmental level (mean 0.6 SD 0.7). We did not detect associations between levels of involving patients and their representatives in quality management functions and the implementation of patient-centred care strategies; however, the smallest hospitals were more likely to have implemented patient-centred care strategies. There is insufficient evidence that involving patients and their representatives in quality management leads to establishing or implementing strategies and procedures that facilitate patient-centred care.
Objective 3. To investigate associations between the maturity of quality management systems and patient level measures of clinical effectiveness, patient safety and patient experience.
I. Associations between the maturity of QMS and PLM’s of clinical effectiveness
Design: We assessed the association of hospital quality management systems and departmental quality management strategies with a range of key clinical process measures in four clinical conditions (AMI, stroke, hip fracture and deliveries). We developed algorithms to measure the clinical indicators selected for the four conditions, based on a literature review of their high level of evidence of impact on patient outcomes and after a consultation process with European scientific societies. In addition, a set of 5 aggregate measures was developed (2 for AMI and 1 for each of the other 3 conditions; a sketch of how one such composite can be computed follows the list below):
- Therapy given on time in AMI care
- All appropriate medications given in AMI at discharge, or contraindicated (including antiplatelet drugs, beta-blockers, statins and ACE inhibitors prescribed at discharge; contraindications include patients with coagulation disorders and other conditions accepted in guidelines)
- Adverse birth outcome (including maternal complications: blood transfusion, acute C-section, instrumentation needed during vaginal delivery, and 3rd or 4th degree laceration; and child complication: Apgar score less than 7 at 5 minutes)
- % of recommended care given to patients with hip fracture (including antibiotic treatment within 1 hour prior to surgery, thromboembolic treatment on the same day as surgery, early mobilization (<24 hours post-surgery), and in-hospital waiting time for surgery of less than 48 hours)
- Appropriate management of care in stroke patients (including treatment with an antiplatelet inhibitor within 48 hours, CT or MRI performed within 24 hours, and mobilization within 48 hours)
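To make the aggregation rule concrete, the following minimal Python sketch shows one way such a composite could be computed from abstracted chart data. It is an illustration only: the field names (e.g. antibiotic_within_1h_before_surgery) and the all-or-none scoring rule are assumptions, not the exact DUQuE algorithm.

# Illustrative sketch (not the DUQuE algorithm): a composite "recommended care"
# indicator for hip fracture computed from abstracted chart data.
from typing import List, Dict, Optional

PROCESSES = [
    "antibiotic_within_1h_before_surgery",
    "thromboembolic_prophylaxis_day_of_surgery",
    "mobilised_within_24h_post_surgery",
    "surgery_within_48h_of_admission",
]

def record_meets_all(record: Dict[str, Optional[bool]]) -> Optional[bool]:
    """True if every applicable process was delivered; None if no process applies."""
    applicable = [record[p] for p in PROCESSES if record.get(p) is not None]
    if not applicable:
        return None
    return all(applicable)

def composite_rate(records: List[Dict[str, Optional[bool]]]) -> Optional[float]:
    """Share of records receiving all applicable recommended-care processes."""
    scored = [s for s in (record_meets_all(r) for r in records) if s is not None]
    return sum(scored) / len(scored) if scored else None

# Example with two abstracted records (invented data)
charts = [
    {"antibiotic_within_1h_before_surgery": True,
     "thromboembolic_prophylaxis_day_of_surgery": True,
     "mobilised_within_24h_post_surgery": True,
     "surgery_within_48h_of_admission": True},
    {"antibiotic_within_1h_before_surgery": True,
     "thromboembolic_prophylaxis_day_of_surgery": False,
     "mobilised_within_24h_post_surgery": None,
     "surgery_within_48h_of_admission": True},
]
print(composite_rate(charts))  # 0.5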
Data was collected from standardized medical record review in 73 in-depth hospitals between May 2011 and January 2012. Independent variables include the three quality management indices at hospital level, and four quality management indices at department level (see objective 1).
Results and conclusions: 276 departments from 73 hospitals provided data for this study, and a total of 9,021 clinical records were analysed. Descriptive results for the aggregate clinical indicators showed positive compliance levels ranging from around 75% (therapy given on time in AMI and birth without complications) to around 50% for appropriate management of stroke and under 50% for therapy given on time in AMI, with recommended care in hip fracture higher than 75%. Country ranges of positive responses varied from 25 percentage points for appropriate medications prescribed at discharge (or contraindicated) in AMI to 47 percentage points for appropriate management of stroke. Variations between countries were as high as variations within countries.
At hospital level, positive associations between hospital quality management system measures and clinical indicators were found in only 4 of the 23 indicators analysed, with modest odds ratios. At department level, we found significant associations between departmental quality strategies and clinical indicators in more than 50% of the indicators (12/24). In the medical conditions we found positive associations in more than 75% of the indicators (6/7 in AMI and 4/6 in stroke), and negative associations in 1/5 in deliveries and 1/5 in hip fracture. Associations were stronger for indicators based on the highest level of scientific evidence (10 of 13 indicators).
We found limited and weak associations between hospital-level quality management development and clinical indicators of the four conditions, but associations between departmental quality management and clinical indicators were strong and statistically significant for AMI and stroke. These findings suggest that quality efforts taken closer to the level of clinical decision making could have a greater effect on patient outcomes. Departmental quality programmes share a set of common areas: how responsibilities are allocated, how organizations facilitate guideline implementation, and how patient safety strategies and clinical review are implemented. These are promising findings that could provide a common framework for developing departmental quality efforts.
II. Associations between the maturity of QMS and patient safety culture and procedures
Design and Methods: We aimed to describe and compare differences between clinical leaders' and frontline clinicians' perceptions of teamwork and safety climate, and to investigate quality management systems as predictors of teamwork and safety climate. We also described the extent to which generic patient safety procedures are in place in clinical departments. For this purpose we studied 73 acute care hospitals with a total of 291 departments managing AMI, hip fracture, stroke and obstetric deliveries. We obtained data from 3,622 clinical leaders and 4,903 frontline clinicians on their perceptions of teamwork and safety climate, and data on the implementation of quality management systems from 188 hospitals; 74 hospitals (291 departments) were further surveyed on the safety procedures in place. Our data analyses included descriptive statistics and multilevel regression models.
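As an illustration of the kind of multilevel regression referred to above, the following Python sketch fits a mixed-effects model with clinicians nested in departments and hospitals using statsmodels. The variable names (safety_climate, cqii, respondent_role, hospital_id, department_id) and the synthetic data are assumptions for illustration only, not the project's actual analysis code.

# Minimal multilevel-model sketch: random intercepts for hospitals, a variance
# component for departments within hospitals, and fixed effects for quality
# management implementation and respondent role. Data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "hospital_id": rng.integers(0, 20, n),
    "department_id": rng.integers(0, 4, n),
    "respondent_role": rng.choice(["leader", "frontline"], n),
    "cqii": rng.uniform(0, 1, n),
})
# Simulated outcome: safety climate loosely increasing with CQII
df["safety_climate"] = 3 + 0.5 * df["cqii"] + rng.normal(0, 1, n)

model = smf.mixedlm(
    "safety_climate ~ cqii + C(respondent_role)",
    data=df,
    groups=df["hospital_id"],
    vc_formula={"department": "0 + C(department_id)"},
)
result = model.fit()
print(result.summary())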
Results and Conclusions: We found that more clinical leaders reported positive perceptions of teamwork and safety climate than frontline clinicians. Overall, more physicians had positive perceptions of teamwork and safety climate than nurses did. Implementation of quality management systems was generally positively related to both teamwork and safety climate. In particular, the implementation of key clinical quality activities at department level (as measured by the CQII) showed a statistically significant positive association with safety climate as perceived by frontline clinicians.
Patient safety practices were heterogeneously implemented, with differences observed by type of practice, type of department and country. There were no significant differences in bracelet identification between the four specialty departments, but wide differences between countries. Few departments had removed potassium chloride from general ward stocks. We did not detect significant differences in the implementation of patient safety practices between departments, nor was implementation associated with teaching status.
III. Associations between the maturity of QMS and patient experience
Design/Methods: Patient-reported experience measures (PREMs) are increasingly being used to routinely monitor the quality of care. With the growing attention to PREMs, hospital managers seek ways to systematically improve patient experience across hospital departments, in particular where PREM outcomes are used for public reporting or reimbursement. However, it is currently unclear whether hospitals with more mature quality management systems and stronger patient involvement in quality management perform better on PREMs. We assessed the effect of hospital quality management systems and departmental strategies for patient involvement and patient information on a range of PREMs. The analysis was performed in the 74 in-depth hospitals; each hospital contributed patient-level data for four conditions/pathways (acute myocardial infarction, stroke, hip fracture and deliveries) collected between May 2011 and January 2012. The outcome variables were a set of PREMs collected at patient level, including a generic 6-item measure of patient experience (NORPEQ), a 3-item measure of patient-perceived discharge preparation (Health Care Transition Measure) and single-item measures of perceived involvement in care and hospital recommendation. Explanatory variables included the measure of the maturity of the hospital quality management system (QMSI) already described, as assessed by a questionnaire administered to the hospital quality manager, and two measures of departmental strategies for patient involvement in quality functions: Patient Involvement in Quality Management at Pathway Level, and Patient Information at Pathway Level.
Results and conclusions: Overall, 6,536 patients from 74 hospitals (276 clinical departments) contributed data to this study (acute myocardial infarction n=1,379, hip fracture n=1,503, deliveries n=2,088, stroke n=1,566). Patients admitted for hip fracture had the lowest scores across all four PREMs. Patients admitted after acute myocardial infarction reported the highest scores for patient experience and hospital recommendation; women after delivery reported the highest scores for patient involvement and health care transition. The associational analysis found no statistically significant relationship between hospital-wide quality management strategies and patient involvement and information at departmental level. Departmental strategies for patient involvement were significantly associated with patient experience; however, the associations were not systematic across the four departments and were sometimes counterintuitive. Our findings suggest that a) strategies to institutionalize patient-centred care in quality functions are absent or vary widely, and b) the seemingly counterintuitive inverse associations could be capturing a scenario where hospitals with poorer quality management were beginning to improve their patient experience. The former suggests that patient-centred care is not yet sufficiently integrated in quality management, while the latter warrants a nuanced assessment of the motivation and impact of involving patients in the design and assessment of services.
Objective 4. To identify factors influencing the uptake of quality management activities by hospitals including external pressure as enforced by accreditation, certification or external assessment programmes.
I. Associations between ISO certification and health care accreditation and the maturity of quality strategies at department level and the results of clinical indicators
Design and Methods: We investigated the relationship between overall hospital ISO certification and healthcare accreditation, the maturity of quality strategies developed at department level, and the results on clinical indicators for the four tracer conditions. The study was performed in the 74 in-depth hospitals. As outcome measures we used the four quality strategy measures developed in objective 1 (Specialised expertise and responsibility (SER), Evidence-based organisation of pathways (EBOP), Patient safety strategies (PSI) and Clinical review), as well as the 5 composite indicators of appropriate clinical management of the four tracer conditions at patient level (see objective 3.I).
Results and conclusions: We included data from 73 acute care hospitals and 291 departments managing acute myocardial infarction, hip fracture, stroke and obstetric deliveries. The combination of accreditation and certification was a more powerful predictor of departmental organisation and clinical outcome than either assessment in isolation. Measures of clinical review, patient safety strategies and specialist expertise were higher in recognised hospitals, but neither assessment was associated with implementation of evidence-based organisation of clinical pathways. Deliveries and hip fracture were relatively unresponsive to external assessment, in contrast to stroke and AMI, in terms of both departmental organisation and clinical outcome. The combination of accreditation and certification was strongly associated with patients receiving appropriate therapy, and within the appropriate time frame. Accreditation and certification were associated with better clinical organisation and outcome, but the association varied between departments and between tracer conditions. Further analysis is needed to interpret the interrelationships of certification, accreditation and other external assessments; such analysis could be important for improving current accreditation processes.
II. Associations between external pressure as perceived by CEOs, quality priority at board level and quality management system maturity at hospital level
We aimed to test whether there was any association between external pressure as perceived by CEOs, quality priority at management board level and the maturity of quality management systems at hospital level.
Design and Methods: The study was performed in all hospitals in the sample. As outcome variable we used Perceived External Pressure (PEP), which reflects the level of influence of factors from outside the hospital on the hospital's quality management system, as perceived by the CEO. Since there was no validated scale to measure PEP, we asked experts from the participating countries to provide an overview of pressures applicable to hospitals in their country and identified 18 different types of external pressure. In a web questionnaire, CEOs indicated how much influence they perceived from each external pressure on their quality management (0=no influence; 1=moderate influence; 2=major influence). The composite measure of PEP was constructed as the sum score of the 18 external pressures and ranges from 0 to 36. To evaluate quality priority at management board level we used the frequency of quality discussions in hospital management board meetings; data were retrieved from a question asking Chief Executive Officers (CEOs) how often quality performance was on the board's agenda: 1) never, or during 2) a few, 3) most, or 4) every meeting. As predictors we used the three hospital-level quality management measures (Quality Management System Index (QMSI), Quality Management Compliance Index (QMCI) and Clinical Quality Implementation Index (CQII)).
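A minimal scoring sketch in Python illustrates how the 0 to 36 PEP composite described above can be computed from the 18 items; the item keys and the example responses are hypothetical.

# Sum score of 18 external-pressure items, each rated 0 (no influence),
# 1 (moderate influence) or 2 (major influence), giving a 0-36 composite.
from typing import Mapping

N_ITEMS = 18

def pep_score(responses: Mapping[str, int]) -> int:
    """Sum of the 18 external-pressure items; raises if items or values are invalid."""
    if len(responses) != N_ITEMS:
        raise ValueError(f"expected {N_ITEMS} items, got {len(responses)}")
    if any(v not in (0, 1, 2) for v in responses.values()):
        raise ValueError("each item must be scored 0, 1 or 2")
    return sum(responses.values())

# Example: a CEO rating 5 pressures as 'major', 6 as 'moderate' and 7 as 'none'
example = {f"pressure_{i:02d}": v for i, v in enumerate([2] * 5 + [1] * 6 + [0] * 7)}
print(pep_score(example))  # 16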
Results and conclusions: We collected data from 155 hospitals in 7 countries. The external pressures CEOs perceived as strongest were governmental policy on quality and safety in health care, legislation for internal quality systems, public health/sanitary inspection, hospital accreditation and quality system certification (ISO 9004). Boards discussed quality performance more frequently when the CEO perceived more external pressure (regression coefficient b=4.23; p=0.004). Discussing quality performance more often at board meetings was associated with a higher quality management system score (regression coefficient b=2.53; SE=1.16; p=0.030). Perceived external pressure was not associated with any of the quality management indices. Having quality on the board's agenda allows board members to review and discuss quality performance more often in order to improve their hospital's quality management.
3.4 Discussion
This is the first large-scale quantitative study exploring the implementation and impact of quality management systems in European hospitals. It represents a major advancement over previous research by including clinical process measures and patient outcomes.
This project reflects perhaps the most comprehensive conceptualization of the functioning of hospital quality management systems assessed using an empirical, quantitative approach. We collected a large amount of data from hospitals in 7 countries, including professional surveys (management and baseline), patient surveys, clinical indicators from medical records, and administrative data from participating hospitals. Overall, 188 hospitals participated in the data collection effort, including surveys of 9,857 professionals and 6,536 patients, 9,082 chart reviews, 74 external visits, and routine data from 182 hospitals. The active participation of hospitals and the high response rates achieved across measures reflect the dynamic field test coordination strategy, which included periodic follow-up and feedback to hospital coordinators. The project also led to the development of new measures, some of which are reported in this supplement, that can be used in future research.
Limitations of the study
This study has a number of limitations that are worth highlighting. First, the study was not designed or powered to report results for each country. This was not an objective and would have rendered the design infeasible within the budget available for this grant. The data gathered from countries with different health systems allowed a broad view of quality improvement strategies across countries. We pooled the data across countries and addressed heterogeneity in country-level estimates through a statistical modelling approach that allowed country-level baselines and effects to vary. Further covariate adjustment included hospital size, ownership and teaching status.
A second limitation is that we used a cross-sectional study design. This was considered most appropriate for hypothesis testing in this project because it requires a shorter time commitment and fewer resources, although it ultimately does not allow causality to be established conclusively. We dealt with this issue by using directed acyclic graphs to guide the development of our statistical models, incorporating theory and knowledge derived from previous research findings. This approach also allowed us to address competing hypotheses about causal relationships.
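The following toy Python sketch illustrates how a directed acyclic graph can suggest adjustment variables for a given exposure-outcome pair. The graph, variable names and the simple common-ancestor heuristic are simplified assumptions for illustration and do not reproduce the DAG or procedure actually used in the project.

# Toy DAG-guided covariate selection using networkx.
import networkx as nx

dag = nx.DiGraph([
    ("country", "hospital_qms"),
    ("country", "clinical_process"),
    ("hospital_size", "hospital_qms"),
    ("hospital_size", "clinical_process"),
    ("hospital_qms", "dept_quality_strategies"),
    ("hospital_qms", "clinical_process"),
    ("dept_quality_strategies", "clinical_process"),
])
assert nx.is_directed_acyclic_graph(dag)

exposure, outcome = "dept_quality_strategies", "clinical_process"

# Common ancestors of exposure and outcome are candidate confounders to adjust
# for (a heuristic; a full back-door analysis would also check every path and
# exclude variables lying on the causal path itself).
confounders = nx.ancestors(dag, exposure) & nx.ancestors(dag, outcome)
print(sorted(confounders))  # ['country', 'hospital_qms', 'hospital_size']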
A third limitation relates to the sampling strategy. Although sampling was conducted randomly, generalization to participating countries and hospitals is limited because of possible self-selection of hospitals participating in the project. This is reflected in the different acceptance rates across countries. The reasons for refusal reported by the countries with the lowest acceptance rates mainly related to research fatigue, burnout with regard to quality management issues, time constraints and competing pressures around efficiency and productivity. Selection bias towards better performing hospitals is also possible. Replacing hospitals that refused participation by oversampling hospitals with a similar profile may nevertheless have influenced the mix of hospitals finally participating in the study. However, our analysis shows substantial variation in the implementation of quality criteria and in outcomes, so it seems unlikely that the effect of self-selection is very strong.
A fourth limitation is that the project design combined quantitative and qualitative approaches that rely largely on self-reporting and may potentially induce social desirability bias. However, this is a cause of concern mostly for measures with single respondents (whose reliability we were able to check through other data sources), and less so for other measures.
Measuring clinical practice through indicators sourced from clinical record review could potentially reflect systematic biases depending on coding practices in a country; however, such a bias would not affect the overall findings of the project, as we do not report country-level results and we adjust the statistical modelling for the effects of country, hospital size, ownership and teaching status. In conclusion, while the design is prone to a number of limitations, it was the only feasible design within the time frame and budget. All key problems emanating from the design were anticipated from a clinical epidemiology perspective and addressed through further design features and the approach to statistical modelling.
Potential Impact:
4. The potential impact (including the socio-economic impact and the wider societal implications of the project so far) and the main dissemination activities and exploitation of results
In this section we describe DUQuE recommendations and the impact of the DUQuE project and its socio-economic and wider societal implications. This section is structured as follows:
1. Project recommendations synthesised in an appraisal scheme called: Seven ways to improve quality and safety in your hospital
2. Impact on knowledge generation: the generation of new knowledge that enhances the understanding of the topic area and enlightens further scientific investigation by the project team and others.
3. Impact on knowledge translation: conscious efforts to facilitate the communication of the main findings, address key stakeholders and produce outputs of particular relevance to the target community (e.g. hospital clinicians and managers and those purchasing hospital services)
4. Broader policy impact: building and strengthening EU and international health policy regarding the quality of health care.
4.1 Project recommendations
In developing the DUQuE recommendations we acknowledge that research on quality improvement methods has produced a wide range of assessment tools, statistical techniques and improvement applications over the last decade. For those in charge of planning and implementing quality management, this wealth of information on quality and safety interventions creates a problem. In our recommendations we therefore aimed to include not only DUQuE findings but also a synthesis of the multiple sources of evidence available. The approach was not to cover every quality strategy but rather to take a bird's-eye view that supports managers in reflecting on their organization-wide approaches to ensuring quality and safety, and to provide an updated framework for assessing quality and safety improvement in hospitals. The objective of these recommendations was to answer the questions "Where to start?" and "How can hundreds of approaches be translated into a coherent strategy?"
In implementing quality improvement actions, attention needs to be given to the role of context. Contextual factors, such as staffing ratios, supportive cultures and the way performance is reported back, have a major influence on the effectiveness of quality improvement. Hospitals need to be aware of these contextual factors in designing, implementing and improving their quality management systems. The Consortium conducted a series of systematic reviews on the key strategies to improve quality and safety in hospitals and extracted information on their effectiveness and on the contextual factors affecting their implementation. The recommendations draw on state-of-the-art research and synthesise the results of the DUQuE Project with other large-scale empirical studies, systematic reviews and expert knowledge. The language of the recommendations was adapted to be accessible to managers, policy makers and payers.
Finally, the Consortium brings together a large group of health care quality researchers, stakeholders representing national/regional quality agencies, and clinicians and managers in charge of implementing quality systems and ensuring quality of care. Their expert knowledge, too, was used in the formulation of recommendations.
Recommendations were synthesised in an appraisal scheme called: Seven ways to improve quality and safety in your hospital.
The seven ways to improve quality and safety cover the following:
1. Align organizational processes and external pressure
2. Putting quality high on the agenda
3. Linking external systems of accountability and internal improvement
4. Assure expertise, clear responsibilities and teamwork at departmental level
5. Organize care pathways based on evidence for quality and patient safety interventions
6. Assure pathway-oriented information systems
7. Assessment and feedback
For each of the seven strategies, we provide an overview on the underlying evidence base, highlight key issues for further development and suggest prompts that can be used by quality managers and their teams to guide local question asking and reflection. Multiple assessment tools are referred to that can be used to support reflection processes with quantitative measurement.
These recommendations do not aim to be prescriptive. Hospitals differ structurally, in terms of the services they are providing (and the patients that are receiving them), their professional workforce and the maturity of their quality and safety management systems. It remains the responsibility of professionals and managers to set local priorities for their engagement with quality and safety. However, some of the lessons synthesized here are likely to be relevant for any hospital, whether a community hospital or large university clinic, whether providing internationally recognized services or operating in a resource-constrained environment.
1. Align external pressure with organizational processes
External pressure may take different forms, such as external assessment programmes (accreditation or certification) or pressure enforced by public inquiries or media scandals.
There is mounting evidence to suggest that undergoing accreditation improves the organization of work processes and promotes change and professional development. The effectiveness of accreditation and certification programmes has been researched in close to 100 scientific studies. External assessment supports the assurance of payers, patients and the public at large. It helps to raise the bar, stimulates internal quality improvement and helps to align work processes. Nevertheless, despite these effects, the impact of health care accreditation and certification on health care outcomes remains unclear. It may thus be of particular advantage for hospitals that are aiming to clarify information procedures and work processes, but should not be mistaken for a single tool to improve health care outcomes.
External pressure also includes pressure enforced by governmental inspections, existing national programmes and the media. Media coverage often follows high-profile events and raises concern about the institution's reputation and individuals' liability. Hospitals frequently fail to learn from such cases and instead fall into defensive routines aimed at minimizing legal risk. Yet such events may be seized as an opportunity for change because of their power to create unity and a sense of belonging.
2. Putting quality high on the agenda
A common factor responsible for catastrophic failures in health care is the lack of leadership involvement. This is a decisive component that affects patient care even where care in clinical units is delivered by competent and dedicated professionals. Simply put, research suggests that hospitals in which leaders are involved in quality achieve better quality of care outcomes. One thing that we have learned is that the board and senior management have to be concerned with quality: quality needs to be on the agenda at the top level.
The causal mechanisms for this are not fully understood, but they cover elements such as leading by example, a non-blaming culture, adequate resourcing of key clinical areas, proactive monitoring of quality and safety indicators, and early intervention when problems arise. Leaders should realistically assess the performance of the organizations they represent, be aware of the quality metrics available in the organization and engage with the clinical teams who know the difficulties of quality improvement.
3. Supportive organization-wide systems for quality improvement
One key finding of the DUQuE Collaboration is that multiple quality systems operate within any hospital. These quality systems need to be well aligned to maximize impact and minimize unnecessary bureaucracy or documentation that takes time away from patient care. Departmental quality activities are strongly related to quality of care outcomes. Hospital quality management systems should therefore be designed to support departments in delivering high quality care.
Organization-wide processes and accountability systems are important, but quality improvement adds most value near clinical processes. Thus, organization-wide systems need to be aligned and support departments in delivering quality.
Firstly, hospital-wide quality management systems are necessary to establish priorities, structures (e.g. an infection committee), procedures (e.g. for the dissemination of knowledge and the updating of practice guidelines), data collection and quality monitoring systems. These systems are an important prerequisite for quality improvement in organizational units; however, they should be designed to support clinical improvement processes and patient-centred care rather than becoming an end in themselves.
Secondly, implementation of organization-wide policies needs to be monitored throughout the organization. Mission statements and a “tick-box mentality” are not enough. This can be assessed by evidence in documents, reports, files, records of compliance with policy, procedures and activities, and direct observation.
Thirdly, hospital wide quality management needs to translate into clinical quality improvement actions. Otherwise, it is at risk of being considered a bureaucratic exercise. A wide range of strategies exist at clinical level that should be assessed in relation to organization procedures, e.g. evidence in minutes/reports for sustainable prevention and measurement of infections, falls, pressure ulcers, medication, safe surgery. Current implementation and spread of these strategies needs to be monitored periodically.
These three ways of conceptualizing organization-wide quality management should be linked with quality improvement approaches at departmental level. Furthermore, it is not apparent that current quality management systems appropriately reflect outcomes that are of importance to patients and their families. It seems that our quality systems are designed to improve clinical processes and measure clinical outcomes, but they currently fail to capture what is important for patients and their families.
4. Expertise and responsibilities at departmental level and teamwork
High quality care cannot be provided without well-trained and motivated professionals. A key strategy to improve the quality of care is thus the recruitment, retention and development of professionals with the right competences. High performing hospitals often attract particularly motivated individuals, which cements their reputation. Conversely, hospitals without a track record in quality and safety, research output and reputation may have difficulty recruiting the best professionals. Key factors in delivering high quality care are recruiting professionals with the right competences and establishing clear responsibilities for care processes.
5. Organize pathways based on best evidence for quality and safety interventions
Pathways need to be based on the best evidence for quality and safety interventions. The majority of hospital departments still follow a traditional organizing principle based on medical specialization. To respond better to patients' current needs, an organization based on care pathways should be pursued in which all clinical activities are centred on the patient's overall journey.
Advantages of an organization based on care pathways are better standardization of care processes, better collaboration among clinicians, reduced variability and improved clinical outcomes. There is much we can still learn from evidence-based medicine: it is not just about professionals following guidelines, it is about organizing care according to the best evidence. A care pathway is more than a guideline content requirement. It reflects best evidence and bedside actions, but more importantly it is reflected in the overall organization of work, including the definition of professional roles, physical ward organization and strategies to ensure patient safety.
Patient safety strategies have to be in place where the clinical service is provided. This is not an add on, it is an integral component of organizing the care.
The implementation of care pathways is often challenging, as old patterns of care need to be overcome and new collaborations, often across specialties and professional groups, need to be established. Care pathways are associated with reduced costs, but they do not come for free: leadership support, financial resources for reorganization and staff training are required.
6. Pathway-oriented information systems
Hospital information systems (covering computerized clinical decision support systems in hospitals, electronic health records, computer-assisted diagnosis, reminders for preventive care or disease management or drug dosing and prescribing) have an enormous potential to improve quality and safety of health care.
The effectiveness of computerized clinical decision support systems has been evaluated in a wealth of studies (more than 300), often randomized controlled trials, so there is a strong evidence base for their effectiveness. No other organization could afford to continue using paper and pencil instead of maintaining sophisticated information systems to plan, deliver and control service provision.
Current implementation of health information technologies varies greatly between hospitals, even within national boundaries. Likewise, the evidence supporting these technologies is varied, and the implementation of hospital information systems can be resource intensive. A fully integrated electronic health record may not be necessary; in fact, the strongest evidence for quality and safety improvement points to specific medication ordering. For hospital information systems to be fit for the future, careful integration with clinical pathways within and outside the hospital is paramount.
7. Assessment and feedback
Audit and feedback are key quality improvement strategies, which can be applied individually or as part of multifaceted interventions. The assumption is that professionals will improve their performance when feedback demonstrates deficiencies in process or outcomes of care.
Audit and feedback has been researched in hundreds of studies, with more than 100 based on experimental or quasi-experimental designs. However, audit and feedback mechanisms differ with regard to the format, source and frequency of feedback, the instructions for improvement, baseline performance, the targeted behaviour, and the measures that make a difference to patients.
Audit and systematic monitoring need to be embedded in departmental quality management mechanisms, with all professionals participating and receiving feedback on performance. Hospitals engage in audit and feedback for a number of reasons. Many countries monitor the quality of care at national level, prospectively collect information and provide feedback on variations in provider performance. 'Closing the audit cycle' is a frequently used expression that points to deficiencies in making sense of, and using, audit data to drive improvement processes.
4.2 Impact on Knowledge generation
The DUQuE Project generated new knowledge in multiple ways, such as
- building a comprehensive conceptual model of hospital performance
- developing, validating and adapting measures to assess hospital quality management systems
- establishing baseline data for critical patient safety and quality of care standards
- investigating previously unspecified relationships between hospital quality management approaches, cultures, clinical processes and patient outcomes, and
- using methodological advances over traditional statistical modelling.
Development of a comprehensive model to guide the study and improvement of hospital quality management systems: The DUQuE conceptual model is perhaps the most comprehensive model of hospital performance. Previous conceptual models from the OECD and WHO stress broader determinants of population health and different performance domains, respectively, but none of the existing models incorporates the wide range of factors that in practice bear on the quality and safety of care. For example, the role of hospital boards and professional groups in ensuring quality and safety, the role of hospital cultures that enhance or hinder improvement efforts, and the variability of hospital functioning across different departments are widely accepted as important for improving quality and safety. These are now reflected in the DUQuE conceptual model to inform future studies and raise awareness of the complex relationships involved in studying and improving hospital services.
Development, translation and adaptation of measures to assess hospital quality management systems: As a key output of the project, seven measures to assess the implementation of quality management systems were developed. These measures are conceptually complementary and address different organizational levels, e.g. hospital wide management and department specific management. The measures underwent appropriate psychometric validation and are openly available to the public. Other measures that were developed include scales and indices to assess professional involvement, professionalism and perceived external pressure. Moreover, a set of clinical process indicators was developed that may greatly enhance further pan-European comparisons of the quality and safety of care, given that they were developed considering issues of data availability and coding quality in the participating countries and general relevance to a wide group of patients. In addition, various existing measures were assessed in the DUQuE study, including social capital, the Competing Values Framework, subscales of the Safety Attitude Questionnaire, patient involvement in quality management, patient information strategies in departments, the NORPEQ patient experience measure, the Health Care Transition measure, and items on patient perceived involvement in care and hospital recommendation. The questionnaires to collect all the data have been made available in eight EU languages and are publicly available on the DUQuE web pages, facilitating further assessment, development, research, adaptation and implementation. The DUQuE team has already received requests from various countries to provide support in national studies of the quality of care.
Baseline assessment of quality and safety measures: As an intermediate output of the more sophisticated statistical analysis, the DUQuE assessments improve our understanding of the baseline performance of EU hospitals on a wide range of measures relevant for improving quality and safety. This includes, for example awareness of hospital leadership of quality and safety issues, predominant cultures according to the Competing Values Framework, the level of team work and safety culture across hospital departments, the level at which patients are involved at hospital and departmental level in quality management functions. For the outcome measures, the project also established levels of compliance with evidence based clinical process measures and surveys on patient reported experience with care. While the study was not designed to report these outcomes in a representative manner for each of the participating countries, the information on baseline performance is an invaluable source to lead further inquiry and action at EU level. For example, quality and safety data is rarely discussed at leadership levels, the current implementation of key patient safety recommendations such as patient identification or safe storage of potassium chloride is still low in EU countries, actual levels of involvement of patients in designing and assessing quality services defy policy rhetoric, and hospitals frequently fail to ensure that patients receive their medication. Some of the findings emanating from the project require urgent further investigation at EU and Member State level.
Investigating previously unspecified relationships between hospital quality management approaches, cultures, clinical processes and patient outcomes: The principal objective of the DUQuE study was to assess associations between hospital and departmental level constructs and clinical processes of care and patient outcomes. Details of these findings are described in section 3. The findings of the project are innovative and greatly contribute to the understanding of how to improve the implementation of quality and safety within the hospital setting. The study demonstrated that well designed quality management systems make a difference and are associated with higher implementation of clinical care processes and patient outcomes. This association is stronger the closer the action is to clinical decision making process. Hospital-wide quality management systems are necessary to establish priorities, structures (i.e. infection committee), procedures (i.e. for the dissemination of knowledge and the update of practice guidelines), data collection and quality monitoring systems. Thus, these systems are an important prerequisite for quality improvement in organisational units; however, they should be designed to be supportive of clinical improvement processes and patient-centred care rather than becoming an end in itself. Importantly, most of the previous research in the topic area emanated from North-America and DUQUE added a clear EU research perspective on the impact of quality improvement to the international research arena that is already being recognized abroad (see further below).
Methodological advances: In studying the multiple relationships between hospital and departmental constructs and the range of clinical process and patient outcome measures, we used a methodological approach hitherto rarely applied in the field of quality improvement research: the use of directed acyclic graphs. This non-parametric approach was used to guide our statistical models, incorporating theory and knowledge derived from previous research findings, rather than building statistical models that ignore the complex relationships between the variables. All analyses were conducted using cutting edge statistical modelling that made full use of the rich data embedded in the multi-level structure of the project.
4.3 Impact on Knowledge translation
The DUQuE Project facilitated knowledge translation as follows:
- building on, using and strengthening European policy and research networks
- facilitating hospital benchmarking
- encouraging further evidence-based development of hospital quality management systems
- a comprehensive publication and dissemination strategy
Building on, using and strengthening European policy and research networks: DUQuE was devised and implemented according to the principles of innovation, parsimony, synergy and sustainability. Collaborations were formed with leading researchers internationally, in particular in the USA and in Australia, to identify relevant experiences and methodological advances towards studying hospital quality improvement systems (e.g. Professor Cohen, University of Boston, Professor Braithwaite, University of New South Wales). Duplication of efforts was avoided by systematically building on existing work (COMAC, BIOMED, ENQUAL, MARQuIS) and by collaborating with existing international efforts to develop and validate performance indicators on quality and safety in health care (for example, through the OECD’s Health Care Quality Indicator programme, the EUNetPas project on patient safety indicator development or the Joint Commission’s indicator project). We invited relevant European clinical societies (such as the European Stroke Organisation (ESO, European Society of Cardiology (ESC), European Board and College of Obstetrics and Gynaecology (EBCOG), European Midwives Association, and the European Orthopaedic Research Society), national authorities and key experts to comment on the devised indicator set using a systematic hearing process. In each of the participating countries, we worked closely with national coordinators in order to harmonize the DUQuE data collection efforts with on-going national and regional activities, ensure participation of hospitals, meeting legal confidentiality requirements, adapting measurement systems and maximizing the local impact of the research project. As part of this process, a series of capacity building workshops were organized with representatives from the participating countries and junior staff in charge of data collection to standardize processes while at the same time ensuring cross-country learning and exchange of ideas. Through these efforts a research infrastructure was established that can be used in further (EU) projects. The final project conference organized in December 2012 in Berlin demonstrated how large the DUQuE network had grown, with more than 110 participants from 21 countries attending, representing senior level representatives of international organizations such as the World Health Organization, hospital managers, purchasers, NGO’s, patient organizations, and academia. As a result of the project, a strong network was formed that represents a major resource for future actions in the field of improving the quality of hospital services. This network already established connections with other EU projects, such as collaboration with the QUASER project regarding guidance to hospitals or a collaboration with the COST-ACTION projects on evaluating the role of professionals in hospital leadership.
Beyond the EU, a close collaboration was established with the University of New South Wales that obtained a grant to replicate the DUQuE study in the context of the Australian health systems. This grant, officially to commence in January 2014, will allow comparisons between EU countries and Australia and institutionalize strong working relationships that will yield future benefits. This is of particular importance as a very strong health services research group exists in New South Wales that can give important stimuli to EU research and policy.
Facilitating hospital benchmarking: The DUQuE project designed an online benchmarking platform as part of the compact with hospitals that participated in the project and assumed a considerable burden of data collection. The platform was released following the final project conference and provides access to the rich comparative data. Following consultations with country coordinators, it was designed to display a range of constructs in the domains of quality management systems, leadership, clinical effectiveness and patient outcomes. Funnel plots and other forms of graphical presentation are used to position each individual hospital against the others and to identify where performance is systematically below or above expected levels (using 95% and 99.8% control limits). To make the comparisons more relevant to local users, benchmarking data can be filtered by a number of criteria, such as hospital type or size. On average, more than three accesses per hospital to in-depth analyses of the data have been registered. While the uptake of the platform has not been evaluated systematically, feedback from country coordinators suggests that the data have been highly appreciated by individual hospitals and used to target discussions at leadership level on the performance of the hospital and potential improvement actions.
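For illustration, the following Python sketch computes approximate funnel-plot control limits for a proportion-type indicator in the spirit described above. The hospital figures are invented, and the normal-approximation limits (z of about 1.96 and 3.09 for 95% and 99.8%) are one common choice rather than necessarily the exact method implemented on the platform.

# Funnel-plot sketch: flag hospitals whose rate falls outside control limits
# drawn around the overall rate, with limits narrowing as case volume grows.
import math

Z_LIMITS = {"95%": 1.96, "99.8%": 3.09}

def funnel_limits(overall_rate: float, n: int) -> dict:
    """Approximate control limits around the overall rate for a hospital with n cases."""
    se = math.sqrt(overall_rate * (1 - overall_rate) / n)
    return {
        label: (max(0.0, overall_rate - z * se), min(1.0, overall_rate + z * se))
        for label, z in Z_LIMITS.items()
    }

# Invented example data: (events, cases) per hospital
hospitals = {"A": (42, 60), "B": (110, 130), "C": (18, 45)}
overall = sum(e for e, _ in hospitals.values()) / sum(n for _, n in hospitals.values())

for name, (events, cases) in hospitals.items():
    rate = events / cases
    lo95, hi95 = funnel_limits(overall, cases)["95%"]
    flag = "outside 95% limits" if rate < lo95 or rate > hi95 else "within 95% limits"
    print(f"Hospital {name}: rate={rate:.2f} ({flag})")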
Encouraging further evidence-based development of hospital quality management systems: A central output of the DUQuE project is the DUQUE Appraisal Tool. This is a simple-to-read document targeted at decision-makers in hospitals, purchasing agencies and policy makers. It synthesizes both existing knowledge and DUQuE findings to base recommendations on a broad evidence base and to contextualise and enrich the findings of the cross-sectional analysis. The latter is of particular relevance to allow inference regarding causality, direction and magnitude of effects. Specifically, the tool maps the findings emerging from the multiple DUQuE scientific articles currently in production with the results of an overview of systematic review on the effectiveness of quality improvement interventions and the contextual factors affecting their implementation. The overarching messages from this work were distilled in ‘Seven ways to improve quality and safety in hospitals’ and are currently under production as an electronic book written in plain language. This eBook also includes quotes and contributions from country coordinators, experts, exhaustive references to the available evidence, and ‘prompts for action’ to guide local improvements. The release of the eBook is scheduled to coincide with the publication of the DUQuE supplement (see below).
A comprehensive publication and dissemination strategy: The DUQuE team pursues a three-pronged publication strategy to maximize scientific impact and dissemination. First, a collection of twelve scientific articles distilling key findings corresponding to the four research objectives of the project is planned for publication in 2014 as a supplement in the International Journal for Quality in Health Care. Secondly, another dozen scientific papers are currently being prepared for targeted submission to specialized high-impact journals. Thirdly, national coordinators are encouraged to use the existing data for publications in national medical and health services journals, focusing on concrete lessons for specific national or regional contexts. The majority of these papers will be made freely available in the public domain. All current papers are captured in the project's publication plan and continue to be monitored by the project team to ensure adherence to the publication criteria accepted by the consortium and to avoid overlap between papers. All project partners have been provided with the complete dataset to facilitate further exploitation of the data before it is released to the public domain according to the time frame specified in the consortium agreement. In addition to the scientific publications, the results of the project were disseminated via the dedicated project webpage (www.duque.eu; access statistics in section 5) and at a large number of national and international meetings, symposia and conferences addressing different target groups, such as policy makers, hospital managers and the scientific community. Details of the dissemination efforts so far are summarized in section B of this report. Country coordinators' dissemination efforts are also included under the coordinator identification in the report.
4.4 Broader policy issues
The DUQuE Project addressed broader policy issues as follows:
- Choosing and contracting hospital services across EU borders
- Towards a debate on patient involvement in hospital functions
- Relevance to a wide group of stakeholders
Choosing and contracting hospital services across EU borders: The DUQuE project demonstrated that the implementation of various quality and safety recommendations is highly variable between hospitals. Agencies purchasing hospital services may want to include in their contracts a requirement that hospitals adhere to these recommendations (see section 3). Likewise, the findings strongly suggest that targeted quality management approaches at departmental level are positively associated with compliance with clinical process indicators. This knowledge, too, may be translated into specific contractual requirements. For patients, there is currently limited information with which to compare the quality and safety performance of hospitals in the EU. This is in contrast to Directive 2011/24/EU of the European Parliament and the Council of the European Union on the rights of EU patients to receive treatment in another EU country, which stipulates that patients should have access to information on the quality and safety of care to make an informed choice about their treatment and compare providers. Some EU countries produce comparative reports on hospital performance; however, the data often differ in terms of which hospitals participate, the types of service covered, indicator definitions, periodicity of reporting and feedback mechanisms. The DUQuE project investigated the feasibility of using routine administrative hospital databases to facilitate such comparative analysis. In national contexts, such data are increasingly used to compare service utilization and outcomes, including mortality and complications, between hospitals. While at present hospital administrative databases do not seem to be an appropriate source of information for comparing hospital performance across the EU, this is a dynamic field and such comparisons might become possible in the medium to long term. Research should target an in-depth comparative analysis of the issues involved in using administrative data for comparisons of hospital performance in EU countries.
Towards a debate on patient involvement in hospital functions: There is currently a strong rhetoric in policy circles at national and EU levels that supports the involvement of patients in a wide range of functions, including participation in the design and assessment of hospital services. This rhetoric is motivated by ethical considerations on the role of users in health care, but also by pragmatic considerations assuming that such involvement can easily be realized, is intrinsically beneficial and leads to better and more patient-centred care delivery. The DUQuE project can make important contributions to this debate, as various measurements of patient involvement in quality functions were embedded in the project. First, the project established that the involvement of patients in quality management is low in EU hospitals. Secondly, such involvement is more pronounced at higher organizational levels (hospital-wide strategies) than in lower units (departments), where a concrete impact on the organization of services is more likely. Thirdly, our analysis suggests that the involvement of patients in quality functions is inversely associated with the implementation of patient-centred care strategies (such as having patient rights policies at ward level), and that neither is associated with patients' experiences in the four departments regarding overall perception of the quality of care or perceived involvement in clinical decision making. These results should not be interpreted as a lack of effect of involving patients in quality functions, nor should they call into question the general principles of greater participation and transparency in health care. However, the findings do suggest that these functions are not well understood and that future debate should go beyond principles to generate case stories and provide detailed guidance on the best ways of involving patients in quality functions.
Relevance to a wide group of stakeholders: The DUQuE project addresses a number of issues fundamental to the improvement of health care quality. As such, its findings are of high relevance to multiple stakeholders, including the Commission, the Directorate-General for Research, the Directorate-General for Health and Consumer Affairs, the Directorate-General for Employment, Social Affairs and Equal Opportunities and the Council of Europe. Outside the Commission, the main target users are Member States, hospital managers and clinicians with leadership functions, and purchasing agencies, as well as international organizations such as the World Health Organization and the Organisation for Economic Co-operation and Development. The outputs described above, in particular the guidance tool for decision makers, will be presented to this broad group of stakeholders, and we expect this to lead to broader health policy questions regarding strategies to improve quality in health care.
List of Websites:
5. The address of the project public website, if applicable as well as relevant contact details
Project coordinator:
Prof Rosa Sunol MD, PhD
Director of Avedis Donabedian Research Institute – Universitat Autónoma de Barcelona
C/Provenza 293. Pral. - 08037 Barcelona.
Tel. +34 932076608
fad@fadq.org
www.fadq.org
Project co-chair:
Oliver Groene, MA, MSc, PhD
Lecturer in Health Services Research
Faculty of Public Health and Policy
London School of Hygiene & Tropical Medicine
15-17 Tavistock Place, London WC1H 9SH, UK
Tel: +44 (0) 207 927 2785
Fax: +44 (0) 207 927 2701
http://www.lshtm.ac.uk/people/groene.oliver
Project website: www.duque.eu
Contact details:
Webmaster: European Hospital and Healthcare Federation (HOPE)
Avenue Marnix 30 - 1000 Brussels
Tel: +32 2 742 13 21 – Fax: +32 2 742 13 25
E-mail: eu@hope.be
Webdesign: Dominique Winter, Photo by cm11/photocase.com