F Davidoff, P Batalden, D Stevens, G Ogrinc, S Mooney.
Abstract
In 2005, draft guidelines were published for reporting studies of quality improvement interventions as the initial step in a consensus process for development of a more definitive version. This article contains the full revised version of the guidelines, which the authors refer to as SQUIRE (Standards for QUality Improvement Reporting Excellence). This paper also describes the consensus process, which included informal feedback from authors, editors and peer reviewers who used the guidelines; formal written commentaries; input from a group of publication guideline developers; ongoing review of the literature on the epistemology of improvement and methods for evaluating complex social programmes; a two-day meeting of stakeholders for critical discussion and debate of the guidelines' content and wording; and commentary on sequential versions of the guidelines from an expert consultant group. Finally, the authors consider the major differences between SQUIRE and the initial draft guidelines; limitations of and unresolved questions about SQUIRE; ancillary supporting documents and alternative versions that are under development; and plans for dissemination, testing and further development of SQUIRE.Entities:
Mesh:
Year: 2008 PMID: 18836063 PMCID: PMC2773518 DOI: 10.1136/qshc.2008.029066
Source DB: PubMed Journal: Qual Saf Health Care ISSN: 1475-3898
SQUIRE guidelines (Standards for QUality Improvement Reporting Excellence)*
| Text section; item number and name | Section or item description |
|---|---|
| Title and abstract | Did you provide clear and accurate information for finding, indexing, and scanning your paper? |
| 1 Title | (a) Indicates the article concerns the improvement of quality (broadly defined to include the safety, effectiveness, patient-centredness, timeliness, efficiency and equity of care) |
| | (b) States the specific aim of the intervention |
| | (c) Specifies the study method used (for example, “A qualitative study,” or “A randomised cluster trial”) |
| 2 Abstract | Summarises precisely all key information from various sections of the text using the abstract format of the intended publication |
| Introduction | Why did you start? |
| 3 Background knowledge | Provides a brief, non-selective summary of current knowledge of the care problem being addressed, and characteristics of organisations in which it occurs |
| 4 Local problem | Describes the nature and severity of the specific local problem or system dysfunction that was addressed |
| 5 Intended improvement | (a) Describes the specific aim (changes/improvements in care processes and patient outcomes) of the proposed intervention |
| | (b) Specifies who (champions, supporters) and what (events, observations) triggered the decision to make changes, and why now (timing) |
| 6 Study question | States precisely the primary improvement-related question and any secondary questions that the study of the intervention was designed to answer |
| Methods | What did you do? |
| 7 Ethical issues | Describes ethical aspects of implementing and studying the improvement, such as privacy concerns, protection of participants’ physical wellbeing and potential author conflicts of interest, and how ethical concerns were addressed |
| 8 Setting | Specifies how elements of the local care environment considered most likely to influence change/improvement in the involved site or sites were identified and characterised |
| 9 Planning the intervention | (a) Describes the intervention and its component parts in sufficient detail that others could reproduce it |
| | (b) Indicates main factors that contributed to choice of the specific intervention (for example, analysis of causes of dysfunction; matching relevant improvement experience of others with the local situation) |
| | (c) Outlines initial plans for how the intervention was to be implemented—for example, what was to be done (initial steps; functions to be accomplished by those steps; how tests of change would be used to modify intervention) and by whom (intended roles, qualifications, and training of staff) |
| 10 Planning the study of the intervention | (a) Outlines plans for assessing how well the intervention was implemented (dose or intensity of exposure) |
| | (b) Describes mechanisms by which intervention components were expected to cause changes, and plans for testing whether those mechanisms were effective |
| | (c) Identifies the study design (for example, observational, quasi-experimental, experimental) chosen for measuring impact of the intervention on primary and secondary outcomes, if applicable |
| | (d) Explains plans for implementing essential aspects of the chosen study design, as described in publication guidelines for specific designs, if applicable (see, for example, |
| | (e) Describes aspects of the study design that specifically concerned internal validity (integrity of the data) and external validity (generalisability) |
| 11 Methods of evaluation | (a) Describes instruments and procedures (qualitative, quantitative or mixed) used to assess (a) the effectiveness of implementation, (b) the contributions of intervention components and context factors to effectiveness of the intervention and (c) primary and secondary outcomes |
| | (b) Reports efforts to validate and test reliability of assessment instruments |
| | (c) Explains methods used to assure data quality and adequacy (for example, blinding; repeating measurements and data extraction; training in data collection; collection of sufficient baseline measurements) |
| 12 Analysis | (a) Provides details of qualitative and quantitative (statistical) methods used to draw inferences from the data |
| | (b) Aligns unit of analysis with level at which the intervention was implemented, if applicable |
| | (c) Specifies degree of variability expected in implementation, change expected in primary outcome (effect size) and ability of study design (including size) to detect such effects |
| | (d) Describes analytical methods used to demonstrate effects of time as a variable (for example, statistical process control) |
| Results | What did you find? |
| 13 Outcomes | (a) Nature of setting and improvement intervention |
| | (i) Characterises relevant elements of setting or settings (for example, geography, physical resources, organisational culture, history of change efforts) and structures and patterns of care (for example, staffing, leadership) that provided context for the intervention |
| | (ii) Explains the actual course of the intervention (for example, sequence of steps, events or phases; type and number of participants at key points), preferably using a time-line diagram or flow chart |
| | (iii) Documents degree of success in implementing intervention components |
| | (iv) Describes how and why the initial plan evolved, and the most important lessons learned from that evolution, particularly the effects of internal feedback from tests of change (reflexiveness) |
| | (b) Changes in processes of care and patient outcomes associated with the intervention |
| | (i) Presents data on changes observed in the care delivery process |
| | (ii) Presents data on changes observed in measures of patient outcome (for example, morbidity, mortality, function, patient/staff satisfaction, service utilisation, cost, care disparities) |
| | (iii) Considers benefits, harms, unexpected results, problems, failures |
| | (iv) Presents evidence regarding the strength of association between observed changes/improvements and intervention components/context factors |
| | (v) Includes summary of missing data for intervention and outcomes |
| Discussion | What do the findings mean? |
| 14 Summary | (a) Summarises the most important successes and difficulties in implementing intervention components, and main changes observed in care delivery and clinical outcomes |
| | (b) Highlights the study’s particular strengths |
| 15 Relation to other evidence | Compares and contrasts study results with relevant findings of others, drawing on broad review of the literature; use of a summary table may be helpful in building on existing evidence |
| 16 Limitations | (a) Considers possible sources of confounding, bias or imprecision in design, measurement, and analysis that might have affected study outcomes (internal validity) |
| | (b) Explores factors that could affect generalisability (external validity)—for example, representativeness of participants; effectiveness of implementation; dose-response effects; features of local care setting |
| | (c) Addresses likelihood that observed gains may weaken over time, and describes plans, if any, for monitoring and maintaining improvement; explicitly states if such planning was not done |
| | (d) Reviews efforts made to minimise and adjust for study limitations |
| | (e) Assesses the effect of study limitations on interpretation and application of results |
| 17 Interpretation | (a) Explores possible reasons for differences between observed and expected outcomes |
| | (b) Draws inferences consistent with the strength of the data about causal mechanisms and size of observed changes, paying particular attention to components of the intervention and context factors that helped determine the intervention’s effectiveness (or lack thereof) and types of settings in which this intervention is most likely to be effective |
| | (c) Suggests steps that might be modified to improve future performance |
| | (d) Reviews issues of opportunity cost and actual financial cost of the intervention |
| 18 Conclusions | (a) Considers overall practical usefulness of the intervention |
| | (b) Suggests implications of this report for further studies of improvement interventions |
| Other information | Were there other factors relevant to the conduct and interpretation of the study? |
| 19 Funding | Describes funding sources, if any, and role of funding organisation in design, implementation, interpretation and publication of study |
*These guidelines provide a framework for reporting formal, planned studies designed to assess the nature and effectiveness of interventions to improve the quality and safety of care. It may not always be appropriate or even possible to include information about every numbered guideline item in reports of original studies, but authors should at least consider every item in writing their reports. Although each major section (that is, Introduction, Methods, Results and Discussion) of a published original study generally contains some information about the numbered items within that section, information about items from one section (for example, the Introduction) is also often needed in other sections (for example, the Discussion).
Key features of the SQUIRE guidelines that differ from the initial draft
| SQUIRE section and item | Added or changed feature of SQUIRE |
|---|---|
| Title and abstract | Focuses on the accessibility/retrievability of your article |
| 1 Title | Specifies what is meant by improvement, aim of intervention, study methods |
| 2 Abstract | Separated from title; elaborates on abstract format |
| Introduction | Focuses on the rationale of your study |
| 3 Background knowledge | Asks for characteristics of organisations in which the problem occurs |
| 4 Local problem | No change |
| 5 Intended improvement | More specific about improvement aim, plus what triggered the decision to make changes |
| 6 Study question | Distinguishes the study question from the aim of the improvement |
| Methods | Focuses on what you did |
| 7 Ethical issues | Added item: addresses concrete ethical issues rather than administrative ethics review |
| 8 Setting | Highlights context features relevant to why an intervention succeeds |
| 9 Planning the intervention | Requests specifics on intervention components, factors in choice of the intervention, initial plans for implementation |
| 10 Planning the study of the intervention | Added item: separates study of the interventions from the improvement methods themselves; requests specifics on intervention dose and mechanism, study design, issues of internal and external validity |
| 11 Methods of evaluation | Requests specifics on qualitative and quantitative methods; implementation effectiveness, mechanism, primary and secondary outcomes; data quality |
| 12 Analysis | Requests specifics on qualitative and quantitative approaches; appropriateness of the unit of analysis; power |
| Results | Focuses on what you found |
| 13 Outcomes | Includes characteristics of setting relevant to intervention mechanism; requests specifics on success of implementation, strength of association between intervention and outcomes, missing data |
| Discussion | Focuses on what your findings mean |
| 14 Summary | Highlights the study’s strengths |
| 15 Relation to other evidence | Suggests use of summary table of available published evidence |
| 16 Limitations | Requests specifics on maintenance of improvement and on challenges to internal and external validity |
| 17 Interpretation | Expands on the differences between observed and expected results; strength of the data; influence of context factors; modifications that might increase the intervention’s effectiveness; opportunity and actual financial costs |
| 18 Conclusions | No change |
| Other information | Focuses on factors external to the study itself that could affect findings and conclusions |
| 19 Funding | Requests specifics on funding sources and role of funders in conduct of the study |