
Evidence-based commissioning in the English NHS: who uses which sources of evidence? A survey 2010/2011.

Aileen Clarke1, Sian Taylor-Phillips, Jacky Swan, Emmanouil Gkeredakis, Penny Mills, John Powell, Davide Nicolini, Claudia Roginski, Harry Scarbrough, Amy Grove.   

Abstract

OBJECTIVES: To investigate types of evidence used by healthcare commissioners when making decisions and whether decisions were influenced by commissioners' experience, personal characteristics or role at work.
DESIGN: Cross-sectional survey of 345 National Health Service (NHS) staff members.
SETTING: The study was conducted across 11 English Primary Care Trusts between 2010 and 2011. PARTICIPANTS: A total of 440 staff involved in commissioning decisions and employed at NHS band 7 or above were invited to participate in the study. Of those, 345 (78%) completed all or a part of the survey. MAIN OUTCOME MEASURES: Participants were asked to rate how important different sources of evidence (empirical or practical) were in a recent decision that had been made. Backwards stepwise logistic regression analyses were undertaken to assess the contributions of age, gender and professional background, as well as the years of experience in NHS commissioning, pay grade and work role.
RESULTS: The extent to which empirical evidence was used for commissioning decisions in the NHS varied according to professional background. Only 50% of respondents stated that clinical guidelines and cost-effectiveness evidence were important for healthcare decisions. Respondents were more likely to report use of empirical evidence if they worked in Public Health rather than in other departments (p<0.0005; commissioning and contracts OR 0.32, 95% CI 0.18 to 0.57; finance OR 0.19, 95% CI 0.05 to 0.78; other departments OR 0.35, 95% CI 0.17 to 0.71) or if they were female rather than male (OR 1.8, 95% CI 1.01 to 3.1). Respondents were more likely to report use of practical evidence if they were more senior within the organisation (pay grade 8b or higher: OR 2.7, 95% CI 1.4 to 5.3, p=0.004 in comparison to lower pay grades).
CONCLUSIONS: Those trained in Public Health appeared more likely to use external empirical evidence, while those at higher pay scales were more likely to use practical evidence, when making commissioning decisions. Clearly, National Institute for Health and Clinical Excellence (NICE) guidance and government publications (eg, National Service Frameworks) are important for decision-making, but practical sources of evidence such as local intelligence, benchmarking data and expert advice are also influential. New Clinical Commissioning Groups will need a variety of different evidence sources and expert involvement to ensure that effective decisions are made for their populations.

Keywords:  Health Services Administration & Management; Public Health

Year:  2013        PMID: 23793669      PMCID: PMC3657651          DOI: 10.1136/bmjopen-2013-002714

Source DB:  PubMed          Journal:  BMJ Open        ISSN: 2044-6055            Impact factor:   2.692


This paper investigates the types of evidence (empirical or practical) used to make commissioning decisions in Primary Care Trusts (PCTs) in England. The extent to which empirical evidence is used for commissioning decisions in the National Health Service (NHS) varies according to professional background. Those trained in Public Health and working in commissioning were more likely to report using empirical evidence; senior commissioners were more likely to use practical local evidence. Although National Institute for Health and Clinical Excellence guidance and government publications are important for decision-making, the influence of local intelligence, benchmarking data and expert advice cannot be ignored. This is a nationwide study of 345 representative commissioning staff from 11 PCTs in England, with a high response rate (78%). It is an important resource for those designing and undertaking commissioning decision-making, with clear implications for an area in significant flux in the NHS following the introduction of Clinical Commissioning Groups. The findings would benefit from corroboration by further research using a prospective design to follow individual decisions through commissioning processes.

Strengths and limitations of this study

Variation exists in the sources of evidence used for decision-making in commissioning. Professional background, gender and employment status (seniority) had a significant impact on the choice of evidence used for decision-making in NHS Primary Care Trusts.

Introduction

In England, local healthcare commissioners plan, fund and review health service spending to ensure that sufficient services are available for the defined populations for whom they are responsible. Until recently, commissioning departments were located within Primary Care Trusts (PCTs).1 From 1 April 2013, PCTs in England are being abolished and responsibility for commissioning health services is moving to newly formed Clinical Commissioning Groups (CCGs). There are currently 229 planned CCGs, which will have responsibility for commissioning services for their local populations.1 Commissioning is a complex process undertaken by individuals from a variety of professional backgrounds and disciplines, including medicine, public health, nursing, the allied health professions, finance, accounting, contracting and business studies.2 Commissioners must take into account a number of factors such as local need, availability of resources and relevant available information and evidence. Little is known about how personal characteristics such as employment status and professional background influence the attitudes and practices of individuals responsible for making healthcare commissioning decisions. Research on evidence use suggests that evidence is defined in various ways and is used differently by the individuals involved.3–5 This difference in use is even clearer when evidence requires assimilation before it can be applied in practice,6–8 and it has implications for the creation and composition of CCGs in the National Health Service (NHS) in England. The purpose of this research was to examine the types of evidence local commissioners of health services used in practice during their own recent decision-making processes, and how this varied in relation to the characteristics, roles, professional backgrounds and experience of the decision-makers.

Methods

Survey development

A cross-sectional questionnaire survey of commissioners working in PCTs in England was conducted. Topic areas and questions were derived from published surveys, literature reviews and our own case study evaluation of commissioning processes in four PCT sites9 (see online supplementary appendix 1 for the survey). Prepiloting and piloting of the questionnaire were conducted with purposive samples of participants drawn from local NHS organisations, and the results were used to develop and refine the questionnaire and the process of administration. Participants were asked to provide demographic and work role details. They were invited to identify a commissioning decision they had recently taken part in and to answer questions about the size and nature of that decision. Commissioners might work as individuals, but would more usually be reporting on decisions taken as part of a group. In addition, participants were asked to what extent various sources of evidence were important in, and used during, the decision-making process. Responses were categorised as ‘very important’, ‘quite important’, of ‘limited importance’, ‘not important’ or ‘not used’. The types of evidence were classified into two categories, empirical/external (eg, clinical guidelines) and practical/internal (eg, expert advice from colleagues), based on the Weatherly scale,10 as shown in online supplementary appendix 2.
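The per-respondent scoring described above, and the dichotomisation applied later in the Data analysis section, can be sketched roughly as follows. The numeric coding of the five response categories and the cut-point are illustrative assumptions, as the survey instrument itself is in the supplementary appendix.

```python
import numpy as np
import pandas as pd

# Hypothetical coding: 4 = very important, 3 = quite important,
# 2 = limited importance, 1 = not important, 0 = not used;
# NaN marks a source the respondent skipped (excluded from the median,
# as described in the paper's Data analysis section).
scores = pd.DataFrame({
    "clinical_guidelines": [4, 2, np.nan],
    "cost_effectiveness":  [3, 1, 0],
    "benchmarking":        [4, 0, 1],
})

median = scores.median(axis=1, skipna=True)  # per-respondent median score
uses_empirical = median >= 3                 # 'quite/very important' vs the rest
print(uses_empirical.tolist())               # → [True, False, False]
```

The binary `uses_empirical` variable then serves as the dependent variable in the logistic regression described under Data analysis.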

Data collection

A sample size calculation was undertaken to allow us to detect a 10–15% difference in proportions (with 80% power at the 5% significance level) between professional work roles (clinically vs non-clinically qualified commissioners). This indicated that approximately 300 respondents would be required. Following discussion with the study sites, we estimated that we would need to invite approximately 450–500 potential participants from 10–15 PCTs, excluding pilot sites. Stratified random sampling of 15 PCTs from a total of 143 eligible sites was conducted. Contact details of all staff employed at NHS band 7 or above who were involved in commissioning decision-making were obtained from each identified PCT. This included staff from the Departments of Public Health, Finance, Purchasing, Commissioning, Contract Monitoring and Information Services, as well as the executive team. Participants were given information sheets and details of how to participate. They could complete the survey at face-to-face meetings held at their office or by email using an online electronic questionnaire host. Four additional reminders were sent to non-responders at two-weekly intervals. Questionnaires completed both manually and electronically were anonymised.
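The power calculation described above can be reproduced with the standard two-proportion formula. The 50% baseline proportion used below is an assumption for illustration only, since the paper does not report the baseline it used.

```python
import math
from scipy.stats import norm

def n_per_group(p1: float, p2: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """Sample size per group to detect a difference between two proportions
    (normal approximation, two-sided test)."""
    z_a = norm.ppf(1 - alpha / 2)   # 1.96 for alpha = 0.05
    z_b = norm.ppf(power)           # 0.84 for 80% power
    p_bar = (p1 + p2) / 2
    num = (z_a * math.sqrt(2 * p_bar * (1 - p_bar))
           + z_b * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(num / (p1 - p2) ** 2)

# Assuming a 50% baseline, a 15% absolute difference needs 170 per group,
# i.e. ~340 respondents in total -- broadly consistent with the ~300 quoted.
print(n_per_group(0.50, 0.35))  # → 170
```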

Data analysis

Logistic regression analysis was undertaken using SPSS. Responses were dichotomised so that the dependent variable was binary: whether the median score assigned to questions about the importance of sources of empirical evidence was ‘very/quite’ important, or of ‘limited importance/not important/not used’. Missing data for individual evidence sources were not included in calculation of the median for each respondent. Backwards stepwise logistic regression analyses were used with the following predictors: age, gender, years' experience in NHS commissioning, whether the respondent was a qualified medical doctor, pay (dichotomised to grade 7/8a or grade 8b/8c/8d/9 or above) and work role (Public Health, Commissioning and Contracts, Finance or other). All six predictors were entered in the model and assessed at each step against criteria to remain in the model (p<0.1). The analysis stopped when all predictors remaining in the model met the criteria. Model evaluation, goodness of fit and validation of predicted probabilities were calculated using the likelihood ratio test, the Hosmer-Lemeshow test and the c-statistic. The Wald statistic was used to determine whether each independent variable was a significant contributor. We obtained ethics approval from the Warwickshire Research Ethics Committee (09/H1211/63) and local ethics and research governance approval for each PCT included in the study.

Results

Participant characteristics

Fifteen PCTs were invited to participate in the study. In the first wave, six PCTs accepted and were randomised by strata. In the second wave of recruitment, five more PCTs accepted, giving a final total of 11 PCTs. Participating PCTs were representative of all PCTs in England in relation to demographic characteristics, general practitioner list size and Care Quality Commission (CQC) ratings. In total, the survey was circulated to 440 individuals across the 11 PCTs and 345 responded (a response rate of 78%). The lowest response rate in any single PCT was 72%. The median age band of participants was 45–54 years and 63% were female, almost exactly replicating the NHS Information Centre Infrastructure Support Staff Statistics for England.10 Thirty-one percent (n=107) of respondents were clinically qualified (medical, nursing or allied health professionals), although only 1% (n=3) were also currently employed primarily in a clinical setting. Sixty-nine percent (n=239) held a higher degree (Masters, NHS management qualification, clinical qualification or PhD). The largest single group of respondents (43%, n=149) worked in Commissioning and Contracts roles, with 33% (n=114) in Public Health roles. Seven percent (n=24) of respondents worked in the Finance department and 15% (n=52) were spread across other commissioning settings. Forty-seven percent of participants (n=163) had 5 years' or less experience in healthcare commissioning. Salaries ranged from £34 000 to over £100 000, with a median of approximately £45 000–£55 000, equivalent to band 8b on NHS pay scales.

Decision-making characteristics

The most common type of decision reported was ‘changing the organisation or design of a particular service’, selected by 189 participants (55%). This was followed by a ‘major decision on strategic direction’ (24%) and Individual Funding Requests (IFRs) (9%). The remaining 12% did not identify a relevant decision and were excluded from subsequent analysis. The specialty or service area within which the decision took place varied considerably. The largest single category of decisions affected between 1000 and 100 000 people (47%), and these were estimated to cost between £100 000 and £1 000 000 (40%). Figure 1 shows the reported use of different types of evidence in the decisions described. It presents the data as percentages to demonstrate the variation in the importance of the 17 factors assessed. The 10 empirical evidence factors (based on the Weatherly scale11) were more often scored as being of ‘limited importance’, ‘not important’ or ‘not used’ than the practical evidence factors (based on work undertaken by Gkeredakis et al7). Clinical guidelines, cost-effectiveness and benchmarking data achieved only a 50% importance rating.
Figure 1

Responses to questions on use of evidence sources. Reproduced with permission from Swan.10


Use and importance of empirical evidence

Logistic regression analysis (table 1) revealed that the significant predictors of the importance assigned to use of empirical evidence were gender and work role. Compared to their Public Health colleagues, respondents working in other departments were less likely to report use of empirical evidence (commissioning and contracts OR 0.32, 95% CI 0.18 to 0.57; finance OR 0.19, 95% CI 0.05 to 0.78; other departments OR 0.35, 95% CI 0.17 to 0.71). Female respondents were more likely than their male counterparts to report a higher importance of empirical evidence in their commissioning decisions (OR 1.8, 95% CI 1.01 to 3.1).
Table 1

Logistic regression analysis of the importance assigned to empirical evidence (median score of quite/very important in comparison to limited importance/not important/did not use)

Variable                            OR (95% CI)              Significance
Female gender                       1.779 (1.007 to 3.144)   0.047
Role                                                         <0.005
Role (commissioning and contracts)  0.320 (0.180 to 0.570)   <0.005
Role (finance)                      0.194 (0.048 to 0.779)   0.21
Role (other)                        0.349 (0.171 to 0.713)   0.04
Constant                            1.317                    0.337

Reference category for role is Public Health. Variable(s) entered on step 1: gender, age, years' experience of NHS commissioning, work role, pay and whether the respondent has any medical qualifications. Likelihood ratio test χ2 (2)=25.3, p<0.0005, Cox and Snell R2=0.09, Nagelkerke R2=0.12, Hosmer and Lemeshow χ2 (4)=1.1, p=0.9, c-statistic=0.65.

The model was a good fit to the data (Hosmer and Lemeshow χ2 (4)=1.1, p=0.9), and model predictions showed reasonable agreement with actual outcomes (c-statistic=0.65). No evidence was found that clinical training was associated with a greater likelihood of use of empirical evidence.
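For readers unfamiliar with the two fit diagnostics quoted here, a minimal sketch of the Hosmer-Lemeshow test and the c-statistic on simulated predictions is given below. The decile grouping is the conventional choice and an assumption, since the number of groups is not reported.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(1)
p_hat = rng.uniform(0.05, 0.95, 2000)       # simulated predicted probabilities
y = (rng.random(2000) < p_hat).astype(int)  # outcomes drawn from those probabilities

def hosmer_lemeshow(y, p_hat, groups=10):
    """Chi-square comparing observed vs expected events within risk deciles
    (df = groups - 2; a large p-value indicates adequate calibration)."""
    order = np.argsort(p_hat)
    stat = 0.0
    for chunk in np.array_split(order, groups):
        obs, exp, n_g = y[chunk].sum(), p_hat[chunk].sum(), len(chunk)
        stat += (obs - exp) ** 2 / (exp * (1 - exp / n_g))
    return stat, chi2.sf(stat, groups - 2)

def c_statistic(y, p_hat):
    """Probability that a random event case scores higher than a non-event
    (equivalent to the area under the ROC curve)."""
    pos, neg = p_hat[y == 1], p_hat[y == 0]
    wins = (pos[:, None] > neg[None, :]).sum() + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))

stat, p = hosmer_lemeshow(y, p_hat)
print(f"HL chi2 = {stat:.2f}, p = {p:.2f}; c-statistic = {c_statistic(y, p_hat):.2f}")
```

Because the simulated outcomes are drawn from the predicted probabilities themselves, the model is well calibrated by construction, which is the situation a non-significant Hosmer-Lemeshow statistic (as reported in the paper) is meant to indicate.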

Use and importance of practical evidence

Logistic regression analysis (table 2) showed that the only significant predictor of the importance assigned to the use of practical evidence was pay grade. Those participants at NHS pay grade 8b or above were more likely to report higher importance of practical evidence in their commissioning decisions (OR 2.7, 95% CI 1.4 to 5.3). Compared to their Public Health colleagues, respondents working in Commissioning and Contracts departments showed a trend towards being more likely to report use of practical evidence (OR 1.8, 95% CI 0.8 to 4.1), although this finding was not significant. The model was a good fit to the data (Hosmer and Lemeshow χ2 (7)=6.5, p=0.5), and model predictions showed reasonable agreement with actual outcomes (c-statistic=0.68). No evidence was found that clinical training was associated with a greater likelihood of use of practical evidence.
Table 2

Logistic regression analysis of the importance assigned to practical evidence (median score of quite/very important in comparison to limited importance/not important/did not use)

Variable                            OR (95% CI)              Significance
Female gender                       1.808 (0.909 to 3.595)   0.092
Role                                                         0.004
Role (commissioning and contracts)  1.820 (0.815 to 4.067)   0.144
Role (finance)                      0.492 (0.127 to 1.899)   0.303
Role (other)                        0.391 (0.174 to 0.879)   0.023
Pay grade 8b or above               2.708 (1.370 to 5.342)   0.004
Constant                            1.843                    0.125

Reference category for role is Public Health. Reference category for pay grade is band 7 or 8a. Variable(s) entered on step 1: gender, age, years' experience of NHS commissioning, work role, pay and whether the respondent has any medical qualifications. Likelihood ratio test χ2 (5)=22.8, p<0.0005, Cox and Snell R2=0.08, Nagelkerke R2=0.13, Hosmer and Lemeshow χ2 (7)=6.5, p=0.5, c-statistic=0.68.


Discussion

We undertook a cross-sectional survey in a representative sample of 345 commissioners in England to examine the use of empirical and practical evidence in commissioning decisions in the NHS. The aim of the research was to determine the types of evidence local commissioners of health services use in practice and to investigate whether the characteristics of the decision-maker and the decision size significantly influence the decision-making process. The extent to which empirical evidence is used for commissioning decision-making in the NHS varied between different types of commissioner. Female commissioners and those specialising in Public Health were more likely than those working in commissioning and contracts, finance or other job roles to report using empirical evidence in commissioning decisions. Those at a higher pay scale were more likely than their more junior colleagues to assign greater importance to the use of practical evidence. Those with clinical training did not appear more likely to use either empirical or practical evidence.

Strengths and limitations

Considerable effort was made to obtain and retain participation from PCTs. A 78% response rate was achieved in participating PCTs, and the resulting sample was representative of relevant NHS staff. Eleven PCTs were included in the final sample from an original target of 15. Refusal to participate was typically made on the grounds of extensive staff change and structural reorganisation occurring as a result of nationally driven organisational changes in PCTs and the NHS at the time. There was substantial variability in the organisational structures of participating PCTs. Because there was no register of PCT staff, we asked PCTs to provide their own lists of relevant participants. This was not independently verifiable, but we have no reason to believe that staff involved in commissioning were systematically omitted from the lists. We designed our own questions, rooted in an extensive qualitative investigation, to capture the types of evidence commissioners were using; these were fully piloted, but formal assessments of their validity and reliability were not undertaken. The findings may be subject to recall bias, although this should not differentially affect types of staff or the evidence sources they used. A more serious concern is social desirability bias, whereby respondents might have reported greater or lesser use of either empirical or practical evidence than in fact occurred. Although we were not able to measure this, immediate anonymisation of questionnaires completed at the PCT sites, and the reassurances of anonymity given for electronic completion, should have reduced the likelihood of this type of bias. The design of this study did not allow us to follow up decisions to investigate their outcomes, or to determine whether there was overlap between the decisions described by individual respondents.

Implications for the changing NHS

Previous research has identified several sources of external empirical evidence which are important in decision-making for commissioning in healthcare, including NICE guidance, National Service Frameworks and secondary research. This research further suggests that practical sources, such as local public health intelligence, expert advice and benchmarking data, are important and are actively being used by commissioners working in the NHS. It was striking that apparently 50% of the healthcare decisions made by commissioners in our study were not based on clinical evidence, and 50% were not based on consideration of the cost-effectiveness of the various decision options. This contradicts the evidence-based decision-making approach, which aims to resolve uncertainty about which treatments, procedures and interventions represent the best quality care for patients and which offer the best value for money for the NHS. There is an extensive and diverse discourse on the need for ‘evidence-based’ policy and commissioning.4–7 Critical to this debate is a shared definition of ‘evidence’. We found reports of empirical and practical evidence being used differently by different professional groups in real commissioning decisions undertaken in PCT settings. Those trained in Public Health, and women involved in commissioning, appeared more likely than others working in commissioning to use external empirical evidence in decisions. Those at higher pay scales were more likely to use practical evidence. This could result in variations in the outcomes of decision-making processes according to the professional group and employment status of the collective decision-makers, which has significant implications for the selection of individuals in the new CCGs. Compared to PCTs, CCGs are designed to have higher senior representation of primary care practitioners at board level, but will also have more limited representation of those with a Public Health background.
Public Health support will also be changing as the specialty moves away from the NHS into local authorities. Our findings suggest that these changes are likely to affect the types of evidence which these organisations use in their commissioning decisions. New CCGs will have to recognise the need for a variety of different sources of input and evidence in their commissioning plans and ensure that they have the appropriate mix of advice and support to allow them to make the best decisions for their populations.

Conclusions

Variation exists in the sources of evidence used for decision-making in healthcare commissioning in the NHS in England. Professional background, seniority and gender had a significant impact on the choice of evidence used for decision-making across the NHS PCTs included in the study. Individuals trained in Public Health appeared more likely to use external empirical evidence and those at higher pay scales were more likely to use practical evidence when making commissioning decisions. New Clinical Commissioning Groups will need a variety of different professional experts and sources of evidence to ensure that effective commissioning decisions are made for their populations.
References (5 in total)

1.  Evidence-based management: from theory to practice in health care.

Authors:  K Walshe; T G Rundall
Journal:  Milbank Q       Date:  2001       Impact factor: 4.911

2.  Using evidence in the development of local health policies. Some evidence from the United Kingdom.

Authors:  Helen Weatherly; Michael Drummond; Dave Smith
Journal:  Int J Technol Assess Health Care       Date:  2002       Impact factor: 2.188

3.  Practice based commissioning: applying the research evidence.

Authors:  Judith Smith; Jennifer Dixon; Nicholas Mays; Hugh McLeod; Nick Goodwin; Siobhan McClelland; Richard Lewis; Sally Wyke
Journal:  BMJ       Date:  2005-12-10

4.  Why 'knowledge transfer' is misconceived for applied social research.

Authors:  Huw Davies; Sandra Nutley; Isabel Walter
Journal:  J Health Serv Res Policy       Date:  2008-07

5.  Mind the gap: Understanding utilisation of evidence and policy in health care management practice.

Authors:  Emmanouil Gkeredakis; Jacky Swan; John Powell; Davide Nicolini; Harry Scarbrough; Claudia Roginski; Sian Taylor-Phillips; Aileen Clarke
Journal:  J Health Organ Manag       Date:  2011
Cited by (13 in total)

1.  Researching Healthcare Availability for Probation Clients: An Illustration of Methodological Challenges and Lessons in Surveying Organisations.

Authors:  Coral Sirdifield; David Denney; Rebecca Marples; Charlie Brooker
Journal:  Br J Community Justice       Date:  2019-11-20

2.  Relational aspects of building capacity in economic evaluation in an Australian Primary Health Network using an embedded researcher approach.

Authors:  Donella Piper; Christine Jorm; Rick Iedema; Nicholas Goodwin; Andrew Searles; Lisa McFayden
Journal:  BMC Health Serv Res       Date:  2022-06-22       Impact factor: 2.908

3.  Mandatory implementation of NICE Guidelines for the care of bipolar disorder and other conditions in England and Wales.

Authors:  Richard Morriss
Journal:  BMC Med       Date:  2015-09-30       Impact factor: 8.775

4.  Making the case for a fracture liaison service: a qualitative study of the experiences of clinicians and service managers.

Authors:  Sarah Drew; Rachael Gooberman-Hill; Andrew Farmer; Laura Graham; M Kassim Javaid; Cyrus Cooper; Andrew Judge
Journal:  BMC Musculoskelet Disord       Date:  2015-10-01       Impact factor: 2.362

5.  Coproduction in commissioning decisions: is there an association with decision satisfaction for commissioners working in the NHS? A cross-sectional survey 2010/2011.

Authors:  Sian Taylor-Phillips; Aileen Clarke; Amy Grove; Jacky Swan; Helen Parsons; Emmanouil Gkeredakis; Penny Mills; John Powell; Davide Nicolini; Claudia Roginski; Harry Scarbrough
Journal:  BMJ Open       Date:  2014-06-05       Impact factor: 2.692

6.  Effects of a demand-led evidence briefing service on the uptake and use of research evidence by commissioners of health services: protocol for a controlled before and after study.

Authors:  Paul M Wilson; Kate Farley; Carl Thompson; Duncan Chambers; Liz Bickerdike; Ian S Watt; Mark Lambert; Rhiannon Turner
Journal:  Implement Sci       Date:  2015-01-09       Impact factor: 7.327

7.  Evidence use in decision-making on introducing innovations: a systematic scoping review with stakeholder feedback.

Authors:  Simon Turner; Danielle D'Lima; Emma Hudson; Stephen Morris; Jessica Sheringham; Nick Swart; Naomi J Fulop
Journal:  Implement Sci       Date:  2017-12-04       Impact factor: 7.327

8.  Does access to a demand-led evidence briefing service improve uptake and use of research evidence by health service commissioners? A controlled before and after study.

Authors:  Paul M Wilson; Kate Farley; Liz Bickerdike; Alison Booth; Duncan Chambers; Mark Lambert; Carl Thompson; Rhiannon Turner; Ian S Watt
Journal:  Implement Sci       Date:  2017-02-14       Impact factor: 7.327

9.  Evidence based policy making and the 'art' of commissioning - how English healthcare commissioners access and use information and academic research in 'real life' decision-making: an empirical qualitative study.

Authors:  Lesley Wye; Emer Brangan; Ailsa Cameron; John Gabbay; Jonathan H Klein; Catherine Pope
Journal:  BMC Health Serv Res       Date:  2015-09-29       Impact factor: 2.655

10.  Study protocol: DEcisions in health Care to Introduce or Diffuse innovations using Evidence (DECIDE).

Authors:  Simon Turner; Stephen Morris; Jessica Sheringham; Emma Hudson; Naomi J Fulop
Journal:  Implement Sci       Date:  2016-04-05       Impact factor: 7.327

