Literature DB >> 36081595

Benefits of Providing Feedback and Utilisation Metrics to Specialists on Their Participation in eConsult.

Erin Keely1,2,3, Rhea Mitchell1, Sheena Guglani1,4, Douglas Archibald1,2,3,4,5, Amir Afkham5, Clare Liddy1.   

Abstract

Our study evaluates the impact of feedback sent to specialists participating in eConsult services. Specialists from two eConsult services in Ontario, Canada, received feedback on their use of eConsult via bi-annual specialist reports. An 11-item survey was developed to evaluate the impact, content, and distribution process of these specialist reports. We distributed 742 specialist reports in March 2021 and surveyed the specialists in July 2021. Our findings show that specialists largely felt that the feedback received validated their efforts (83%) and that receiving the report made them more likely to continue to participate in the eConsult service (59%). Most did not feel judged (74%) or distressed (79%) by the reports, and 72% said that reporting the median self-reported billing time did not affect their own billing times. Overall, eConsult services can capture, report and aggregate data that are valuable to specialists and useful for Continuing Professional Development. The benefits, and the absence of risk, of implementing this type of feedback should encourage other services to consider similar processes.
© 2022 The Author(s). Published by Informa UK Limited, trading as Taylor & Francis Group.


Keywords:  Electronic consultation; continuing education; evaluation; survey feedback

Year:  2022        PMID: 36081595      PMCID: PMC9448361          DOI: 10.1080/21614083.2022.2116193

Source DB:  PubMed          Journal:  J Eur CME        ISSN: 2161-4083


Introduction

There is increasing expectation that specialists will participate in practice audits and peer review as continuing professional development (CPD) and maintenance of certification (MOC) [1]. Unfortunately, many physicians are unsatisfied and concerned with current MOC activities and their relevance [2]. CPD needs to be grounded in and guided by daily practice and feedback data from performance and practice [3]. Normative feedback about performance relative to peers may be particularly important in new health care delivery models where standards are not clearly established and skills are developed with experience rather than previous training. eConsult services are technology-enhanced systems allowing providers, such as primary care providers (PCPs) or specialists, to request clinical advice from a specialist consultant [4]. eConsult services provide a unique opportunity for direct communication between providers, thus fostering education, professionalism and collegiality [5,6]. They also enable case-based data collection that can be aggregated and may include feedback from requesting providers through surveys or free-text comments. This can be shared with providers as part of quality improvement (QI) and engagement activities. Based in Ontario, Canada, the Ontario eConsult service and the Champlain BASE eConsult service (www.econsultontario.ca) are part of the Ontario eConsult programme. Both include a mandatory survey within their workflow that requesting providers complete before closing cases. To maintain a timely and effective eConsult service, the eConsult Centre of Excellence (COE) distributes bi-annual reports to all qualifying specialists who participated in eConsult services. Our study evaluates the impact and usefulness of the bi-annual specialist reports distributed in March 2021 by collecting feedback on their impact, content, format and distribution process from participating eConsult specialists.
Here, we explore satisfaction with and effectiveness of individualised bi-annual reports as feedback provided to specialist consultants from PCPs during electronic consultations.

Methods

Settings and Participants

This study is set in Ontario, which is divided into 5 distinct and diverse healthcare regions. The federal government and the Ontario Ministry of Health finance public health services, community and hospital care, and most primary care and specialist funding [7]. Specialists participating in the Ontario and Champlain BASE eConsult services are compensated at an hourly rate pro-rated based on self-reported billing time. In December 2020, there were 941 specialists and 3506 requesting providers active (i.e. had participated in 3 or more eConsults within six months) on the two services. From July 2020 to December 2020, an average of 5979 eConsults were completed monthly across both services.

Report Preparation and Distribution

The eConsult COE creates and distributes specialist reports to specialists who provided 5 or more eConsults through either of the two eConsult services within the designated 6-month period (January to June or July to December). Reports are distributed in the third month following the reporting period (i.e. September and March). Reports include metrics and direct feedback from requesting physicians and nurse practitioners on completed eConsults, and compare each specialist's metrics with those of their speciality and of the entire eConsult service (Figure 1).
Figure 1.

An example eConsult Specialist Report.

To create the reports, we analyse raw data from each service to generate metrics for the service as a whole, for each speciality grouping and for each individual specialist (Figure 2). A cross-sectional descriptive analysis is completed for the 6-month period of interest for qualifying specialists. The number of cases is counted for each user, speciality, and service, along with the average time billed, the percentage of eConsults responded to within 7 days, and the results of the close-out survey. Leveraging Microsoft® Power Automate and other connected Microsoft® 365 software, individual metrics are merged into a report template, exported as a PDF and emailed to the appropriate provider. Testing and quality checks are performed before and during the distribution process to ensure that accurate reports are sent to their corresponding providers. This requires approximately 20–30 hours each from three staff members over 3–4 weeks.
Figure 2.

Process for Preparation and Distribution of eConsult Specialist Reports.

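The aggregation step described above can be sketched in a few lines; this is a minimal illustration, and the case-record layout and field names here are assumptions for the sketch, not the services' actual schema.

```python
# Sketch of per-specialist metric aggregation for one reporting period.
# Record layout is illustrative: (specialist, specialty, minutes_billed,
# days_to_respond) for each completed eConsult.
from statistics import median

cases = [
    ("dr_a", "Cardiology", 15, 2),
    ("dr_a", "Cardiology", 20, 9),
    ("dr_b", "Cardiology", 10, 1),
    ("dr_b", "Cardiology", 10, 3),
    ("dr_b", "Cardiology", 25, 5),
]

def specialist_metrics(cases, specialist):
    """Count of cases, median billed minutes, and 7-day responsiveness."""
    own = [c for c in cases if c[0] == specialist]
    return {
        "n_cases": len(own),
        "median_minutes": median(c[2] for c in own),
        "pct_within_7_days": 100 * sum(c[3] <= 7 for c in own) / len(own),
    }

def specialty_median_minutes(cases, specialty):
    """Peer benchmark: median self-reported billing time in the specialty."""
    return median(c[2] for c in cases if c[1] == specialty)

print(specialist_metrics(cases, "dr_a"))
# {'n_cases': 2, 'median_minutes': 17.5, 'pct_within_7_days': 50.0}
print(specialty_median_minutes(cases, "Cardiology"))  # 15
```

The same per-specialist dictionary, alongside the specialty and service-wide benchmarks, would then be merged into the report template for distribution.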

Survey Development and Distribution

An 11-item survey (Table 1) was created by consensus among the eConsult programme clinical leads and management team; it included items on specialists' use of the feedback report, which components they found most useful, the report's perceived value, and its influence on their behaviour. On July 7th, 2021, the survey was sent via an online survey platform to all specialists who had received an individualised eConsult Specialist Report in March 2021 (n = 742); it remained open for 2 weeks, with a reminder email sent to recipients after one week. Surveys were anonymous and did not collect identifying data. We summarised results using descriptive analysis and identified major themes from free-text responses.
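The descriptive analysis of agreement items can be sketched as a simple tally; the response labels match the survey's Likert scale, but the response data here are illustrative only.

```python
# Minimal sketch of the descriptive summary of a Likert item;
# responses are made-up example data, not study results.
from collections import Counter

responses = ["Agree", "Strongly Agree", "Neither Agree nor Disagree",
             "Agree", "Disagree", "Strongly Agree"]

counts = Counter(responses)
n = len(responses)
# Collapse the five-point scale into overall agreement, as in the
# reported percentages (e.g. "83% felt validated").
agree_pct = round(100 * (counts["Agree"] + counts["Strongly Agree"]) / n)
print(agree_pct)  # 67
```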
Table 1.

Survey Questions and Response Options.

(1) How do you use the feedback report? Please check all that apply.
Response options: I do not use it; I glance at it; I make reference to it periodically; I keep a copy for my records; I review in detail; I use it for promotion/hospital privileges; I review and submit to RCPSC for Section 3 eligible activity credits; Other: ______

(2) What do you find to be the most useful components of the report? Please check all that apply.
Response options: My utilization data (i.e. number of eConsults, response rate and self-reported billing time); Impact on Patient Care data (i.e. direct feedback from primary care providers through the close-out survey on the outcome for the patient and the effect on need for referral); Comparison between my data and data from my specialty; Comparison between my data and data from the entire eConsult service; Comparison between data from my specialty and data from the entire eConsult service; Overall eConsult service data; I don't know

(3) Please indicate your level of agreement with the following statements (response options for each: Strongly Disagree; Disagree; Neither Agree nor Disagree; Agree; Strongly Agree):

    a. The reports validate the effort I put into answering eConsults.

    b. The information I receive in these reports is surprising.

    c. The report reduces the amount of time I spend on answering eConsults.

    d. Receiving reports makes me more likely to continue participating in the eConsult service.

    e. Receiving reports makes me feel like I am providing the advice required.

    f. The reports are easy to understand.

    g. The format I receive the report in is acceptable.

    h. I would like these reports to continue to be provided on a regular basis.

(4) Does the reported median billing time for your specialty influence your billing time for eConsults?
Response options: No; Yes, I think I should bill more minutes; Yes, I think I should bill fewer minutes; Unsure. Option to provide additional comments: ______

(5) Do you have any concerns about the reports? Please check all that apply.
Response options: Receiving these reports causes me distress; Receiving these reports makes me feel judged; I am concerned these reports could be used against me by a regulatory body, hospital administration, or other; Other: ______

(6) How often would you like to receive reports?
Response options: Never; Every 6 months; Once a year; Other: ______

(7) Is there other information or data you would like to see included on the reports?
Response options: Free text comment field

(8) Do you have any additional comments regarding the distribution process, content or format of these reports?
Response options: Free text comment field

(9) For what specialty do you provide eConsults?
Response options: List of top 20 specializations from distribution list; Other: ______; Do not wish to disclose

(10) For which service(s) do you provide eConsults? (check all that apply)
Response options: Champlain BASE™ (hosted on HealthWorks); Ontario eConsult Service (hosted on OTNhub); Not sure

(11) Please indicate which Ontario Health region you are located in.
Response options: West; Central; Toronto; East; North; Not sure

Results

Specialist Report Preparation & Distribution

Between March 15th and 31st, 2021, we prepared and sent eConsult specialist reports to all specialists who had provided 5 or more eConsults between July and December 2020 on the Ontario eConsult Service (568 distinct reports sent) and/or the Champlain BASE™ eConsult Service (214 distinct reports sent). The median number of eConsults provided by specialists per speciality over the 6-month reporting period was 21 eConsults (range 5–620).

Survey Results

A total of 244 (33%) recipients completed the survey; 158 (65%) were participants of the Ontario eConsult Service, 49 (20%) were participants of the Champlain BASE™ service, 23 (9%) were participants in both services, and 14 (6%) did not identify the service they participated in. Psychiatry, Neurology, Cardiology, Paediatrics, Endocrinology, Infectious Diseases and Haematology were the highest represented specialisations of respondents (combined 45%). Of all the responses, 123 (50%) specialists stated that they reviewed the report in detail, 78 (32%) said they glanced at it, and only 5 (2%) did not use the reports. Eighty-eight respondents (36%) submitted their review of the report for MOC credits with their regulatory college. Most (n = 177, 73%) found the sections of the report with direct feedback from requesting PCPs on the impact on patient care the most useful component. The reporting of their utilisation metrics (i.e. number of eConsults provided, responsiveness and billing time) and the ability to compare their results with their peers were identified by 142 (58%) and 162 (66%) respondents, respectively, as most useful. Specialists found the reports’ format acceptable (n = 215, 88%) and found them easy to understand (n = 211, 87%). The majority supported the continued distribution of the reports in general (n = 211, 87%), with 172 (71%) satisfied with the bi-annual distribution timeline. Receiving the report made most specialists feel like they were providing the advice required (n = 199, 82%) and more likely to continue participating in the eConsult service (n = 143, 59%). This was highlighted through the free text comments, with one specialist stating, “my participation is predicated on the fact that I feel I am making a positive impact and that I am providing quality service”. Only 15 (6%) specialists found the information in the reports surprising, while 202 (83%) found the reports validated the effort put into answering eConsults. 
One specialist highlighted that “it is definitely motivating to know that [their] responses have met the needs of the referring physicians”. Reporting the specialists' median self-reported billing time and comparing it to peers in their speciality and across the service did not impact billing times for most specialists (n = 176, 72%). One specialist indicated that they “… like to validate that [their] mean billing time is similar to others”. Specialists did not find that the reports made them feel judged (n = 180, 74%) or caused them distress (n = 194, 79%). One specialist indicated that they “… enjoy reading critical reports”, while another stated that “[specialists] should be judged by users, that’s what evaluation means. [They are] ok with that”. Most specialists (n = 183, 76%) were not concerned that the reports would be used against them by a regulatory body, hospital administration, or others.

Evaluation of Specialist Report Preparation & Distribution Process

Based on the support for continuation and positive feedback received through this survey and informed by ad hoc feedback from specialists, process improvements were put in place to ease the logistical challenges and ensure sustainability. To ensure continuous quality improvement, iterative process improvements were implemented, including (i) further automation of document creation and distribution utilising Microsoft® 365 software suite, (ii) addition of staff resourcing and time allocation, (iii) updated and more detailed process documentation, and (iv) improved quality assurance processes to ensure accuracy of reports (Figure 3).
Figure 3.

Distribution Workflow MS Power Automate.


Discussion

Receiving feedback and peer comparisons can be an effective form of CPD activity. Our results show that providing specialists with reports on their performance in eConsult is an acceptable method of delivering feedback. Specialists indicated that they use the reports for Maintenance of Certification (MOC) and value receiving feedback from requesting providers and comparisons to their peers, particularly those in the same speciality. Our study can inform other eConsult services worldwide in adopting this method for augmenting specialist CPD activities. There was no evidence that providing specialists with feedback led to increased perceived threat, negative feelings, or disengagement. There was strong consensus that the reports should continue to be provided, with few recommendations for changes. Despite eConsult services being well developed and available in many jurisdictions, we are not aware of other systematically generated evaluations of specialist reports. Although there is increased awareness of the importance of developing skills in virtual care, including eConsult services, participation in eConsult services is new for most physicians and will not have been part of their training prior to going into practice. Traditionally, audit and feedback processes compare an individual's performance to an established professional standard; because accepted standards are not defined for eConsult services, peer comparisons are used as a surrogate marker. Although feedback from PCPs is an important motivator for specialists participating in eConsult services, it has been associated with negative emotions and risk of disengagement in other settings [8,9]. The data provided must be meaningful and credible for feedback to be engaging [10].
For example, one study of PCPs who completed a QI module and were provided pre- and post-performance feedback noted the importance of accurate data, enhanced detail in the content of feedback, and the ability to customise peer comparison groups so performance could be compared to peers with similar patient populations or practice characteristics [11]. A recent study has shown that feedback comparing physicians to their top-performing peers, using other specialists' ratings, improves performance [12]. This clustered randomised trial included 80 speciality clusters and 214 specialist consultants, and its outcome measures included 1) elicitation of information from primary care practitioners; 2) adherence to institutional clinical guidelines; 3) agreement with peers' medical decision-making; 4) educational value; and 5) relationship building. Rating colleagues' responses and receiving individualised feedback resulted in significant improvements on 3 of the 5 consultation performance dimensions: medical decision-making, educational value, and relationship building. This required a new workflow and manual rating by other specialists (not PCPs). Such studies show that reporting feedback to specialist consultants is critical to improving consult advice and, thus, to more streamlined and efficient patient care. A unique feature of the included eConsult services is the need for the specialist to self-report their billing time. Survey responses indicate that peer comparison of billing time did not alter individual providers' behaviour, and we were pleased to see that the reports caused few negative emotions. Evaluating existing processes and making iterative improvements based on direct provider feedback is paramount to continuous quality improvement. This study allowed process improvements to be implemented, which can help sustain the programme management team's ability to provide these highly valued specialist reports to the eConsult specialists.
Our study is limited by the response rate of 33% and a single, albeit large, geographical area with two provincial services. Our service may be unique in the types of data collected about the services and thus other services may not be able to include all the information our services can include in the reports. We do not have data on whether the reports change specialist behaviour; we only have their perception. This is an area for future research.

Conclusion

eConsult services can capture, report and aggregate data that are valuable to specialists and useful for CPD. The benefits, and the absence of risk, of implementing this type of feedback should encourage other services to consider a similar process.
References

1.  Practicing physicians' needs for assessment and feedback as part of professional development.

Authors:  Joan Sargeant; David Bruce; Craig M Campbell
Journal:  J Contin Educ Health Prof       Date:  2013       Impact factor: 1.355

2.  Perspectives of Champlain BASE Specialist Physicians: Their Motivation, Experiences and Recommendations for Providing eConsultations to Primary Care Providers.

Authors:  Erin Keely; Paul Drosinis; Amir Afkham; Clare Liddy
Journal:  Stud Health Technol Inform       Date:  2015

3.  eConsults and Learning Between Primary Care Providers and Specialists.

Authors:  Clare Liddy; Tala Abu-Hijleh; Justin Joschko; Douglas Archibald; Erin Keely
Journal:  Fam Med       Date:  2019-07       Impact factor: 1.756

4.  Physician Perceptions of Performance Feedback in a Quality Improvement Activity.

Authors:  Aimee R Eden; Elizabeth Hansen; Michael D Hagen; Lars E Peterson
Journal:  Am J Med Qual       Date:  2017-11-01       Impact factor: 1.852

5.  Physician Attitudes About Maintenance of Certification: A Cross-Specialty National Survey.

Authors:  David A Cook; Morris J Blachman; Colin P West; Christopher M Wittich
Journal:  Mayo Clin Proc       Date:  2016-10       Impact factor: 7.616

6.  Effect of Peer Benchmarking on Specialist Electronic Consult Performance in a Los Angeles Safety-Net: a Cluster Randomized Trial.

Authors:  Daniella Meeker; Mark W Friedberg; Tara K Knight; Jason N Doctor; Dina Zein; Nancy Cayasso-McIntosh; Noah J Goldstein; Craig R Fox; Jeffrey A Linder; Stephen D Persell; Stanley Dea; Paul Giboney; Hal F Yee
Journal:  J Gen Intern Med       Date:  2021-09-09       Impact factor: 6.473

7.  Integration of e-consultations into the outpatient care process at a tertiary medical centre.

Authors:  Frederick North; Lorraine D Uthke; Sidna M Tulledge-Scheitel
Journal:  J Telemed Telecare       Date:  2014-05-06       Impact factor: 6.184

8.  Specialist Perspectives on Ontario Provincial Electronic Consultation Services.

Authors:  Erin Keely; Rob Williams; Gilad Epstein; Amir Afkham; Clare Liddy
Journal:  Telemed J E Health       Date:  2018-05-10       Impact factor: 3.536

9.  Model depicting aspects of audit and feedback that impact physicians' acceptance of clinical performance feedback.

Authors:  Velma L Payne; Sylvia J Hysong
Journal:  BMC Health Serv Res       Date:  2016-07-13       Impact factor: 2.655

10.  The Calgary Audit and Feedback Framework: a practical, evidence-informed approach for the design and implementation of socially constructed learning interventions using audit and group feedback.

Authors:  Lara J Cooke; Diane Duncan; Laura Rivera; Shawn K Dowling; Christopher Symonds; Heather Armson
Journal:  Implement Sci       Date:  2018-10-30       Impact factor: 7.327

