Literature DB >> 35560153

Using electronic health records to streamline provider recruitment for implementation science studies.

Chiamaka L Okorie1, Elise Gatsby2, Florian R Schroeck1,3,4,5,6, A Aziz Ould Ismail3, Kristine E Lynch2.   

Abstract

BACKGROUND: Healthcare providers are often targeted as research participants, especially for implementation science studies evaluating provider- or system-level issues. Frequently, provider eligibility is based on both provider and patient factors. Manual chart review and self-report are common provider screening strategies but require substantial time, effort, and resources. The automated use of electronic health record (EHR) data may streamline provider identification for implementation science research. Here, we describe an approach to provider screening for a Veterans Health Administration (VHA)-funded study focused on implementing risk-aligned surveillance for bladder cancer patients.
METHODS: Our goal was to identify providers who had performed ≥10 surveillance cystoscopy procedures among bladder cancer patients in the 12 months prior to recruitment start (January 16, 2020) and who were currently practicing at 1 of 6 pre-specified facilities. Using VHA EHR data (CPT, ICD-10 procedure, and ICD-10 diagnosis codes), we identified cystoscopy procedures performed after an initial bladder cancer diagnosis (i.e., surveillance procedures). Procedures were linked to VHA staff data to determine the provider of record, the number of cystoscopies each provider performed, and their current location of practice. To validate this approach, we performed a chart review of 105 procedures performed by a random sample of identified providers. The proportion of correctly identified procedures was calculated (positive predictive value, PPV), along with binomial 95% confidence intervals (CI).
FINDINGS: We identified 1,917,856 cystoscopies performed on 703,324 patients from October 1, 1999, to January 16, 2020, across the nationwide VHA. Of those procedures, 40% were done on patients with a prior record of bladder cancer and were completed by 15,065 distinct providers. Of those providers, 61 performed ≥10 procedures in the year prior to study recruitment and were currently practicing at 1 of the 6 facilities of interest. The chart review of a random sample of 7 providers found that 101 of 105 procedures (PPV: 96%; 95% CI: 91% to 99%) were surveillance procedures performed by the selected provider on the recorded date.
IMPLICATIONS: These results show that EHR data can be used to accurately identify healthcare providers as research participants when inclusion criteria consist of both patient-level (temporal relationship between diagnosis and procedure) and provider-level (frequency of procedure and location of current practice) factors. As administrative codes and provider identifiers are collected in most, if not all, EHRs for billing purposes, this approach can be translated from provider recruitment in VHA to other healthcare systems. Implementation studies should consider this method of screening providers.

Entities:  

Mesh:

Year:  2022        PMID: 35560153      PMCID: PMC9106149          DOI: 10.1371/journal.pone.0267915

Source DB:  PubMed          Journal:  PLoS One        ISSN: 1932-6203            Impact factor:   3.752


Introduction

The goal of implementation science is to promote the uptake of evidence-based findings in real-world practice [1,2]. Healthcare providers are often targeted as participants in implementation science research to gain insights into implementation barriers and help develop strategies to enhance uptake of guideline-recommended practices. Provider recruitment, although a single milestone in the research process, comprises multiple sub-components, each with its own challenges [3]. One of the first steps of provider recruitment is screening: identifying individuals who meet defined eligibility criteria for study participation. As with patient recruitment strategies for randomized clinical trials, eligibility criteria are necessary to ensure homogeneity across important factors related to the research question. However, with provider recruitment, eligibility is frequently based on both provider and patient factors. Historically, implementation research has relied on two screening strategies to identify potentially eligible providers: 1) self-report of clinical data and 2) manual chart review [4,5]. Although these methods can be effective, self-report hinges on the accuracy of provider recall [6], and manual review often takes longer than anticipated, which can adversely affect research timelines, funds, and processes [5]. Recent literature continues to call for alternative approaches that can improve provider recruitment efficiency throughout the life cycle of an implementation study [7]. Electronic health records (EHRs) contain data that provide one such alternative [8-11]. Automated screening based on EHR data has been used to screen patients for randomized clinical trial recruitment [12]. However, an extension of this approach has not been described for provider screening and recruitment. Since provider data can be linked to patient data, the EHR also provides an avenue to find providers based on patient- and provider-level inclusion/exclusion criteria.
Thus, the widespread adoption and use of EHRs may offer investigators the ability to rapidly identify a precise cohort of providers prescreened against eligibility criteria for participation in implementation studies [13-15]. In this manuscript, we provide an example of streamlined provider screening employing EHR data, using the Veterans Health Administration (VHA) study focused on the Implementation of Risk-aligned Bladder Cancer Surveillance (ImpRaBS) as a use case [16]. The VHA is one of the largest integrated health systems in the United States and was one of the earliest adopters of electronic records [17,18]. Data derived from its in-house EHR system, the Veterans Health Information Systems and Technology Architecture (VistA), are available to researchers via the VHA Corporate Data Warehouse (CDW) [19]. The CDW is a nationwide data repository of prioritized clinical domains including demographic, laboratory, pharmacy, procedure, and vital status data, as well as unstructured clinical text [20]. It is refreshed nightly, enabling near-real-time querying of data for clinical and administrative research purposes. For this project, this rich resource was leveraged to identify providers eligible for inclusion based on patient- and provider-level criteria.

Methods

Use case

The primary aim of ImpRaBS is to develop and subsequently pilot implementation strategies for risk-aligned bladder cancer surveillance [16]. Cystoscopy is a surgical procedure that allows a urologist to visualize abnormalities of the urethra and bladder. It is the most common surgical procedure in the VHA, with 30,000 performed annually [21]. Using cystoscopy, bladder cancer patients undergo frequent surveillance to check for cancer recurrence within the bladder. There is international consensus that surveillance cystoscopy should be aligned with each patient's risk of recurrence and progression [22]; however, prior work suggested that risk-aligned surveillance of patients with non-muscle-invasive bladder cancer was not consistently practiced in VHA [16]. Thus, the goal of this study was to develop implementation strategies to improve risk-aligned surveillance of early-stage bladder cancer. As a first step towards this goal, we evaluated factors influencing guideline-adherent clinical practice through provider interviews [23]. We therefore needed to identify potentially eligible VHA providers for a qualitative evaluation of the barriers and facilitators of risk-aligned bladder cancer surveillance.

Eligibility criteria

Inclusion criteria consisted of patient- and provider-level data extracted from the CDW. Providers were eligible for inclusion if they were currently practicing at 1 of 6 pre-specified VHA Medical Centers and had performed ≥10 cystoscopy procedures among previously diagnosed bladder cancer patients (i.e., surveillance cystoscopy) in the 12 months prior to recruitment start (January 16, 2020). Attending urologists, residents, and Advanced Practice Providers (NPs/PAs) were all considered for inclusion, and a provider was considered currently practicing if at least one of their qualifying procedures was performed at one of the 6 study sites. We identified eligible providers using four sequential steps (see Fig 1):
Fig 1

Flow chart showing the sequential identification process of eligible providers based on inclusion and exclusion criteria.

1. Identification of surveillance cystoscopy procedures using patient-level data.
2. Linkage of provider data (provider ID and current location) to these procedures.
3. Application of eligibility criteria to linked providers.
4. Validation of at least 100 procedures by manual chart review.

Patient level

We used patient-level data to identify cystoscopy procedures. Because administrative coding (i.e., procedure codes) is not specific to surveillance cystoscopy, we developed a simple rule-based phenotype that considered diagnosis and procedure data elements. Surveillance cystoscopy was defined as the occurrence of a cystoscopy procedure at any time after a given patient's incident bladder cancer diagnosis. Bladder cancer was defined based on International Classification of Diseases, Ninth and Tenth Revision diagnosis codes (ICD-9, ICD-10). Cystoscopy was defined using Current Procedural Terminology (CPT) codes and ICD-10 procedure codes (see S1 Appendix for diagnosis and procedure codes). Bladder cancer diagnoses and cystoscopy procedures from October 1, 1999, through January 16, 2020, were considered.
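The rule-based phenotype above amounts to joining each patient's earliest bladder cancer diagnosis to their cystoscopy procedures and keeping procedures dated after the incident diagnosis. A minimal sketch in Python/pandas with illustrative data (the table layout and column names are assumptions, not the actual CDW schema):

```python
import pandas as pd

# Hypothetical diagnosis and procedure extracts; codes and dates are illustrative.
diagnoses = pd.DataFrame({
    "patient_id": [1, 1, 2],
    "dx_code":    ["C67.9", "C67.9", "C67.4"],          # ICD-10 bladder cancer codes
    "dx_date":    pd.to_datetime(["2015-03-01", "2018-06-10", "2019-11-20"]),
})
procedures = pd.DataFrame({
    "patient_id": [1, 1, 2, 3],
    "cpt_code":   ["52000", "52000", "52000", "52000"],  # cystoscopy CPT code
    "proc_date":  pd.to_datetime(["2014-01-05", "2016-02-01", "2020-01-02", "2019-05-05"]),
})

# Incident (first-ever) bladder cancer diagnosis per patient.
incident = (diagnoses.groupby("patient_id", as_index=False)["dx_date"].min()
                     .rename(columns={"dx_date": "incident_dx_date"}))

# A cystoscopy counts as "surveillance" only if it occurs after the incident diagnosis;
# patients with no bladder cancer diagnosis drop out in the inner merge.
surveillance = (procedures.merge(incident, on="patient_id", how="inner")
                          .query("proc_date > incident_dx_date"))
print(len(surveillance))  # → 2
```

Patient 1's 2014 procedure precedes the incident diagnosis and is excluded; patient 3 has no bladder cancer record and is dropped entirely.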

Provider level

All surveillance cystoscopy procedures enumerated from patient-level data as described above were each linked to the provider and clinical facility of record. The provider of record could be an attending urologist, resident, or advanced practice provider. Because VHA provider identifiers are specific to each medical facility, a provider can theoretically have up to 130 distinct identifiers. We established a many-to-one relationship using the pre-transformed VHA data available in the Observational Medical Outcomes Partnership (OMOP) common data model, which permits only one record per provider [24]. For each provider, we then calculated the total number of surveillance procedures performed from October 1, 1999, through January 16, 2020, regardless of where the procedure was performed. Any provider who performed ≥10 cystoscopy procedures in the year prior to recruitment start and who was currently practicing at any of the 6 clinical locations of interest was retained (Fig 1). If any of the ≥10 cystoscopy procedures in the 12 months prior to recruitment start were performed at 1 of the 6 pre-specified VHA Medical Centers, that provider was categorized as currently practicing at that Medical Center. A list of all qualifying providers and up to 15 of their most recent surveillance cystoscopy procedures, including patient identifiers and the procedure CPT, ICD-10 procedure, and ICD-10 diagnosis codes, was generated for validation.
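The provider-level step reduces to a count-and-filter over linked procedure records: tally each provider's surveillance procedures in the 12 months before recruitment start, keep those with ≥10, and require at least one qualifying procedure at a study site. A sketch with hypothetical facility IDs, column names, and data:

```python
import pandas as pd

RECRUIT_START = pd.Timestamp("2020-01-16")
STUDY_SITES = {"523", "660"}  # hypothetical facility (station) identifiers

# Illustrative linked surveillance-procedure records.
surv = pd.DataFrame({
    "provider_id": ["A"] * 12 + ["B"] * 3,
    "facility":    ["523"] * 12 + ["999"] * 3,
    "proc_date":   pd.to_datetime(["2019-06-01"] * 12 + ["2019-07-01"] * 3),
})

# Restrict to the 12 months before recruitment start.
window_start = RECRUIT_START - pd.DateOffset(years=1)
recent = surv[(surv["proc_date"] >= window_start) & (surv["proc_date"] < RECRUIT_START)]

# Criterion 1: ≥10 surveillance procedures in the window.
counts = recent.groupby("provider_id").size()
high_volume = counts[counts >= 10].index

# Criterion 2: "currently practicing" = at least one qualifying procedure at a study site.
at_sites = set(recent.loc[recent["facility"].isin(STUDY_SITES), "provider_id"])

eligible = [p for p in high_volume if p in at_sites]
print(eligible)  # → ['A']
```

Provider B fails on volume (3 procedures) and on location, so only provider A is retained.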

Validation

Each provider from the list generated above was assigned a random number. The list was then sorted by the random numbers in ascending order. Starting with provider number 1, all individual procedures were reviewed chronologically per provider until reaching at least 100 procedures. Given that the first 7 randomly selected providers all had 15 procedures each, 105 procedures were reviewed. Manual chart review by study personnel (a urologist and a study coordinator) served as the reference standard for validation. They manually validated bladder cancer surveillance procedures by referencing procedure notes on or around the date of cystoscopy and determined (1) if a cystoscopy was performed on the date indicated by the CDW data, (2) if it occurred after a bladder cancer diagnosis and thus was a surveillance procedure, (3) whether the provider who performed the procedure was correctly identified, and (4) whether the location of the procedure aligned with the extracted data. The proportion of correctly identified procedures was calculated (Positive Predictive Value (PPV)), along with binomial 95% confidence intervals (CI).
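The validation statistic is a simple proportion with an exact binomial confidence interval. A sketch using SciPy with the counts reported in this study (101 confirmed of 105 reviewed):

```python
from scipy.stats import binomtest

confirmed, reviewed = 101, 105

# Exact (Clopper-Pearson) binomial 95% confidence interval for the proportion.
ci = binomtest(confirmed, reviewed).proportion_ci(confidence_level=0.95, method="exact")

ppv = confirmed / reviewed
print(f"PPV: {ppv:.0%}")                          # → PPV: 96%
print(f"95% CI: {ci.low:.1%} to {ci.high:.1%}")   # paper reports 91% to 99%
```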

Ethics statement

The study was approved by the VA Central Institutional Review Board (No.19-01). Data were not anonymized given the need to link patient and provider records and perform a chart review for validation. This study had a waiver of informed consent and HIPAA authorization related to the data and study activities described in the current manuscript.

Results

Patient level

Using the cystoscopy codes listed in the S1 Appendix, we identified a total of 1,917,856 distinct cystoscopy procedures performed on 703,324 patients in VHA from October 1, 1999, through January 16, 2020. During the same period, 850,305 cystoscopy procedures were performed on the 250,643 patients who had bladder cancer. Of those procedures, 762,158 met the definition of surveillance cystoscopy, that is, the bladder cancer diagnosis preceded the cystoscopy (circle section on the left side of Fig 2).
Fig 2

Pictogram describing the stepwise descent from patient data to the final list of eligible providers who performed qualifying surveillance cystoscopy.

Provider level

A total of 15,065 distinct providers performed the 762,158 procedures determined to be surveillance cystoscopy. Of these, 1,005 providers were located at the 6 pre-specified VHA Medical Centers, and 745 performed ≥10 procedures in the 12 months prior to recruitment start. At the intersection of location and number of surveillance cystoscopies performed in the prior year, only 61 providers performed ≥10 procedures in those 12 months and were practicing at 1 of the 6 pre-specified facilities. These 61 providers were considered eligible for recruitment (Fig 2); on average, each saw 31.4 patients and performed 34.1 procedures. Table 1 details the number of procedures and unique patients (minimum, median, maximum, average, and SD) for each selected provider group before the final eligibility criteria were applied.
Table 1

Minimum, median, and maximum number of procedures and unique patients for selected provider groups, with averages and standard deviations (SD) and the applicable procedure time window.

Providers who performed surveillance cystoscopy (N = 15,065; procedures 1999 to 01/16/2020)
    Patients:    minimum 1.0,  median 2.0,  maximum 1,212.0, average 28.9, SD 61.8
    Procedures:  minimum 1.0,  median 2.0,  maximum 4,205.0, average 46.9, SD 161.2
Providers practicing at any time at any of the 6 stations (N = 1,005; procedures 1999 to 01/16/2020)
    Patients:    minimum 1.0,  median 4.5,  maximum 495.0,   average 30.8, SD 52.2
    Procedures:  minimum 1.0,  median 5.0,  maximum 1,791.0, average 42.7, SD 115.3
Providers who performed ≥10 procedures in the last 12 months (N = 745; procedures 01/16/2019 to 01/16/2020)
    Patients:    minimum 7.0,  median 31.0, maximum 329.0,   average 42.4, SD 36.1
    Procedures:  minimum 10.0, median 33.0, maximum 405.0,   average 51.1, SD 49.5
Providers who performed ≥10 procedures in the last 12 months at the 6 stations (N = 61; procedures 01/16/2019 to 01/16/2020)
    Patients:    minimum 9.0,  median 27.0, maximum 82.0,    average 31.4, SD 18.2
    Procedures:  minimum 10.0, median 27.0, maximum 97.0,    average 34.1, SD 21.0
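Summaries like those in Table 1 can be produced with a single aggregation over per-provider tallies; a sketch with made-up numbers (the column names are assumptions):

```python
import pandas as pd

# Illustrative per-provider tallies; in the study these come from linked CDW records.
per_provider = pd.DataFrame({
    "provider_id": ["A", "B", "C", "D"],
    "patients":    [1, 2, 31, 100],
    "procedures":  [1, 2, 33, 150],
})

# One row per summary statistic, one column per measure.
summary = per_provider[["patients", "procedures"]].agg(
    ["min", "median", "max", "mean", "std"]
)
print(summary)
```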

Results of validation

According to the administrative data, 105 cystoscopies performed by 7 providers were reviewed. Of these, 101 of 105 surveillance procedures (PPV: 96%; 95% CI: 91% to 99%) were performed by the selected provider on the given date. Thus, all 7 providers met inclusion criteria with ≥10 procedures performed in the last 12 months, for a PPV of 100% at the provider level. For the 4 surveillance procedures not confirmed by chart review, the reason was the absence of a procedure note; thus, it remained unclear whether the procedure was performed.

Discussion

We found that EHR data can be used to accurately identify healthcare providers who meet patient- and provider-level inclusion criteria, providing a practical approach for implementation research. Starting with almost 2 million cystoscopy procedures and ~15,000 providers, we leveraged patient- and provider-level EHR data to identify 61 providers who met our predefined inclusion criteria. Previous studies have investigated the use of EHRs for screening patients for recruitment as research participants [25]. For patient recruitment, EHRs have effectively reduced both the time and cost of recruiting patients for randomized clinical trials; study timelines were consequently accelerated, and research investigators could focus their efforts on other aspects of the study [12,25]. Despite these described advantages for patient screening, EHR use in provider screening has not been described. Although EHRs have recently been reported as a major source of physician burnout [26], they also show great potential benefits for clinical and implementation science research. In this study, we extrapolated methods previously used for EHR-based patient screening to provider screening for study inclusion. Our study demonstrates the practicality of EHR-based research for simple procedures or well-described diseases. Both EHR vendors and coding authorities could be valuable stakeholders in expanding and improving the quality of EHR-based research.

Self-report and manual chart review can also be effective screening methods and could alternatively have been used in our study instead of the automated EHR approach. However, both come with limitations (Fig 3). Employing the self-report approach would have required us to identify and contact all currently practicing providers who perform cystoscopies at the six VHA facilities of interest.
These providers would then have had to self-report how many surveillance procedures they performed in the previous 12 months, a method subject to recall bias [6]. As providers may perform many dozens or even hundreds of cystoscopy procedures per year on varying patients, memory decay is expected. The manual chart review approach would have been similarly cumbersome. With this approach, using clinic records at the specified VHA facilities, we would have had to identify patients who underwent a cystoscopy procedure within the last 12 months. In addition, we would have needed to find the subset of those patients whose cystoscopy was for surveillance of bladder cancer. Within that cohort, we could then identify the performing, and thus eligible, providers (Fig 3). This approach would be time-consuming and subject to variation between chart reviewers/study coordinators [5]. The limitations of these established approaches are why an EHR-facilitated approach to provider eligibility screening may be preferred [27,28].
Fig 3

Flow chart of alternative approaches for identifying eligible providers: A) Provider self-reported B) Manual chart review.

An additional strength of the EHR-based selection approach is its flexibility in evaluating granular inclusion criteria dictated by the study aim. In this study, providers were eligible if they performed ≥10 procedures in the previous year, regardless of how many unique patients were involved (Table 1). Additional patient- or provider-specific requirements could easily be incorporated into the coding process. A final strength of this approach is its generalizability. The current study used VHA EHR data. While not all healthcare systems have a vast EHR-based data resource [29], administrative codes and provider identifiers are collected in the majority of EHRs for billing purposes. As such, our approach can likely be translated to provider recruitment in other healthcare systems.

However, there are also limitations to the EHR approach that warrant discussion. One limitation is the misclassification of data elements due to coding errors. We observed 4 false-positive surveillance procedures, each due to a missing procedure note, leaving it unclear whether the procedure happened. It is possible that providers forgot to enter the note or that a procedure was erroneously coded although it was never done. Administrative coding (e.g., ICD, CPT-4, HCPCS) is imperfect, and miscoding or non-specific coding may limit its utility in certain research settings, including EHR-facilitated provider recruitment. For example, codes may not accurately reflect a patient's underlying disease or what occurred in clinical practice. In urology specifically, certain diseases such as pyonephrosis and chronic testicular pain have non-specific diagnosis codes [30,31]. Codes for these diseases may be clinically accurate but lack the granularity required to identify a specific subset of patients.
In the present study of surveillance cystoscopy for bladder cancer patients, coding was sufficient to identify qualifying providers. However, structured data may not capture other clinical domains as adequately or reliably, and manual chart review, alone or as a supplement to EHR-based selection, may be needed to overcome this limitation. A second limitation of this study is the somewhat limited validation, given that there was no gold-standard approach to identify surveillance cystoscopy or eligible providers. Thus, we were able to assess PPV but unable to assess the sensitivity of the EHR-based approach for identifying surveillance cystoscopy procedures or providers performing surveillance cystoscopy. As such, our approach may have missed some qualifying procedures or providers. However, both misclassification and unknown sensitivity were less of a concern given that cystoscopy is a common procedure at the VHA; a few missed or misclassified procedures would likely not have affected a provider's eligibility. We acknowledge that for less common procedures, sensitivity would be a more important metric, as even a small amount of misclassification could make a provider erroneously eligible or ineligible.

In conclusion, the EHR-based screening model appreciably simplifies identification of eligible providers for research investigations compared to alternative methods like manual chart review or self-report. Our EHR-based screening approach can likely be adapted for use in any healthcare system with an established EHR. Given the extensive array of information collected in EHRs (diagnoses, medications, clinical notes, laboratory tests, consults, etc.), there are many ways research investigators can use EHR-based screening to identify providers who meet defined inclusion criteria. We suggest researchers seriously consider EHR-based approaches to provider eligibility screening for their studies.

Cystoscopy procedures based on International Classification of Diseases, Ninth and Tenth Revision procedure codes and Current Procedural Terminology (CPT) codes.

Bladder cancer diagnoses based on the International Classification of Diseases, Ninth and Tenth Revision diagnosis codes (ICD-9, ICD-10).
PONE-D-21-36556
Using Electronic Health Records to Streamline Provider Recruitment for Implementation Science Studies
PLOS ONE Dear Dr. Okorie , Thank you for submitting your manuscript to PLOS ONE. After careful consideration, we feel that it has merit but does not fully meet PLOS ONE’s publication criteria as it currently stands. Therefore, we invite you to submit a revised version of the manuscript that addresses the points raised during the review process.
Please submit your revised manuscript by May, 7 2022. If you will need more time than this to complete your revisions, please reply to this message or contact the journal office at plosone@plos.org. When you're ready to submit your revision, log on to https://www.editorialmanager.com/pone/ and select the 'Submissions Needing Revision' folder to locate your manuscript file. Please include the following items when submitting your revised manuscript:
A rebuttal letter that responds to each point raised by the academic editor and reviewer(s). You should upload this letter as a separate file labeled 'Response to Reviewers'. A marked-up copy of your manuscript that highlights changes made to the original version. You should upload this as a separate file labeled 'Revised Manuscript with Track Changes'. An unmarked version of your revised paper without tracked changes. You should upload this as a separate file labeled 'Manuscript'. If you would like to make changes to your financial disclosure, please include your updated statement in your cover letter. Guidelines for resubmitting your figure files are available below the reviewer comments at the end of this letter. If applicable, we recommend that you deposit your laboratory protocols in protocols.io to enhance the reproducibility of your results. Protocols.io assigns your protocol its own identifier (DOI) so that it can be cited independently in the future. For instructions see: https://journals.plos.org/plosone/s/submission-guidelines#loc-laboratory-protocols. Additionally, PLOS ONE offers an option for publishing peer-reviewed Lab Protocol articles, which describe protocols hosted on protocols.io. Read more information on sharing protocols at https://plos.org/protocols?utm_medium=editorial-email&utm_source=authorletters&utm_campaign=protocols. We look forward to receiving your revised manuscript. Kind regards, Beatrice Nardone Academic Editor PLOS ONE Journal Requirements: When submitting your revision, we need you to address these additional requirements. 1. Please ensure that your manuscript meets PLOS ONE's style requirements, including those for file naming. 
The PLOS ONE style templates can be found at https://journals.plos.org/plosone/s/file?id=wjVg/PLOSOne_formatting_sample_main_body.pdf and https://journals.plos.org/plosone/s/file?id=ba62/PLOSOne_formatting_sample_title_authors_affiliations.pdf Please review your reference list to ensure that it is complete and correct. If you have cited papers that have been retracted, please include the rationale for doing so in the manuscript text, or remove these references and replace them with relevant current references. Any changes to the reference list should be mentioned in the rebuttal letter that accompanies your revised manuscript. If you need to cite a retracted article, indicate the article’s retracted status in the References list and also include a citation and full reference for the retraction notice. [Note: HTML markup is below. Please do not edit.] Reviewers' comments: Reviewer's Responses to Questions Comments to the Author 1. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions. Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented. Reviewer #1: Yes Reviewer #2: Yes Reviewer #3: Yes ********** 2. Has the statistical analysis been performed appropriately and rigorously? Reviewer #1: Yes Reviewer #2: I Don't Know Reviewer #3: Yes ********** 3. Have the authors made all data underlying the findings in their manuscript fully available? The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. 
For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified. Reviewer #1: Yes Reviewer #2: Yes Reviewer #3: No ********** 4. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here. Reviewer #1: Yes Reviewer #2: Yes Reviewer #3: Yes ********** 5. Review Comments to the Author Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters) Reviewer #1: There is clear methodology in the method section and well description of the topic in the introduction part result is very consistent with discussion part I recommend the author to add some strength and limitation of the study Reviewer #2: This is a well written study which is in line with the new trend in the researchers and what EMR providers promotes in their marketing. I would suggest adding on the discussion that despite the fact that EMR is one of the major sources of physicians burnout, it has its own benefits such as the help in the research projects. I would also emphasize more on the coding errors and also the fact that coding systems have their own limitations and therefore using EMR based research may not be always ideal. 
In urology for example, there are several diseases which you may not be able to find an appropriate code for them as a diagnostic code, and there are also several codes for a single disease which can be misleading when it comes to conducting a research project on a disease or a procedure. I believe your study can be considered as an example of practicality of research based on EMR systems for simple procedures or some clearly defined and understood diseases at this stage, and both EMR companies and coding authorities should help with the improvement of the quality of these types of the researches which will eventually be the main source of clinical research in near future. Reviewer #3: Okorie et al evaluated the effectiveness of using EHR to identify potential healthcare providers who had performed >=10 surveillance cystoscopy for bladder cancer patients within 12 months prior to the recruitment and were practicing at specified facilities of Veteran Affairs. They found records from EHR had similar performance with usual chart review with a positive predictive value of 96% for correctly identify surveillance cystoscopy procedures. Overall, the study is rigorously conducted, and the paper provides new insights for finding eligible providers for multi-center clinical research. Comments: 1. Practicing urologists for surveillance cystoscopy: The authors found sixty-one out of 1,005 (6%) current practicing providers that met the inclusion criteria. When limiting current practicing providers to urologists, a different picture might be shown. The VA healthcare provider website (https://www.accesstocare.va.gov/ourproviders ) lists 65, 59, and 46 physicians in surgery practicing at White River Junction, Salt Lake City, and Richard L. Roudebush (facilities listed in Acknowledgements). It is expected that the total number of urologists will be less. So, how is the ratio of sixty-one to all current practicing urologists at specified facilities? 2. 
Procedures and patients: despite the same statistics, performing the procedure ten times for one patient differs from that one time for ten patients. To further demonstrate the superiority of EHR-based provider selection, please provide a table detailing the number (median and interquartile range) of procedures and unique patients for selected provider groups listed in Figure 2. ********** 6. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose “no”, your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy. Reviewer #1: Yes: Mohammedjud Hassen Ahmed Reviewer #2: No Reviewer #3: No [NOTE: If reviewer comments were submitted as an attachment file, they will be attached to this email and accessible via the submission site. Please log into your account, locate the manuscript record, and check for the action link "View Attachments". If this link does not appear, there are no attachment files.] While revising your submission, please upload your figure files to the Preflight Analysis and Conversion Engine (PACE) digital diagnostic tool, https://pacev2.apexcovantage.com/. PACE helps ensure that figures meet PLOS requirements. To use PACE, you must first register as a user. Registration is free. Then, login and navigate to the UPLOAD tab, where you will find detailed instructions on how to use the tool. If you encounter any issues or have any questions when using PACE, please email PLOS at figures@plos.org. Please note that Supporting Information files do not need this step.
6 Apr 2022

Reviewer #1: Mohammedjud Hassen Ahmed attests that our manuscript is technically sound and presented in an intelligible fashion, with data that support the conclusions, using appropriate and rigorous statistical analysis. Dr. Hassen Ahmed also accepts our data sharing restriction.

Author Response: Thank you very much.

Comment 1. There is a clear methodology in the methods section and a good description of the topic in the introduction. The results are consistent with the discussion.

Author Response: Thank you, Dr. Hassen Ahmed.

Comment 2. I recommend the authors add some strengths and limitations of the study.

Author Response: Thank you for this suggestion. We agree that an additional discussion of the strengths and limitations of the study is needed, and this was also suggested by the other two reviewers. We have provided an additional limitation and three study strengths. The additional statements, with supporting references for each added strength and limitation, are below.

Strengths:

1) We acknowledge the practicality of EHR data as an avenue for major clinical research in the second paragraph of the discussion section (page 8, lines 292-294): "Our study demonstrates the practicality of research based on the EHR for simple procedures or well described diseases. Both EHR vendors and coding authorities could be valuable stakeholders in the expansion and improvement of the quality of EHR based research."

2) We demonstrated the utility of EHR-driven selection by showing that the EHR helps distinguish between patient-level and procedure-level data. The statement below can be found in the fourth paragraph of the discussion section (page 8, lines 312-315); see the results section for Table 1. "An additional strength of the EHR-based selection approach is its flexibility in evaluating granular inclusion criteria dictated by the study aim.
In this study, providers were eligible if they performed at least 10 procedures in the previous year, regardless of how many unique patients were involved (Table 1). Additional patient- or provider-specific requirements could easily be incorporated into the coding process."

3) We discussed the generalizability of this EHR-based approach. The statement below can be found in the fourth paragraph of the discussion section (page 8, lines 315-319): "A final strength of this approach is its generalizability. The current study used VHA EHR data. While not all healthcare systems have a vast EHR-based data resource,[29] administrative codes and provider identifiers are collected in the majority of EHRs for billing purposes. As such, our approach can likely be translated to provider recruitment in other healthcare systems."

29. Velarde KE, Romesser JM, Johnson MR, Clegg DO, Efimova O, Oostema SJ, et al. An initiative using informatics to facilitate clinical research planning and recruitment in the VA health care system. Contemporary Clinical Trials Communications. 2018;11:107-112. doi:10.1016/j.conctc.2018.07.001

Limitations:

1) We added a statement on coding errors as a limitation of EHR-based studies in the fifth paragraph of the discussion section, including a reference on coding errors in urology (page 9, lines 331-340): "Administrative coding (e.g., ICD, CPT-4, HCPCS) is imperfect, and miscoding or non-specific coding may limit its utility in certain research settings, including EHR-facilitated provider recruitment. For example, codes may not accurately reflect a patient's underlying disease or what occurred in clinical practice. In urology specifically, there are certain diseases, such as pyonephrosis and chronic testicular pain, that have non-specific diagnosis codes.[30,31] Codes for these diseases may be clinically accurate but lack the granularity required to identify a specific subset of patients.
In this present study of surveillance cystoscopy for bladder cancer patients, coding was sufficient to identify qualifying providers. However, structured data may not adequately or reliably capture other clinical domains, and manual chart review, alone or as a supplement to EHR-based selection, may be needed to overcome this limitation."

30. Ballaro A, Oliver S, Emberton M. Do we do what they say we do? Coding errors in urology. BJU International. 2000;85:389-391. doi:10.1046/J.1464-410X.2000.00471.X
31. Khwaja HA, Syed H, Cranston DW. Coding errors: a comparative analysis of hospital and prospectively collected departmental data. BJU International. 2002;89:178-180. doi:10.1046/J.1464-4096.2001.01428.X

Reviewer #2: Has the statistical analysis been performed appropriately and rigorously?
Reviewer #2: I Don't Know

Author Response: We are unsure what specific statistical analysis Reviewer #2 is uncertain about. We are happy to elaborate on our statistical analysis with further comments. To the best of our current knowledge, all statistical analyses were performed appropriately and rigorously.

Comment 1. This is a well written study which is in line with the new trend among researchers and with what EMR providers promote in their marketing.

Author Response: We thank the reviewer for their time and helpful comments.

Comment 2. I would suggest adding to the discussion that, despite the fact that the EMR is one of the major sources of physician burnout, it also has benefits, such as helping with research projects.

Author Response: Thank you for this suggestion. We agree with your comment and have incorporated your suggestion into the second paragraph of the discussion section (page 8, lines 289-290). The added statement is below, including a reference on the EHR contributing to physician burnout.
"Although EHRs have recently been reported as a major source of physician burnout,[26] they also show great potential benefits for clinical and implementation science research studies."

26. Babbott S, Manwell LB, Brown R, Montague E, Williams E, Schwartz M, et al. Electronic medical records and physician stress in primary care: Results from the MEMO Study. Journal of the American Medical Informatics Association. 2014;21. doi:10.1136/amiajnl-2013-001875

Comment 3. I would also emphasize the coding errors and the fact that coding systems have their own limitations, and that EMR-based research may therefore not always be ideal. In urology, for example, there are several diseases for which an appropriate diagnostic code cannot be found, and there are also several codes for a single disease, which can be misleading when conducting research on a disease or a procedure.

Author Response: Thank you for this comment. We have added a description of this limitation to the fifth paragraph of the discussion section (page 9, lines 331-340). The added statement is below, including a reference on coding errors in urology: "Administrative coding (e.g., ICD, CPT-4, HCPCS) is imperfect, and miscoding or non-specific coding may limit its utility in certain research settings, including EHR-facilitated provider recruitment. For example, codes may not accurately reflect a patient's underlying disease or what occurred in clinical practice. In urology specifically, there are certain diseases, such as pyonephrosis and chronic testicular pain, that have non-specific diagnosis codes.[30,31] Codes for these diseases may be clinically accurate but lack the granularity required to identify a specific subset of patients. In this present study of surveillance cystoscopy for bladder cancer patients, coding was sufficient to identify qualifying providers.
However, structured data may not adequately or reliably capture other clinical domains, and manual chart review, alone or as a supplement to EHR-based selection, may be needed to overcome this limitation."

30. Ballaro A, Oliver S, Emberton M. Do we do what they say we do? Coding errors in urology. BJU International. 2000;85:389-391. doi:10.1046/J.1464-410X.2000.00471.X
31. Khwaja HA, Syed H, Cranston DW. Coding errors: a comparative analysis of hospital and prospectively collected departmental data. BJU International. 2002;89:178-180. doi:10.1046/J.1464-4096.2001.01428.X

Comment 4. I believe your study can be considered an example of the practicality of EMR-based research for simple procedures or clearly defined and well-understood diseases at this stage, and both EMR companies and coding authorities should help improve the quality of this type of research, which will eventually become a main source of clinical research in the near future.

Author Response: Thank you for this comment. We appreciated it and have included it in the second paragraph of the discussion (page 8, lines 292-294). The summarized statement is below: "Our study demonstrates the practicality of research based on the EHR for simple procedures or well described diseases. Both EHR vendors and coding authorities could be valuable stakeholders in the expansion and improvement of the quality of EHR based research."

Reviewer #3: Have the authors made all data underlying the findings in their manuscript fully available?
Reviewer #3: No

Author Response: We unfortunately cannot make the data publicly available, as they contain potentially identifying and sensitive patient information. Data in the Department of Veterans Affairs Corporate Data Warehouse are collected for clinical purposes as part of the patient medical record. They contain potentially identifying and sensitive patient information and, therefore, cannot be shared.
They can be accessed by any VA researcher through the Institutional Review Board process. Interested researchers can direct data access requests to the director of the Veteran's IRB of Northern New England, 215 N Main Street, White River Junction, VT 05009, phone 802-295-9363, email: vhawrjresearchtask@va.gov.

Comment 1. Okorie et al evaluated the effectiveness of using the EHR to identify potential healthcare providers who had performed ≥10 surveillance cystoscopies for bladder cancer patients within the 12 months prior to recruitment and were practicing at specified Veterans Affairs facilities. They found that EHR records performed similarly to usual chart review, with a positive predictive value of 96% for correctly identifying surveillance cystoscopy procedures. Overall, the study is rigorously conducted, and the paper provides new insights for finding eligible providers for multi-center clinical research.

Author Response: Thank you for your time and helpful comments.

Comment 2. Practicing urologists for surveillance cystoscopy: The authors found sixty-one out of 1,005 (6%) currently practicing providers that met the inclusion criteria. When limiting currently practicing providers to urologists, a different picture might emerge. The VA healthcare provider website (https://www.accesstocare.va.gov/ourproviders) lists 65, 59, and 46 physicians in surgery practicing at White River Junction, Salt Lake City, and Richard L. Roudebush (facilities listed in the Acknowledgements). The total number of urologists is expected to be smaller. So, what is the ratio of sixty-one to all currently practicing urologists at the specified facilities?

Author Response: Thank you for raising this concern. We are grateful for the level of investigation put in by this reviewer. We understand how 61 can exceed expectations, but it is an accurate representation of the sites for the following reasons:

• We assessed 6 total facilities, of which only three were highlighted in the Acknowledgements.
• In addition to attending urologists, residents and Advanced Practice Providers (NP/PA) were considered for inclusion, and were considered currently practicing if at least one of their qualifying procedures was performed at one of the 6 study sites. Given the high turnover in these positions, residents in particular, these providers add substantially to the reported number.

• We anticipated readers having the same concern as this reviewer and have added language to the eligibility criteria in the methods section (page 4, lines 148-151) to clarify that providers included, but were not limited to, attending urologists. The added statement is below: "Attending urologists, residents, and Advanced Practice Providers (NP/PA) were all considered for inclusion and considered currently practicing if at least one of their qualifying procedures was performed at one of the 6 study sites."

• Lastly, while investigating your comment, we noticed a typographic error in Figure 2 that may have caused confusion. Figure 2 previously stated that 1,005 providers who performed at least 1 cystoscopy were currently practicing at one of the 6 stations. It now correctly states that 1,005 providers performed at least 1 cystoscopy at one of the 6 stations at any point in time (1999-2020).

Comment 3. Procedures and patients: Despite yielding the same statistics, performing the procedure ten times for one patient differs from performing it once for ten patients. To further demonstrate the superiority of EHR-based provider selection, please provide a table detailing the number (median and interquartile range) of procedures and unique patients for the selected provider groups listed in Figure 2.

Author Response: We agree with the reviewer's assessment that performing a cystoscopy ten times on the same patient is quite different from performing it once on ten patients. Although this distinction did not matter for our study inclusion criteria, it could for a different study.
We added this as an additional strength of EHR-based provider selection in the fourth paragraph of the discussion section (page 8, lines 312-315). The added statement is below: "An additional strength of the EHR-based selection approach is its flexibility in evaluating granular inclusion criteria dictated by the study aim. In this study, providers were eligible if they performed at least 10 procedures in the previous year, regardless of how many unique patients were involved (Table 1). Additional patient- or provider-specific requirements could easily be incorporated into the coding process."

Additionally, as requested by the reviewer, we have provided a table (Table 1) detailing the number (median and interquartile range) of procedures and unique patients for selected provider groups in the results section under the provider level, as it further demonstrates the utility of EHR-driven selection. The added statement is below (results section: provider level; page 4, lines 232-235): "These 61 providers were considered eligible for recruitment (Figure 2), with an average of 31.4 patients seen by these eligible providers and an average of 34.1 procedures performed. Below we provide a table detailing the number (median and interquartile range) of procedures and unique patients for each selected provider group before final eligibility criteria were met (Table 1)."

Table 1: The minimum, median, and maximum number of procedures and unique patients for selected provider groups, with means and SDs, including the specific procedure timeline.

Submitted filename: Response to Reviewers.docx

19 Apr 2022

Using Electronic Health Records to Streamline Provider Recruitment for Implementation Science Studies
PONE-D-21-36556R1

Dear Dr. Okorie,

We're pleased to inform you that your manuscript has been judged scientifically suitable for publication and will be formally accepted for publication once it meets all outstanding technical requirements.
Within one week, you'll receive an e-mail detailing the required amendments. When these have been addressed, you'll receive a formal acceptance letter and your manuscript will be scheduled for publication. An invoice for payment will follow shortly after the formal acceptance.

To ensure an efficient process, please log into Editorial Manager at http://www.editorialmanager.com/pone/, click the 'Update My Information' link at the top of the page, and double check that your user information is up-to-date. If you have any billing related questions, please contact our Author Billing department directly at authorbilling@plos.org.

If your institution or institutions have a press office, please notify them about your upcoming paper to help maximize its impact. If they'll be preparing press materials, please inform our press team as soon as possible -- no later than 48 hours after receiving the formal acceptance. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information, please contact onepress@plos.org.

Kind regards,
Beatrice Nardone
Academic Editor
PLOS ONE

Additional Editor Comments (optional):

Reviewers' comments:

Reviewer's Responses to Questions

Comments to the Author

1. If the authors have adequately addressed your comments raised in a previous round of review and you feel that this manuscript is now acceptable for publication, you may indicate that here to bypass the "Comments to the Author" section, enter your conflict of interest statement in the "Confidential to Editor" section, and submit your "Accept" recommendation.

Reviewer #2: All comments have been addressed
Reviewer #3: All comments have been addressed

**********

2. Is the manuscript technically sound, and do the data support the conclusions? The manuscript must describe a technically sound piece of scientific research with data that supports the conclusions.
Experiments must have been conducted rigorously, with appropriate controls, replication, and sample sizes. The conclusions must be drawn appropriately based on the data presented.

Reviewer #2: Yes
Reviewer #3: (No Response)

**********

3. Has the statistical analysis been performed appropriately and rigorously?

Reviewer #2: (No Response)
Reviewer #3: (No Response)

**********

4. Have the authors made all data underlying the findings in their manuscript fully available?

The PLOS Data policy requires authors to make all data underlying the findings described in their manuscript fully available without restriction, with rare exception (please refer to the Data Availability Statement in the manuscript PDF file). The data should be provided as part of the manuscript or its supporting information, or deposited to a public repository. For example, in addition to summary statistics, the data points behind means, medians and variance measures should be available. If there are restrictions on publicly sharing data—e.g. participant privacy or use of data from a third party—those must be specified.

Reviewer #2: Yes
Reviewer #3: (No Response)

**********

5. Is the manuscript presented in an intelligible fashion and written in standard English? PLOS ONE does not copyedit accepted manuscripts, so the language in submitted articles must be clear, correct, and unambiguous. Any typographical or grammatical errors should be corrected at revision, so please note any specific errors here.

Reviewer #2: Yes
Reviewer #3: (No Response)

**********

6. Review Comments to the Author

Please use the space provided to explain your answers to the questions above. You may also include additional comments for the author, including concerns about dual publication, research ethics, or publication ethics. (Please upload your review as an attachment if it exceeds 20,000 characters)

Reviewer #2: Authors have addressed all of my comments.
Regarding the statistical analysis aspect of the manuscript, I would not consider myself competent enough to investigate the details; the editor will decide.

Reviewer #3: (No Response)

**********

7. PLOS authors have the option to publish the peer review history of their article (what does this mean?). If published, this will include your full peer review and any attached files. If you choose "no", your identity will remain anonymous but your review may still be made public. Do you want your identity to be public for this peer review? For information about this choice, including consent withdrawal, please see our Privacy Policy.

Reviewer #2: No
Reviewer #3: No

6 May 2022

PONE-D-21-36556R1
Using Electronic Health Records to Streamline Provider Recruitment for Implementation Science Studies.

Dear Dr. Okorie:

I'm pleased to inform you that your manuscript has been deemed suitable for publication in PLOS ONE. Congratulations! Your manuscript is now with our production department.

If your institution or institutions have a press office, please let them know about your upcoming paper now to help maximize its impact. If they'll be preparing press materials, please inform our press team within the next 48 hours. Your manuscript will remain under strict press embargo until 2 pm Eastern Time on the date of publication. For more information please contact onepress@plos.org.

If we can help with anything else, please email us at plosone@plos.org. Thank you for submitting your work to PLOS ONE and supporting open access.

Kind regards,
PLOS ONE Editorial Office Staff
on behalf of Dr. Beatrice Nardone
Academic Editor
PLOS ONE
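One quantitative detail that runs through the review history above is the chart-review validation: 101 of 105 procedures were confirmed, giving a PPV of 96% with a binomial 95% CI of 91% to 99%. As an illustration only (the article reports a "binomial 95% confidence interval" without naming the method, so an exact Clopper-Pearson interval is assumed here), a minimal, standard-library-only Python sketch of that calculation:

```python
import math

def binom_tail_ge(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

def clopper_pearson(k, n, alpha=0.05, tol=1e-10):
    """Exact (Clopper-Pearson) binomial CI via bisection on the binomial tails."""
    if k == 0:
        lower = 0.0
    else:
        a, b = 0.0, k / n
        while b - a > tol:
            m = (a + b) / 2
            # lower limit: the p at which P(X >= k | p) equals alpha/2
            if binom_tail_ge(k, n, m) < alpha / 2:
                a = m
            else:
                b = m
        lower = a
    if k == n:
        upper = 1.0
    else:
        a, b = k / n, 1.0
        while b - a > tol:
            m = (a + b) / 2
            # upper limit: the p at which P(X <= k | p) equals alpha/2
            if 1 - binom_tail_ge(k + 1, n, m) < alpha / 2:
                b = m
            else:
                a = m
        upper = b
    return lower, upper

k, n = 101, 105  # confirmed procedures / procedures chart-reviewed
ppv = k / n      # ~0.96
lo, hi = clopper_pearson(k, n)
print(f"PPV = {ppv:.1%}, 95% CI: {lo:.1%} to {hi:.1%}")
```

The bisection avoids any dependency on a beta-quantile routine; a library call such as `statsmodels.stats.proportion.proportion_confint(101, 105, method="beta")` would return the same exact interval.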
References (27 in total; first 10 shown)

1.  Do we do what they say we do? Coding errors in urology.

Authors:  A Ballaro; S Oliver; M Emberton
Journal:  BJU Int       Date:  2000-03       Impact factor: 5.588

2.  Efficacy and cost-effectiveness of an automated screening algorithm in an inpatient clinical trial.

Authors:  Catherine C Beauharnais; Mary E Larkin; Adrian H Zai; Emily C Boykin; Jennifer Luttrell; Deborah J Wexler
Journal:  Clin Trials       Date:  2012-02-03       Impact factor: 2.486

3.  [Review] Getting a clinical innovation into practice: An introduction to implementation strategies.

Authors:  JoAnn E Kirchner; Jeffrey L Smith; Byron J Powell; Thomas J Waltz; Enola K Proctor
Journal:  Psychiatry Res       Date:  2019-07-02       Impact factor: 3.222

4.  Determinants of Risk-Aligned Bladder Cancer Surveillance-Mixed-Methods Evaluation Using the Tailored Implementation for Chronic Diseases Framework.

Authors:  Florian R Schroeck; A Aziz Ould Ismail; Grace N Perry; David A Haggstrom; Steven L Sanchez; DeRon R Walker; Jeanette Young; Susan Zickmund; Lisa Zubkoff
Journal:  JCO Oncol Pract       Date:  2021-08-31

5.  [Review] Implementing risk-aligned bladder cancer surveillance care.

Authors:  Florian R Schroeck; Nicholas Smith; Jeremy B Shelton
Journal:  Urol Oncol       Date:  2018-02-13       Impact factor: 3.498

6.  Insights from advanced analytics at the Veterans Health Administration.

Authors:  Stephan D Fihn; Joseph Francis; Carolyn Clancy; Christopher Nielson; Karin Nelson; John Rumsfeld; Theresa Cullen; Jack Bates; Gail L Graham
Journal:  Health Aff (Millwood)       Date:  2014-07       Impact factor: 6.301

7.  Department of Veterans Affairs Cooperative Studies Program Network of Dedicated Enrollment Sites: Implications for Surgical Trials.

Authors:  Faisal G Bakaeen; Domenic J Reda; Annetine C Gelijns; Lorraine Cornwell; Shuab Omer; Rayan Al Jurdi; Panos Kougias; Daniel Anaya; David H Berger; Grant D Huang
Journal:  JAMA Surg       Date:  2014-06       Impact factor: 14.766

8.  Utility of electronic medical record for recruitment in clinical research: from rare to common disease.

Authors:  Tapan Thacker; Ashley R Wegele; Sarah Pirio Richardson
Journal:  Mov Disord Clin Pract       Date:  2016-01-29

9.  Benefits and drawbacks of electronic health record systems.

Authors:  Nir Menachemi; Taleah H Collum
Journal:  Risk Manag Healthc Policy       Date:  2011-05-11

10.  Recruiting clinical personnel as research participants: a framework for assessing feasibility.

Authors:  Sylvia J Hysong; Kristen Broussard Smitham; Melissa Knox; Khai-El Johnson; Richard SoRelle; Paul Haidet
Journal:  Implement Sci       Date:  2013-10-24       Impact factor: 7.327

