
Documentation from trained medical students has a low rate of relative downcoding for emergency medicine encounters.

David S Tillman1, Corlin M Jewell1, Dann J Hekman1, Adam M Nicholson1, Benjamin H Schnapp1,2, Michael R Lasarev2, Roxana Alexandridis2, Jamie M Hess1, Mary C Westergaard1.   

Abstract

Background: Since 2018, the Centers for Medicare & Medicaid Services (CMS) guidelines have allowed teaching physicians to bill for evaluation and management services based on medical student documentation. Limited previous data suggest that medical student documentation suffers from a high rate of downcoding relative to faculty documentation. We sought to compare the coding outcomes of documentation written by medical students and left unedited by faculty with documentation edited and submitted by faculty.
Methods: A total of 104 randomly selected notes from real patient encounters written by senior medical students were compared to the revised notes submitted by faculty. The note pairs were then split and reviewed by blinded professional coders and assigned level of service (LoS) codes 1-5 (corresponding to E&M CPT codes 99281-99285).
Results: We found that the LoS agreement between student and faculty note versions was 63%, with 23% of all student notes receiving a lower LoS compared to faculty notes (downcoded). This was similar to the baseline variability in professional coder LoS designations.
Conclusions: Notes from medical students who have completed a focused documentation curriculum show less LoS downcoding than in previous reports.
© 2022 The Authors. AEM Education and Training published by Wiley Periodicals LLC on behalf of Society for Academic Emergency Medicine.


Year:  2022        PMID: 35734267      PMCID: PMC9197152          DOI: 10.1002/aet2.10741

Source DB:  PubMed          Journal:  AEM Educ Train        ISSN: 2472-5390


INTRODUCTION

In 2018, the Centers for Medicare & Medicaid Services (CMS) revised their Medicare Claims Processing Manual with the addition of CR 10412, a provision that allows teaching physicians to fully bill for evaluation and management services based on medical student notes. This change has had a dramatic impact on both students and supervising physicians. Previous studies have shown that students feel more involved in the care team when their documentation is used as part of the patient's record, as opposed to being a duplication of physician work. At the time of the rollout of the new regulations, there was no defined curriculum for teaching students to document in a way that meets CMS compliance regulations while also capturing the elements necessary for billing. This gap led to concerns about potential coding issues, including the possibility that student-generated notes would be "downcoded" relative to faculty notes and could therefore negatively impact billing for services provided. Limited previous research supports this concern, reporting significant discrepancies between medical student and faculty documentation in terms of billing outcomes. In August 2018, our institution rolled out medical student documentation in line with the CMS regulations. To ensure the effectiveness of our curriculum and to address concerns about the nascent student documentation, we planned this review of our student documentation quality. We used coding level of service (LoS) as one marker of documentation quality. Our purpose was to explore whether medical students who have undergone our focused documentation curriculum write notes that are billed comparably to notes submitted by faculty physicians. Our hypothesis was that medical student documentation is associated with a high degree of coding discrepancy from the faculty-revised cosigned documentation, in line with limited previous data.

METHODS

Study setting and population

This study took place at two sites: a large academic medical center emergency department (ED) that sees approximately 65,000 patients annually and a smaller outlying community hospital ED that sees 20,000 patients annually. Residents from the emergency medicine (EM) program and senior medical students from the school of medicine rotate in both EDs, and one academic faculty group staffs both. The chart review spanned May 2019 to July 2019. This period was chosen because it marked a full year after the introduction of our medical student documentation curriculum, allowing for complete incorporation and stabilization of the course. Inclusion criteria stipulated notes written by senior medical students who were undertaking a 4-week rotation in the ED and had participated in the documentation curriculum. All EM faculty who supervised medical students, including part-time faculty and fellow instructors, were eligible.

Student training

Prior to beginning clinical duties in the ED, students participated in a 30-min EM faculty-led didactic on the purposes and best practices of EM documentation, a 15-min standardized patient encounter, and a 15-min case discussion with EM faculty (a simulated student-to-faculty patient presentation). Finally, each student wrote a patient encounter note based on the simulated case and received written feedback from EM faculty on this note. The same three faculty (D.S.T., A.M.N., J.H.) led all instances of this curriculum from 2018 to the time of manuscript submission. The curriculum involves a total of 1.5 h of student time and 1.5 h of faculty time, including feedback.

Study design overview

This was a retrospective chart review. The primary study outcome was the proportion of medical student notes that were coded at a lower LoS than the faculty-revised cosigned documentation. Within our preset time period, we identified medical student-written ED provider notes that were subsequently cosigned by faculty in the electronic health record (EHR; Epic, Verona, WI). All notes were retrieved from a relational database (Epic Clarity, hosted on Oracle Relational Database Management System), and software was used to parse notes into plain text format for reviewer use (R version 3.6.2, R Core Team, Vienna, Austria). Encounters were identified in the EHR using patient arrival and discharge times, clinical locations at our two sites, and note authorship. Then, both the note signed by the medical student prior to faculty review/cosignature (henceforth MS) and the final, revised note as it appears in the patient's medical record after faculty cosignature (henceforth AT) were abstracted from the encounter in the EHR. The EHR was able to confirm authorship of any and all text in each note, ensuring that no faculty-performed documentation was present within the MS note. To maintain blinding, study authors removed any attestations or details identifying the chart author and attached a sham "faculty" attestation to all notes so that coders could not tell whether the note being reviewed was MS or AT. Note pairs (MS and AT from the same encounter) were then split and randomly assigned to two separate professional coders from our institution, the same professional coders who normally code all documentation from our EDs. Each coder (A and B) was randomized to review half of the MS notes and half of the AT notes, but never both notes from the same pair/encounter. Each MS note and AT note was assigned a LoS designation of 1-5, reflecting the five core E&M CPT codes.
Outcomes were based on comparison of the LoS designations among the re-paired MS and AT notes.
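The pair-splitting scheme described above can be sketched as follows. This is an illustrative reconstruction, not the authors' actual code; the pair identifiers and the seed are hypothetical.

```python
import random

def split_pairs(pair_ids, seed=2019):
    """Assign one note from each MS/AT pair to each of two coders so that
    each coder receives half MS and half AT notes and never sees both
    notes from the same encounter."""
    rng = random.Random(seed)
    ids = list(pair_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    coder_a, coder_b = [], []
    for i, pid in enumerate(ids):
        if i < half:  # Coder A gets the MS note, Coder B the AT note
            coder_a.append((pid, "MS"))
            coder_b.append((pid, "AT"))
        else:         # assignment reversed for the other half of the pairs
            coder_a.append((pid, "AT"))
            coder_b.append((pid, "MS"))
    return coder_a, coder_b

coder_a, coder_b = split_pairs(range(104))
```

With 104 pairs, each coder receives 52 MS and 52 AT notes, matching the balanced design described above.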

Eligibility criteria

For the electronic record pull, the following programmatic criteria were used. Only ED encounters where the initial note was written by a senior-level medical student and subsequently cosigned by faculty were eligible for the study. At our institution, senior-level medical students include students in the fourth year or the last 2 months of the third year of medical school. We excluded encounters where any resident was assigned to the treatment team. We included encounters where the patient was dispositioned (admit or discharge order placed) while the student's shift was still ongoing (based on the recorded student schedule) and excluded encounters where the patient was dispositioned by the faculty after the student left or signed out to another ED care team, so that both the medical student and the faculty were privy to all key patient events when documenting. We excluded any encounter where the faculty documented critical care time, as the requirements for critical care documentation were not included in the medical student curriculum. We excluded encounters wherein the note was cosigned by the faculty as "for education purposes only," as these notes were not part of the patient's official medical record and are typically not closely reviewed or revised by the attending physician.

Chart preparation

A total of 363 patient encounters met the eligibility criteria. Limited prior data suggested a high downcode percentage (>90%). Our power calculation gave 85 charts as the minimum needed to identify an improvement of 10 percentage points (90% vs. 80% downcoded). A total of 140 encounters with complete note pairs were randomly selected from the 363 to be reviewed by our trained study reviewers (four EM faculty: D.S.T., A.M.N., B.H.S., C.M.J.; see Figure 1). Review proceeded according to a prespecified rubric (Appendix A) to screen for exclusion criteria missed by electronic encounter identification, to remove protected health information (PHI) from notes, and to facilitate blinding of MS and AT note attestations.
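The 85-chart minimum is consistent with a standard one-sample normal-approximation sample-size calculation. The paper does not state the test parameters, so the two-sided α = 0.05 and 80% power below are assumptions:

```python
import math

p0, p1 = 0.90, 0.80   # historical downcode rate vs. hypothesized improved rate
z_alpha = 1.959964    # two-sided alpha = 0.05 (assumed)
z_beta = 0.841621     # power = 0.80 (assumed)

# Classic one-sample proportion sample-size formula
n = (z_alpha * math.sqrt(p0 * (1 - p0)) +
     z_beta * math.sqrt(p1 * (1 - p1))) ** 2 / (p0 - p1) ** 2
print(math.ceil(n))  # 86, in line with the reported minimum of 85 charts
```

The small difference from the reported 85 likely reflects rounding or a slightly different formula variant.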
FIGURE 1

Selection of note pairs to review with inclusion and exclusion criteria

In Round 1, D.S.T., A.M.N., B.H.S., and C.M.J. all reviewed the 140 AT notes. In this round, the most common reason for exclusion among the reviewed note pairs was evidence of resident involvement (4/140). Other reasons for exclusion were student notes marked as "educational purposes only" by the attending physician (3/140), the patient being signed out to a new physician team (3/140), or no existing student documentation (1/140). This left 129 note pairs that proceeded to the second round, in which D.S.T., A.M.N., B.H.S., and C.M.J. all reviewed the MS note for each encounter. An additional five (5/129) met exclusion criteria that had not been apparent on the electronic pull or AT review. In sum, 16 of 140 note pairs were not included in the final analysis due to meeting one or more of the above exclusion criteria. Of the remaining 124 note pairs, 20 were randomly selected, deidentified, and given sham attestations as described below to facilitate a calculation of agreement between the two coders. These 20 pairs were not included in the final analysis. Ultimately, a total of 104 note pairs were included in the final analysis (Figure 1). D.S.T. performed a third round of review of all 124 note pairs, during which no new exclusions were identified. Patient characteristics of encounters accessed for our documentation evaluation are found in Table 1.
TABLE 1

Patient characteristics of encounters accessed for our documentation evaluation

Sex              Age                    ESI    Number (%)
Female: 67       Range: 0–89 years      1      1 (0.8)
Male: 57         Mean: 35 years         2      11 (8.9)
%Female: 0.54                           3      75 (60.5)
%Male: 0.46                             4      32 (25.8)
                                        5      5 (4)
The blinding of MS and AT notes was achieved by removing provider identifiers and adding sham attestations. This occurred during Rounds 1, 2, and 3 by reviewers. MS notes (notes without a faculty attestation) were blinded by making them look as though they had been reviewed/signed by faculty. This was done by: (1) replacing the student name with "–Student"; (2) adding our institution's required minimum language for attestation of medical student documentation ("I was present with the medical student. I have edited this note as necessary to reflect the history, physical examination, and medical decision making I have performed and deemed medically necessary."); and (3) signing the note "–Faculty." This sham faculty attestation blinded the coder as to whether the note was actually the version written only by the student or the version reviewed/edited/cosigned by faculty. AT notes were blinded in much the same way: any medical decision making (MDM) added by the faculty with the attestation was kept, any identifying or idiosyncratic faculty attestations were removed, the standard institutional attestation language above was inserted, and the note was signed "–Faculty." Because our sham attestation matched institutional requirements, faculty attestations on original AT notes (the true encounter notes) already looked much the same as the final blinded AT notes. Appendix A has further details regarding PHI deidentification, signature/attestation blinding, and example sham attestation changes. Our trained coders then each reviewed their 124 assigned notes, assigning a 1–5 LoS. We collected the LoS for each encounter's MS and AT note as well as the original LoS given to the actual encounter.
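A minimal sketch of the MS-note blinding transformation described above. The signature format, the name "Sally Student," and the regular expression are illustrative assumptions; real notes would need more robust handling:

```python
import re

# Institutional attestation language quoted in the Methods
ATTESTATION = (
    "I was present with the medical student. I have edited this note as "
    "necessary to reflect the history, physical examination, and medical "
    "decision making I have performed and deemed medically necessary."
)

def blind_ms_note(text):
    """Replace the student signature with '-Student' and append a sham
    faculty attestation so the note appears faculty-cosigned."""
    # Assumed signature format, e.g. "-Sally Student, MS4" (hypothetical)
    text = re.sub(r"-\s*[A-Za-z .]+,\s*MS\d", "-Student", text)
    return f"{text}\n\n{ATTESTATION}\n\n-Faculty"

blinded = blind_ms_note("Impression: Appendicitis\n-Sally Student, MS4")
```

Applied to a note ending in a student signature, this yields a note that ends with the generic "-Student" signature, the standard attestation, and a "-Faculty" signature, mirroring the Appendix A examples.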

Data analysis

To assess intercoder agreement, 20 identical notes (evenly split between MS and AT notes) were sent to both coders. For those notes we calculated Gwet's AC1 statistic to ensure a reasonable level of intercoder agreement. As a sensitivity analysis, we also calculated the kappa coefficient between the LoS assigned by our study coders and the LoS CPT billed on the original encounter. Reported confidence intervals (CIs) have 95% coverage, and all analyses were performed using R 4.1.0. Our investigation was granted exemption from formal review by the University of Wisconsin Health Sciences Institutional Review Board as a quality improvement initiative.

RESULTS

Overall, when comparing the MS to AT notes, there was moderate agreement in the coded LoS between faculty and students: 62.5% (95% CI 50.8%–72.9%). The AT note was coded at a higher LoS in 24 note pairs (23.1%, 95% CI 14.8%–34.2%) and a lower LoS in 15 note pairs (14.4%, 95% CI 8.0%–24.5%). The matrix showing the LoS for the students and faculty can be seen in Table 2. As seen in the table, there were only two instances in which the MS note was coded at a LoS two levels below the AT note LoS. The coefficients of agreement (kappa and Gwet's AC1) between coded student and faculty LoS are shown in Table 3. Appendix B contains all LoS designations for the MS and AT versions of each note as well as the LoS given by coders in real time (when the clinical encounter was billed).
TABLE 2

Comparisons of coded LoS between MS and AT versions of note pairs

                  MS LoS
AT LoS      1     2     3     4     5
1           0     1     0     0     0
2           0     1     2     0     0
3           0     0    23     8     0
4           0     0    14    29     4
5           0     0     2     8    12

Abbreviations: AT, attending; LoS, level of service; MS, medical student.

TABLE 3

Agreement coefficients in coded LoS between MS note and AT note

Agreement coefficient    Estimate    95% CI
Kappa                    0.426       0.282–0.571
Gwet's AC1               0.552       0.439–0.665

Abbreviations: AT, attending; LoS, level of service; MS, medical student.
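The agreement statistics in Table 3 can be recovered from the Table 2 matrix. The sketch below applies the textbook Cohen's kappa and Gwet's AC1 formulas to those counts and reproduces the published estimates:

```python
# Table 2 counts: rows = AT note LoS 1-5, columns = MS note LoS 1-5
M = [
    [0, 1,  0,  0,  0],
    [0, 1,  2,  0,  0],
    [0, 0, 23,  8,  0],
    [0, 0, 14, 29,  4],
    [0, 0,  2,  8, 12],
]
n = sum(map(sum, M))                      # 104 note pairs
po = sum(M[k][k] for k in range(5)) / n   # observed agreement: 65/104 = 0.625

row = [sum(r) for r in M]
col = [sum(M[i][j] for i in range(5)) for j in range(5)]

# Cohen's kappa: chance agreement from the two marginal distributions
pe_kappa = sum(row[k] * col[k] for k in range(5)) / n ** 2
kappa = (po - pe_kappa) / (1 - pe_kappa)  # ~0.426

# Gwet's AC1: chance agreement from mean category prevalence (q = 5 levels)
pi = [(row[k] + col[k]) / (2 * n) for k in range(5)]
pe_ac1 = sum(p * (1 - p) for p in pi) / (5 - 1)
ac1 = (po - pe_ac1) / (1 - pe_ac1)        # ~0.552

# Downcoded (MS below AT) and upcoded (MS above AT) pair counts
down = sum(M[i][j] for i in range(5) for j in range(5) if j < i)  # 24 (23.1%)
up = sum(M[i][j] for i in range(5) for j in range(5) if j > i)    # 15 (14.4%)
```

The diagonal gives the 62.5% agreement, and the off-diagonal sums give the 24 downcoded and 15 upcoded pairs reported in the Results.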

Intercoder agreement was tested using the 20 notes for which Coders A and B both reviewed the same note version (10 where both reviewed the MS note, 10 where both reviewed the AT note). In this instance, there was agreement (same LoS) between the expert coders in 14/20 cases (70%, 95% CI 48%–85%). Gwet's AC1 was estimated at 0.645 (95% CI 0.399–0.899).
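The reported 48%–85% interval for the 14/20 coder-coder agreement matches a Wilson score interval; the paper does not name its CI method, so that choice is an assumption here:

```python
import math

def wilson_ci(successes, n, z=1.959964):
    """Wilson score confidence interval for a binomial proportion
    (z defaults to the two-sided 95% normal quantile)."""
    p = successes / n
    denom = 1 + z ** 2 / n
    center = (p + z ** 2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
    return center - half, center + half

lo, hi = wilson_ci(14, 20)  # ~(0.481, 0.855), rounding to the reported 48%-85%
```

The Wilson interval is preferred over the naive Wald interval at small n because it never escapes [0, 1] and has better coverage.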

DISCUSSION

Despite previous studies showing substantial discrepancy between medical student and faculty LoS, our study shows much less LoS discrepancy. While we did not establish causation, we posit that our documentation curriculum (with relatively limited time invested up front) enabled our students to complete charts that are coded similarly to faculty charts. This is congruent with other studies suggesting that brief interventions can be highly effective for improving trainee compliance with billing requirements. Looking at the raw data (Appendix B), we are able to see the AT LoS, MS LoS, and the LoS that was given in real time for each note (true LoS). The AT LoS matched the true LoS 56.7% of the time (95% CI 47%–66%). Agreement between MS and AT note LoS (62.5%) is similar to the agreement seen in the two potential control measures (AT LoS vs. true LoS [56.7%] and our sample of 20 charts where Coders A and B reviewed the same note [70%]). This suggests that there may be a maximum feasible agreement on any chart's LoS among multiple coders. Limited previous data also suggest that medical coders are subject to variation in assigning CPT codes. Certainly, our student note LoS designations still had measurable differences from faculty note LoS designations (Table 2; Appendix B). However, these differences were substantially smaller than demonstrated in previous studies in which the students involved had not undergone specific documentation training, and they appear to be in line with possible natural variation in coding, as above. Students whose notes were included in this study had undergone our published documentation curriculum. While our study does not establish causation, we believe that this short curriculum, administered over the course of 1 h every 4 weeks, may be a powerful tool in overcoming previous challenges with coding of medical student documentation.
The introduction of billable student notes opens the door for greater satisfaction of both students and attending physicians, including greater student involvement in all aspects of patient care. Additionally, a study involving multi-institutional faculty and Association of American Medical Colleges (AAMC) student representatives shows that the majority of faculty and students perceive medical student documentation in the ED as an important part of their training. Finally, our previous limited investigation surveying faculty and student perspectives on medical student documentation reflects the high value perceived by both groups. Additional barriers to the utilization of medical student documentation include the lack of workspace and the lack of institutional policy regarding its use. The adoption of such a policy is a critical step in addressing stakeholder and liability concerns before allowing medical student notes to be entered into the EHR. To ensure that the policy is actually embraced within a department after its adoption, it will be necessary to provide faculty training to increase understanding, comfort, and, ultimately, consistency in applying the policy. Furthermore, involving operational and educational stakeholders will similarly be important in ensuring that resources are dedicated to providing workspaces for students to document, considering the stated benefits of medical student documentation.

LIMITATIONS

One limitation of our study is that the use of coding LoS does not fully encapsulate note quality. Ultimately, our investigation into whether LoS differs between the MS note and the AT note represents only one perspective on a note's quality, as it pertains to billing. While we do think that the comparison of LoS provides a reasonable surrogate for, and the most objective comparator of, note quality, LoS is just one aspect of quality documentation. Accurate and comprehensive reflection of the care provided and communication of medical decision making are other critical elements of quality documentation. Limited previous work suggests that medical student documentation is at risk for factual errors, highlighting the need for close faculty supervision of all trainee documentation. The highest quality documentation curriculum still cannot take the place of appropriate oversight and ongoing guidance for students who may be encountering complex and unfamiliar clinical scenarios. While some physicians may perceive this additional time spent reviewing student notes as a potential burden, it must be weighed against the time saved by not having to provide wholesale documentation of the patient encounter and the enrichment of the medical student experience. An additional limitation is that this study involves notes taken from a single institution and a small cohort of coders. Therefore, the results may not be generalizable to other health care institutions. Given that implementation of the documentation curriculum coincided with implementation of the practice of allowing medical students to document officially in the EHR, we were unable to analyze student charts completed prior to the implementation of our educational curriculum.
It is possible that students were already capable of documenting comprehensive charts for the ED prior to the implementation of our curriculum, although we feel this is unlikely, especially in light of the coding discrepancies seen in documentation from untrained students. In our case, however, students simply knowing that their documentation "counted" for the medical record could have contributed to the LoS similarity. It is also possible that different levels of patient complexity may alter the utility of student notes; for example, we did not power our study to evaluate for LoS differences specifically among ESI 1 versus ESI 2 versus ESI 3 encounters. Additionally, coders for this study reviewed the notes as plain text files as opposed to formatted notes within the EHR with access to additional documentation (e.g., procedure notes, test results not already included in the note). This may, at least partly, contribute to the difference between our reported true LoS and our AT LoS. It also may be a source of unmeasured bias when comparing AT and MS LoS. Finally, as the scope and complexity of medical training grows, there are no easy solutions to curricular overload, and adding new documentation-focused content runs the risk of making this problem worse. Additionally, EHRs and documentation have been cited as leading causes of physician burnout. By shifting additional documentation burden to students, it is possible that they may burn out more quickly at a stage in their career when they may be least equipped to tackle these challenges, a concern that has been raised by others. While the allowance for use of medical student documentation is not universal, broader implementation of documentation curricula and programs should be coupled with careful monitoring to ensure that potential negative consequences are minimized.

CONCLUSIONS

Medical students can effectively document in the electronic medical record with a minimum of downcoding. Our curriculum may have contributed to this success. Medical schools should consider developing this aspect of their own curriculum while continuing to ensure appropriate attending supervision.

CONFLICT OF INTEREST

The authors have no potential conflicts to disclose.

AUTHOR CONTRIBUTIONS

David S. Tillman: concept/design, 70%; acquisition of data, 30%; analysis and interpretation of data, 30%; drafting of the manuscript, 60%; critical revision of manuscript, 30%; statistical expertise, 0%; acquisition of funding, 100%. Corlin M. Jewell: concept/design, 5%; acquisition of data, 10%; analysis and interpretation of data, 10%; drafting of manuscript, 20%; critical revision of manuscript, 5%; statistical expertise, 0%; acquisition of funding, 0%. Dann J. Hekman: concept/design, 5%; acquisition of data, 30%; analysis and interpretation of data, 10%; drafting of manuscript, 10%; critical revision of manuscript, 0%; statistical expertise, 20%; acquisition of funding, 0%. Adam M. Nicholson: concept/design, 5%; acquisition of data, 10%; analysis and interpretation of data, 0%; drafting of manuscript, 5%; critical revision of manuscript, 5%; statistical expertise, 0%; acquisition of funding, 0%. Benjamin H. Schnapp: concept/design, 5%; acquisition of data, 10%; analysis and interpretation of data, 0%; drafting of manuscript, 5%; critical revision of manuscript, 5%; statistical expertise, 0%; acquisition of funding, 0%. Michael R. Lasarev: concept/design, 0%; acquisition of data, 0%; analysis and interpretation of data, 25%; drafting of manuscript, 0%; critical revision of manuscript, 0%; statistical expertise, 40%; acquisition of funding, 0%. Roxana Alexandridis: concept/design, 0%; acquisition of data, 0%; analysis and interpretation of data, 25%; drafting of manuscript, 0%; critical revision of manuscript, 0%; statistical expertise, 40%; acquisition of funding, 0%. Jamie M. Hess: concept/design, 5%; acquisition of data, 10%; analysis and interpretation of data, 0%; drafting of manuscript, 5%; critical revision of manuscript, 5%; statistical expertise, 0%; acquisition of funding, 0%. Mary C. Westergaard: concept/design, 0%; acquisition of data, 0%; analysis and interpretation of data, 0%; drafting of manuscript, 0%; critical revision of manuscript, 50%; statistical expertise, 0%; acquisition of funding, 0%.
APPENDIX A. Example sham attestation changes

Example of original AT note ending:

… Patient was admitted to general surgery.

Impression: Appendicitis

‐Sally Student, MS4

… I was present with the medical student. I have edited this note as necessary to reflect the history, physical examination, and medical decision making I have performed and deemed medically necessary. This patient has appendicitis, looks overall well. No findings of perf. Ceftri and metronidazole abx. Pain controlled with parenteral opiates here. c/s to surgery. They will admit and take to OR later today.

Jane Attending, MD

Professor, Emergency Medicine

Revised AT note ending + sham attestation for blinding:

… Patient was admitted to general surgery.

Impression: Appendicitis

‐Student

… I was present with the medical student. I have edited this note as necessary to reflect the history, physical examination, and medical decision making I have performed and deemed medically necessary. This patient has appendicitis, looks overall well. No findings of perf. Ceftri and metronidazole abx. Pain controlled with parenteral opiates here. c/s to surgery. They will admit and take to OR later today.

‐Faculty

Example of original MS note ending:

… Patient was admitted to general surgery.

Impression: Appendicitis

‐Sally Student, MS4

Revised MS note ending + sham attestation for blinding:

… Patient was admitted to general surgery.

Impression: Appendicitis

‐Student

I was present with the medical student. I have edited this note as necessary to reflect the history, physical examination, and medical decision making I have performed and deemed medically necessary.

‐Faculty

APPENDIX B. LoS designations for the MS and AT versions of each note pair and the real-time (true) LoS

Patient   AT LoS   MS LoS   True LoS   AT−MS   True−AT   True−MS
1         5        5        4           0       −1        −1
2         3        3        3           0        0         0
3         4        4        4           0        0         0
4         4        3        4           1        0         1
5         5        3        3           2       −2         0
6         5        5        5           0        0         0
7         4        3        4           1        0         1
8         4        4        4           0        0         0
9         4        4        4           0        0         0
10        4        3        3           1       −1         0
11        4        4        4           0        0         0
12        4        4        4           0        0         0
13        4        4        4           0        0         0
14        4        4        5           0        1         1
15        5        4        5           1        0         1
16        3        3        3           0        0         0
17        4        4        4           0        0         0
18        3        3        2           0       −1        −1
19        4        4        5           0        1         1
20        5        4        4           1       −1         0
21        3        4        4          −1        1         0
22        4        4        4           0        0         0
23        4        5        5          −1        1         0
24        4        3        4           1        0         1
25        5        3        4           2       −1         1
26        5        5        4           0       −1        −1
27        4        5        5          −1        1         0
28        3        4        4          −1        1         0
29        4        4        4           0        0         0
30        3        3        3           0        0         0
31        5        4        5           1        0         1
32        4        3        3           1       −1         0
33        3        4        3          −1        0        −1
34        3        3        3           0        0         0
35        4        4        4           0        0         0
36        4        4        5           0        1         1
37        4        4        4           0        0         0
38        5        5        5           0        0         0
39        5        5        5           0        0         0
40        1        2        2          −1        1         0
41        5        5        5           0        0         0
42        3        4        3          −1        0        −1
43        4        4        5           0        1         1
44        5        5        5           0        0         0
45        5        4        5           1        0         1
46        5        4        5           1        0         1
47        4        4        3           0       −1        −1
48        2        3        4          −1        2         1
49        5        5        4           0       −1        −1
50        4        3        4           1        0         1
51        3        3        3           0        0         0
52        4        3        3           1       −1         0
53        3        3        4           0        1         1
54        5        4        5           1        0         1
55        3        3        4           0        1         1
56        3        3        2           0       −1        −1
57        3        4        3          −1        0        −1
58        3        3        3           0        0         0
59        5        5        5           0        0         0
60        2        3        2          −1        0        −1
61        4        4        4           0        0         0
62        4        3        4           1        0         1
63        4        4        4           0        0         0
64        3        3        4           0        1         1
65        4        4        5           0        1         1
66        3        3        3           0        0         0
67        4        3        4           1        0         1
68        4        4        5           0        1         1
69        3        3        2           0       −1        −1
70        4        3        3           1       −1         0
71        4        4        4           0        0         0
72        3        3        3           0        0         0
73        4        4        4           0        0         0
74        3        3        4           0        1         1
75        3        4        4          −1        1         0
76        4        4        4           0        0         0
77        5        5        5           0        0         0
78        4        4        4           0        0         0
79        3        4        4          −1        1         0
80        5        4        5           1        0         1
81        3        3        3           0        0         0
82        4        4        5           0        1         1
83        4        4        4           0        0         0
84        3        3        3           0        0         0
85        4        3        3           1       −1         0
86        3        3        3           0        0         0
87        3        3        4           0        1         1
88        3        3        3           0        0         0
89        4        4        4           0        0         0
91        4        5        4          −1        0        −1
92        3        3        4           0        1         1
93        4        5        5          −1        1         0
94        4        3        4           1        0         1
95        4        3        5           1        1         2
96        5        4        4           1       −1         0
97        4        4        3           0       −1        −1
98        4        3        3           1       −1         0
99        5        5        5           0        0         0
100       4        4        5           0        1         1
101       3        3        2           0       −1        −1
102       2        2        3           0        1         1
103       3        3        4           0        1         1
104       3        4        3          −1        0        −1

REFERENCES

1. Gwet KL. Computing inter-rater reliability and its variance in the presence of high agreement. Br J Math Stat Psychol. 2008.

2. Kumar A, Chi J. CMS Billing Guidelines and Student Documentation: a New Era or New Burden? J Gen Intern Med. 2019.

3. Kinlaw S, Dailey M, Scott D, Hanchey S, Tumin D, Higginson A. Improved Physical Exam Documentation in a Pediatric After-Hours Clinic. Am J Med Qual. 2019.

4. Blatt AE, Nofziger AC, Levy PC. Incorporating Medical Student Documentation Into the Billable Encounter: A Pragmatic Approach to Implementation of the 2018 Centers for Medicare & Medicaid Services Rule Revision. Chest. 2020.

5. Friedman E, Sainte M, Fallar R. Taking note of the perceived value and impact of medical student chart documentation on education and patient care. Acad Med. 2010.

6. Gagliardi JP, Bonanno B, McPeek Hinz ER, Musser RC, Knudsen NW, Palko M, McNair F, Lee HJ, Clay AS. Implementation of Changes to Medical Student Documentation at Duke University Health System: Balancing Education With Service. Acad Med. 2021.

7. Howard R, Reddy RM. Coding Discrepancies Between Medical Student and Physician Documentation. J Surg Educ. 2018.

8. Virden RA, Sonnett FM, Khan ANGA. Medical Student Documentation in the Emergency Department in the Electronic Health Record Era-A National Survey. Pediatr Emerg Care. 2019.

9. Hoonpongsimanont W, Velarde I, Gilani C, Louthan M, Lotfipour S. Assessing medical student documentation using simulated charts in emergency medicine. BMC Med Educ. 2018.

10. Seo JH, Kong HH, Im SJ, Roh H, Kim DK, Bae HO, Oh YR. A pilot study on the evaluation of medical student documentation: assessment of SOAP notes. Korean J Med Educ. 2016.
