Literature DB >> 33457583

Misinformation in Patient Handouts About Upper Extremity Conditions.

Casey M O'Connor1,2, Joost Kortlever2, David Ring2.   

Abstract

This study investigated handouts regarding common upper extremity problems for inaccuracies, distracting information, and concepts that reinforce common unhelpful cognitive biases. We reviewed handouts on upper extremity conditions from 2 electronic medical records and 2 professional associations. We categorized information as inaccurate, distracting, and risking reinforcement of common unhelpful cognitive biases. Reading level, quality, and the ability of patients to process and take action were also rated. We found an average rate of inaccurate statements of 1.9 per 100 words, distracting statements of 0.73 per 100 words, and statements reinforcing common unhelpful cognitive biases of 2.1 per 100 words. Handouts from professional associations were rated as higher quality and had a higher reading grade level but, on average, were constructed for better understandability. Patient handouts have a notable rate of inaccuracies, distractions, and information that may reinforce less adaptive cognitions. Greater attention is merited to making patient handouts readable, understandable, hopeful, and enabling.
© The Author(s) 2020.

Keywords:  hand surgery; handouts; misconceptions; misinformation

Year:  2020        PMID: 33457583      PMCID: PMC7786700          DOI: 10.1177/2374373520966823

Source DB:  PubMed          Journal:  J Patient Exp        ISSN: 2374-3735


Introduction

Clinicians often give patients handouts to help them learn about their condition. Some electronic medical records (EMRs) may include automatic orders for handouts. Many professional societies prepare handouts and make them available in web or print form. To our knowledge, the information in handouts is not as well studied as information available on the internet. Information on the web varies in quality, accuracy, and reading level (1–5). Patients are at substantial risk of encountering health information of poor quality or information delivered using technical terminology that may better suit surgeons than patients. It is often difficult for patients to determine whether information provided on the web or in print is reliable.

An effective teaching tool would allow people from diverse backgrounds and varying levels of health literacy to process, understand, and take action to promote their health (6). Technical information may be confusing for the lay person, and inaccurate information may be hard to identify (7,8). Patient educational materials are recommended to have a reading level at or below the sixth grade to be suitable for a majority of the American adult population. Ideally, handouts would provide optimistic, enabling, and empowering information that can improve health and align choices with what matters most to an individual (their values). Suboptimal terminology can reinforce common unhelpful cognitive biases, may be inaccurate, and can provide information that is distracting from the key issues to be considered (9–12). We classified the information in patient handouts for hand and upper extremity conditions from 2 EMRs and 2 professional societies in order to identify information that is inaccurate, distracting, or has the potential to reinforce common unhelpful cognitive biases. Our second aim was to compare the reading level, quality, actionability, and understandability of the patient handouts.

Methods

Institutional review board approval was not applicable. We analyzed patient handouts for ganglion cyst, carpal tunnel syndrome, De Quervain tendinopathy, Dupuytren disease, rotator cuff tendinopathy, lateral epicondylitis, medial epicondylitis, thumb arthritis, osteoarthritis, ulnar neuropathy, and trigger finger available in 2 EMRs, along with handouts from one orthopedic and one hand surgery professional association. Handouts from EMRs are generated automatically when a diagnosis is entered, in contrast to handouts from professional societies, which are available on the web or as paper pamphlets. A total of 37 handouts were analyzed. One orthopedic resident and one fellowship-trained orthopedic hand surgeon categorized the information in the handouts as inaccurate statements, distracting statements, and statements that might reinforce common unhelpful cognitive biases (Table 1). The number of each type of misconception was recorded per 100 words per handout. Initial categorization was performed by the orthopedic resident, with each handout then reviewed by the orthopedic hand surgeon. Discrepancies were to be resolved by discussion; none required adjudication.
Table 1.

Categorization of Reinforcing Common Unhelpful Cognitive Biases, Inaccuracies, and Distractions.

Reinforces common unhelpful cognitive biases
Definition: Statements that reinforce misconceptions or less effective cognitive coping strategies, including false hope, false despair, or overly cautious statements.
Examples:
- "You will need to stop doing the activities that cause pain until you have healed."
- "With lateral epicondylitis, there is degeneration of the tendon's attachment, weakening the anchor site and placing greater stress on the area."
- "Learn the right way to lift weights so you do not make joint pain worse."
- "You will, however, be allowed to use your hand for light activities, taking care to avoid significant discomfort."

Distracting information
Definition: Information that is unlikely to help a person make decisions or feel healthy; it adds unhelpful material that may make the document look overly complex and inaccessible.
Examples:
- "It is also a good idea to know your results and keep a list of the medications you take."
- "Follow your healthcare provider's instructions, including any exercises recommended by your provider."

Inaccurate information
Definition: Information not supported by current best evidence.
Examples:
- Carpal tunnel syndrome: "In some cases you may have an ultrasound or MRI scan."
- "Rest, exercises, and other things you can do at home may help your trigger finger relax so that it can bend as it should."
- "Thumb arthritis is the second most common type of arthritis in the hand; the most prevalent hand arthritis involves the last joint in each finger."
Patient handouts were also assessed using the Flesch-Kincaid reading level, the DISCERN instrument, and the PEMAT tools. Reading level is typically reported as a grade level computed by a readability algorithm. For this study, we copied the content from each handout into a Microsoft Word document and determined reading level with the Flesch-Kincaid measure. The DISCERN score was one of the first standardized quality indexes for consumer health information and categorizes information as excellent (63-75), good (51-62), fair (39-50), poor (27-38), or very poor (15-26) (8,13). The PEMAT Understandability and Actionability tools help measure whether patients with varying levels of health literacy can process information from handouts and whether they are able to act on it. The understandability score takes into account the structure and organization of the handout, how numbers are presented, word choice, and the use of visual aids. Actionability depends on the use of visual aids and the explicit description of how patients can act on the information provided.
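The two measures above are score based, so their core arithmetic can be sketched directly. Below is a minimal, illustrative Python sketch (not the study's actual tooling; the authors used Microsoft Word's readability statistics) of the Flesch-Kincaid grade-level formula and the DISCERN quality bands quoted in the text. The syllable counter is a rough vowel-group heuristic, so its grades only approximate Word's, and both function names are ours.

```python
import re

def fk_grade(text: str) -> float:
    """Flesch-Kincaid grade level:
    0.39*(words/sentence) + 11.8*(syllables/word) - 15.59."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))

    def syllables(word: str) -> int:
        # Rough heuristic: count groups of consecutive vowels.
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    n_syll = sum(syllables(w) for w in words)
    return 0.39 * (n_words / sentences) + 11.8 * (n_syll / n_words) - 15.59

def discern_band(total: int) -> str:
    """Map a total DISCERN score (15-75) to the bands used in the study."""
    if not 15 <= total <= 75:
        raise ValueError("DISCERN totals range from 15 to 75")
    for low, label in [(63, "excellent"), (51, "good"),
                       (39, "fair"), (27, "poor"), (15, "very poor")]:
        if total >= low:
            return label
```

With the study's overall mean DISCERN score of 39, `discern_band` returns "fair", matching the classification reported in the Results.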

Statistical Analysis

We counted the number of instances of inaccuracy, distraction, and potential reinforcement of common unhelpful cognitive biases in each patient handout per 100 words. For each category, we calculated the mean ± standard deviation (SD) per 100 words per patient handout. This was also done for the Flesch-Kincaid reading level, DISCERN, and PEMAT scores per patient handout. We compared mean rates of inaccuracies, distractions, and potential reinforcement of common unhelpful cognitive biases, as well as reading level, PEMAT scores, and DISCERN scores, between sources of patient handouts using 1-way analysis of variance tests. Post hoc Tukey HSD pairwise comparisons were used to determine which categories differed significantly. We considered P < .05 statistically significant.
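The per-handout rate and the group comparison described above can be sketched in a few lines of Python. This is an illustrative reconstruction under stated assumptions (the function names are ours, and it computes only the ANOVA F statistic, not the P value or the Tukey HSD post hoc step):

```python
from statistics import mean

def rate_per_100_words(count: int, total_words: int) -> float:
    """Instances (e.g., inaccurate statements) per 100 words of a handout."""
    return 100.0 * count / total_words

def one_way_anova_f(groups: list[list[float]]) -> float:
    """F statistic for a one-way ANOVA across handout sources:
    between-group mean square divided by within-group mean square."""
    values = [x for g in groups for x in g]
    grand = mean(values)
    k, n = len(groups), len(values)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))
```

For example, a handout containing 6 inaccurate statements in 300 words has a rate of 2.0 per 100 words.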

Results

There was an average of 1.9 inaccuracies per 100 words (SD 0.98), 0.73 instances of distracting information per 100 words (SD 0.66), and 2.1 (SD 0.98) statements that might reinforce common unhelpful cognitive biases per 100 words in the handouts (Table 2). Professional association 1 had less inaccurate information than EMRs 1 and 2 (P < .05; Table 2). Electronic medical record 2 had more distracting information than both professional associations and EMR 1 (P < .05). There was no difference in the number of statements potentially reinforcing common unhelpful cognitive biases between handouts from EMRs and those from professional associations.
Table 2.

Rate of Misconceptions, Distractions, and Inaccuracies per 100 Words per Handout.a

Source                        Maladaptive information   Distracting information   Inaccurate information
Overall                       2.1 ± 0.98                0.73 ± 0.66               1.9 ± 0.98
Electronic medical record 1   2.4 ± 0.71                0.55 ± 0.41               2.3 ± 0.73
Electronic medical record 2   2.5 ± 0.81                1.6 ± 0.69                2.3 ± 1.1
Professional association 1    1.7 ± 1.0                 0.38 ± 0.23               1.0 ± 0.56
Professional association 2    2.0 ± 1.3                 0.39 ± 0.23               2.0 ± 0.98
P value                       .260                      <.05                      <.05

a Bold indicates statistically significant; Continuous variables as mean ± standard deviation.

The average Flesch-Kincaid reading level was 7.7 (SD 1.6), with a range from 5.3 to 11. Sixty-five percent of handouts were at a seventh-grade reading level or above. Handouts from professional associations had higher average reading levels than those from EMRs (P < .05; Table 3).
Table 3.

Readability and Quality Metrics.a

Source                        Flesch-Kincaid reading level   PEMAT understandability   PEMAT actionability   DISCERN
Overall                       7.7 ± 1.6                      50 ± 9.3                  44 ± 27               39 ± 13
Electronic medical record 1   7.1 ± 0.73                     45 ± 8.4                  14 ± 9.7              34 ± 2.9
Electronic medical record 2   5.8 ± 0.38                     43 ± 5.7                  60 ± 0.0              28 ± 2.7
Professional association 1    8.7 ± 0.62                     53 ± 6.3                  78 ± 6.7              60 ± 5.2
Professional association 2    9.2 ± 1.3                      59 ± 6.3                  27 ± 10               36 ± 2.5
P value                       <.05                           <.05                      <.05                  <.05

a Bold indicates statistically significant; Continuous variables as mean ± standard deviation.

On average, handouts from professional associations were more understandable to patients with diverse levels of health literacy than handouts from EMRs (overall mean 50 [SD 9.3], P < .05). Actionability scores varied significantly among sources, but the differences did not track source type (EMR vs professional association; P < .05; Table 3). On average, the handouts were classified as fair quality using the DISCERN score (mean DISCERN score of 39 [SD 13]). Handouts from professional association 1 provided better quality information than all other sources by DISCERN score (P < .05; Table 3). DISCERN scores for handouts from EMR 1 and professional association 2 were not significantly different.

Discussion

We investigated inaccuracies, distracting information, and potential reinforcement of common unhelpful cognitive biases in patient handouts from EMRs and professional associations. Statements that reinforce common unhelpful cognitive biases and inaccurate statements were quite common, which raises the concern that handouts may reinforce less healthy mindsets known to contribute to greater pain intensity and magnitude of limitations. In other words, well-intentioned handouts have the potential to make a person feel more ill.

The results of this study should be interpreted in light of its limitations. The ratings of potential reinforcement of less adaptive coping strategies are based on an interpretation of the best available evidence and a motivation to avoid reinforcing worst-case thinking (catastrophic thinking) or fear of movement (kinesiophobia), given the moderate to large correlation of these factors with pain intensity and magnitude of limitations in people with upper limb conditions (1,2,14). This determination of potential reinforcement of less effective cognitive coping strategies was somewhat subjective, but there were no instances of debate. Given the importance of avoiding reinforcement of unhealthy mindsets, a low threshold for labeling a statement as potentially misleading seems justified. Reading level measures are largely based on a count of syllables, and there is debate regarding the degree to which this reflects readability (9,10,15).

Statements that might reinforce common unhelpful cognitive biases are common in patient handouts, particularly those available in EMRs. Common cognitive errors that increase symptom intensity and magnitude of limitations fit the categories of "hurt represents harm," "it's taking too long," and other types of worst-case thinking. These thoughts are inconsistent with most common hand and upper extremity problems, many of which are aspects of normal human aging, while others are benign and self-limited.
Statements that reinforce worst-case thinking, kinesiophobia, and other automatic but common unhelpful cognitive biases increase symptom intensity and magnitude of limitations (16–18). Misconceptions and less effective cognitive coping strategies might also lead people to choose options that are not consistent with what matters most to them (their values). We encourage writers of health information to take care to use the most hopeful, enabling, empowering language consistent with best evidence about a given disease.

Handouts from the EMRs were easier to read than handouts from professional medical associations. Feghhi et al demonstrated good reliability and accessibility of online pediatric orthopedic educational materials (19). We found that handouts from all 4 sources had an average reading grade level higher than that recommended by the National Institutes of Health (NIH). This suggests that the average patient may be unable to read and comprehend the information provided in the handouts. The reading level of professional association handouts was higher than that of the EMR handouts. A 2008 study found that only 2% of articles on The American Academy of Orthopaedic Surgeons (AAOS) patient-oriented web pages provided an appropriate reading level for patients (9). A 2014 study compared the reading level of the AAOS handouts from 2008 with the revised patient-directed website and found that 84% of web pages remained above the eighth-grade reading level (20). Multiple studies have found the reading level of content provided by professional associations to be higher than recommended by the NIH (9–12,21–24). Professional association handouts had higher understandability scores than the handouts from EMRs. Limited and low-quality visual aids in handouts from EMRs contributed to lower PEMAT scores; EMRs consistently scored poorly largely due to confusing or unclear descriptions and inaccurate visual depictions.
The variability in actionability scores depended on whether sources broke actions down into manageable, explicit steps and provided a clear structure to help patients take action. Current evidence suggests that it is difficult for patients to find reliable information that is also optimistic, enabling, and empowering. Even patient handouts available in EMRs and those from professional societies can be difficult to read, distracting, and inaccurate, and they have the potential to reinforce common unhelpful cognitive biases. In other words, patient handouts are intended to inform and empower, but they often mislead and misinform. Greater attention to the reading level, accuracy, relevance, and health-promotion aspects of handouts seems merited. Patient handouts can be written at a sixth-grade reading level with greater attention to the language and concepts used. Medical jargon and colloquialisms have the potential to be distracting, inaccurate, or to reinforce unhelpful cognitive biases.
References  (23 in total; first 10 shown)

1.  The emotive impact of medical language.

Authors:  Ana-Maria Vranceanu; Megan Elbon; Margaritha Adams; David Ring
Journal:  Hand (N Y)       Date:  2012-09

2.  Misinformation and Its Correction: Continued Influence and Successful Debiasing.

Authors:  Stephan Lewandowsky; Ullrich K H Ecker; Colleen M Seifert; Norbert Schwarz; John Cook
Journal:  Psychol Sci Public Interest       Date:  2012-12

3.  The emotive impact of orthopedic words.

Authors:  Ana-Maria Vranceanu; Megan Elbon; David Ring
Journal:  J Hand Ther       Date:  2011-01-31       Impact factor: 1.950

4.  The Readability of AAOS Patient Education Materials: Evaluating the Progress Since 2008.

Authors:  Heather Roberts; Dafang Zhang; George S M Dyer
Journal:  J Bone Joint Surg Am       Date:  2016-09-07       Impact factor: 5.284

5.  Quality of online pediatric orthopaedic education materials.

Authors:  Daniel P Feghhi; Daniel Komlos; Nitin Agarwal; Sanjeev Sabharwal
Journal:  J Bone Joint Surg Am       Date:  2014-12-03       Impact factor: 5.284

6.  Most American Academy of Orthopaedic Surgeons' online patient education material exceeds average patient reading level.

Authors:  Adam E M Eltorai; Pranav Sharma; Jing Wang; Alan H Daniels
Journal:  Clin Orthop Relat Res       Date:  2014-12-05       Impact factor: 4.176

7.  Readability of patient education materials from the American Academy of Orthopaedic Surgeons and Pediatric Orthopaedic Society of North America web sites.

Authors:  Sameer Badarudeen; Sanjeev Sabharwal
Journal:  J Bone Joint Surg Am       Date:  2008-01       Impact factor: 5.284

8.  Debunking: A Meta-Analysis of the Psychological Efficacy of Messages Countering Misinformation.

Authors:  Man-Pui Sally Chan; Christopher R Jones; Kathleen Hall Jamieson; Dolores Albarracín
Journal:  Psychol Sci       Date:  2017-09-12

9.  Readability of Trauma-Related Patient Education Materials From the American Academy of Orthopaedic Surgeons.

Authors:  Adam E M Eltorai; Nathan P Thomas; Heejae Yang; Alan H Daniels; Christopher T Born
Journal:  Trauma Mon       Date:  2016-02-06

10.  Readability assessment of American Shoulder and Elbow Surgeons patient brochures with suggestions for improvement.

Authors:  Adam P Schumaier; Rafael Kakazu; Chelsea E Minoughan; Brian M Grawe
Journal:  JSES Open Access       Date:  2018-03-22
