Mikaël Chelli, Gregory Gasbarro, Vincent Lavoué, Marc-Olivier Gauci, Jean-Luc Raynier, Christophe Trojani, Pascal Boileau.
Abstract
Background: The Neer classification is among the most widely used systems to describe proximal humerus fractures (PHF), despite its poor interobserver agreement. The purpose of this study was to verify whether or not blinded shoulder surgeons and trainees agree with the authors of articles published in the highest-impact-factor orthopedic journals.
Keywords: Interobserver agreement; Neer classification; Proximal humerus fracture; Reliability; Survey; Traumatology
Year: 2022 PMID: 35572425 PMCID: PMC9091924 DOI: 10.1016/j.jseint.2022.02.006
Source DB: PubMed Journal: JSES Int ISSN: 2666-6383
Characteristics of included articles.
| First author | Year | Journal | Number of fractures |
|---|---|---|---|
| Grubhofer | 2017 | JSES | 1 |
| Kancherla | 2017 | J Am Acad Orthop Surg | 2 |
| Padegimas | 2017 | JSES | 2 |
| Park | 2017 | JSES | 1 |
| Singh | 2017 | JSES | 3 |
| Trikha | 2017 | JSES | 2 |
| Boileau | 2018 | JSES | 3 |
| Boileau | 2018 | JSES | 1 |
| Chen | 2018 | JSES | 2 |
| Chen | 2018 | JSES | 2 |
| Chung | 2018 | Acta Orthopaedica | 2 |
| Kim | 2018 | JSES | 2 |
| Singh | 2018 | JSES | 1 |
| Cai | 2019 | JSES | 1 |
| Hudgens | 2019 | JSES | 2 |
| Jorge-Mora | 2019 | JSES | 1 |
| Klug | 2019 | JSES | 2 |
| Large | 2019 | J Am Acad Orthop Surg | 1 |
| Sears | 2019 | J Am Acad Orthop Surg | 3 |
| Siebenbürger | 2019 | JSES | 1 |
JSES, Journal of Shoulder and Elbow Surgery; J Am Acad Orthop Surg, Journal of the American Academy of Orthopaedic Surgeons.
Figure 1. Fractures with the lowest agreement between responders and article authors.
Figure 2. Agreement rate between responders and article authors for the 35 included fractures.
Figure 3. Fractures with the highest agreement between responders and article authors.
Rate of agreement between participants and authors.
| | Agreement (%) | P value |
|---|---|---|
| Overall (n = 138) | 54.7 | |
| Experience | | .0023* |
| Residents (n = 27) | 45.9 | |
| Fellows (n = 15) | 54.5 | |
| Surgeons < 10 y. of experience (n = 46) | 56.0 | |
| Surgeons ≥ 10 y. of experience (n = 50) | 58.2 | |
| Fracture classification according to the authors | | .634 |
| 2-part (n = 6) | 62.0 | |
| 3-part (n = 7) | 57.5 | |
| 4-part (n = 22) | 51.8 | |
| Fracture-dislocation | | .066 |
| No (n = 30) | 51.5 | |
| Yes (n = 5) | 73.6 | |
| 4-part fractures | | .112 |
| Nondislocated 4-part fractures (n = 18) | 46.8 | |
| Dislocated 4-part fractures (n = 4) | 74.1 | |
| Journal | | .820 |
| J. Shoulder and Elbow Surgery (n = 27) | 53.5 | |
| J. Am. Acad. Orthop. Surgery (n = 6) | 60.4 | |
| Acta Orthopaedica (n = 2) | 52.5 | |
| Origin of participants | | .689 |
| Asia (n = 9) | 56.8 | |
| Europe (n = 73) | 53.4 | |
| Middle East (n = 9) | 51.1 | |
| North Africa (n = 2) | 45.8 | |
| North America (n = 9) | 51.4 | |
| Oceania (n = 3) | 61.0 | |
| South America (n = 30) | 58.7 | |
| Imaging modality | | .845 |
| X-rays only (n = 24) | 54.4 | |
| X-rays + 2D CT-scan (n = 3) | 47.1 | |
| X-rays + 3D CT-scan (n = 8) | 58.4 | |
| Number of available images (X-ray and/or CT-scan) | | .385 |
| 1 image (n = 18) | 51.9 | |
| 2 images or more (n = 17) | 57.8 | |
J. Shoulder and Elbow Surgery, Journal of Shoulder and Elbow Surgery; J. Am. Acad. Orthop. Surgery, Journal of the American Academy of Orthopaedic Surgeons; CT, computed tomography.
*P < .05.
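The "Agreement (%)" figures above are the share of responder classifications that matched the article authors' classification. A minimal sketch, using hypothetical data (the response counts below are invented for illustration, not taken from the study):

```python
# Percent agreement: fraction of responder classifications matching the
# classification given by the article's authors, expressed as a percentage.
def agreement_rate(author_label, responder_labels):
    matches = sum(label == author_label for label in responder_labels)
    return 100 * matches / len(responder_labels)

# Hypothetical fracture: the authors called it 4-part; 138 responders answered.
responses = ["4-part"] * 76 + ["3-part"] * 50 + ["2-part"] * 12
print(round(agreement_rate("4-part", responses), 1))  # → 55.1
```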
Figure 4. 2-part fractures (according to the authors) classified as 3-part fractures by the majority of responders.
Interobserver agreement according to the level of experience.
| Level of experience | κ [95% confidence interval] | P value |
|---|---|---|
| Residents (n = 27) | 0.228 [0.217-0.240] | <.0001* |
| Fellows (n = 15) | 0.313 [0.288-0.338] | <.0001* |
| Surgeons ≤ 10 years of experience (n = 46) | 0.326 [0.319-0.333] | <.0001* |
| Surgeons > 10 years of experience (n = 50) | 0.314 [0.307-0.320] | <.0001* |
The P value tests the null hypothesis that κ = 0.
*P < .05.
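The κ values above are chance-corrected agreement statistics: observed agreement minus the agreement expected from each rater's marginal frequencies alone. As a minimal sketch with hypothetical ratings, here is the two-rater Cohen's κ (the study pooled many raters, which requires a multirater variant such as Fleiss' κ; this only illustrates the chance correction):

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Cohen's kappa for two raters over the same cases."""
    n = len(ratings_a)
    categories = set(ratings_a) | set(ratings_b)
    # Observed agreement: fraction of cases where the two raters match.
    po = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement: product of each rater's marginal category frequencies.
    count_a, count_b = Counter(ratings_a), Counter(ratings_b)
    pe = sum((count_a[c] / n) * (count_b[c] / n) for c in categories)
    return (po - pe) / (1 - pe)

# Hypothetical Neer classifications of 6 fractures by two raters.
rater1 = ["2-part", "3-part", "4-part", "3-part", "2-part", "4-part"]
rater2 = ["2-part", "3-part", "3-part", "3-part", "4-part", "4-part"]
print(round(cohens_kappa(rater1, rater2), 3))  # → 0.5
```

Raters agree on 4 of 6 cases (po ≈ 0.67), but a third of that agreement is expected by chance (pe ≈ 0.33), so κ = 0.5, illustrating why κ values sit well below raw agreement percentages.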
Review of literature: interobserver agreements according to imaging modality.
| First author | Year | X-rays (κ) | 2D CT-scan (κ) | 3D CT-scan (κ) |
|---|---|---|---|---|
| Kristiansen | 1988 | 0.07-0.48 | - | - |
| Siebenrock | 1993 | 0.40 | - | - |
| Bernstein | 1996 | 0.52 | 0.50 | - |
| Sjödén | 1997 | - | 0.42 | - |
| Shrader | 2005 | 0.47 | 0.34 | - |
| Mora Guix | 2006 | 0.35 | 0.44 | - |
| Brunner | 2009 | 0.48 | 0.58 | 0.80 |
| Foroohar (overall) | 2011 | 0.14 | 0.06 | 0.09 |
| Foroohar (upper-limb specialists) | 2011 | 0.03 | 0.23 | 0.32 |
| Berkes | 2014 | 0.42 | 0.67 | 0.63 |
| Matsushigue | 2014 | 0.37 | - | 0.57 |
| Handoll | 2016 | 0.48 | - | - |
| Iordens | 2016 | 0.29 | 0.51 | 0.51 |
| Sumrein | 2018 | 0.73 | 0.72 | - |
| Torrens | 2018 | 0.50 | 0.53 | 0.46 |
CT, computed tomography.
The 3D CT-scan reconstructions were visualized with special spectacles for a ‘real’ 3D projection.