BACKGROUND CONTEXT: Flexion-extension X-rays are commonly used to identify abnormalities in intervertebral motion, despite little evidence for the reliability of the information that clinicians derive from these tests. PURPOSE: To quantify observer agreement on intervertebral motion abnormalities assessed with and without the use of computer-assisted technology. STUDY DESIGN: Interobserver agreement was assessed among clinicians evaluating cervical flexion-extension X-rays using the methods they currently use in clinical practice, and compared with agreement when the same clinicians reassessed the X-rays using computer-assisted technology. METHODS: Seventy-five flexion-extension X-rays of the cervical spine, obtained from several clinical practices, were assessed by seven practicing physicians who routinely evaluate these studies. Observers first assessed the studies using their routine methods, and then reassessed them, at least a month later, using validated computer-assisted methods. Agreement among clinicians with and without computer-assisted technology was quantified using kappa statistics. RESULTS: Agreement was poor (kappa=0.17) with the methods routinely used in clinical practice. Computer-assisted analysis substantially improved interobserver agreement (kappa=0.77). With computer-assisted methods, the remaining disagreements involved cases with severe degeneration or static misalignment where motion was within normal limits, or fusion cases with between 1 and 1.5 degrees of motion at the fusion site. CONCLUSIONS: This study suggests that commonly used methods for assessing flexion-extension X-rays of the cervical spine may not provide reliable clinical information about intervertebral motion abnormalities, and that validated, computer-assisted methods can dramatically improve agreement among clinicians.
The lack of definitions of instability and fusion acceptable to all the clinicians was likely a primary source of disagreement with both manual and computer-assisted assessments.
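The kappa statistics reported above measure agreement beyond chance between observers. As an illustrative sketch only (the abstract does not specify which kappa variant was used; for seven observers a multi-rater statistic such as Fleiss' kappa would typically apply), the pairwise Cohen's kappa underlying these values can be computed as follows. The rating labels below are hypothetical examples, not data from the study:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assessing the same cases.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    proportion of agreement and p_e is the agreement expected by
    chance from each rater's marginal label frequencies.
    """
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of cases with identical labels
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from the product of marginal frequencies
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(counts_a[k] * counts_b.get(k, 0) for k in counts_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical ratings of four studies by two observers
a = ["normal", "normal", "abnormal", "abnormal"]
b = ["normal", "abnormal", "abnormal", "abnormal"]
print(round(cohens_kappa(a, b), 2))  # 0.5
```

By convention (e.g., Landis and Koch), values near 0.17 indicate slight agreement, while values near 0.77 indicate substantial agreement, which is the contrast the study reports between routine and computer-assisted assessment.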
Authors: Lisa K Cannada; Steven C Scherping; Jung U Yoo; Paul K Jones; Sanford E Emery Journal: Spine (Phila Pa 1976) Date: 2003-01-01 Impact factor: 3.468