Li-Yang Dai, Wen-Jie Jin. Department of Orthopaedic Surgery, Xinhua Hospital, Shanghai Second Medical University, Shanghai, China. lydai@etang.com
Abstract
STUDY DESIGN: The Load Sharing Classification of spinal fractures was evaluated by 5 observers on 2 occasions.

OBJECTIVE: To evaluate the interobserver and intraobserver reliability of the Load Sharing Classification of spinal fractures in the assessment of thoracolumbar burst fractures.

SUMMARY OF BACKGROUND DATA: The Load Sharing Classification of spinal fractures provides a basis for the choice of operative approach, but the reliability of this classification system has not been established.

METHODS: The radiographic and computed tomography scan images of 45 consecutive patients with thoracolumbar burst fractures were reviewed by 5 observers on 2 occasions 3 months apart. Interobserver reliability was assessed by comparing the fracture classifications determined by the 5 observers. Intraobserver reliability was evaluated by comparing the classifications determined by each observer in the first and second sessions. Ten paired interobserver and 5 intraobserver comparisons were then analyzed using kappa statistics.

RESULTS: All 5 observers agreed on the final classification for 58% and 73% of the fractures on the first and second assessments, respectively. The average kappa coefficient for the 10 paired comparisons among the 5 observers was 0.79 (range 0.73-0.89) for the first assessment and 0.84 (range 0.81-0.95) for the second. Interobserver agreement improved when the 3 components of the classification system were analyzed separately, reaching almost perfect interobserver reliability with average kappa values of 0.90 (range 0.82-0.97) for the first assessment and 0.92 (range 0.83-1) for the second. The kappa values for the 5 intraobserver comparisons ranged from 0.73 to 0.87 (average 0.78), indicating at least substantial agreement; 2 observers showed almost perfect intraobserver reliability. For the 3 components of the classification system, all observers reached almost perfect intraobserver agreement, with kappa values of 0.83 to 0.97 (average 0.89).

CONCLUSIONS: Kappa statistics showed high levels of agreement when the Load Sharing Classification was used to assess thoracolumbar burst fractures. This system can be applied with excellent reliability.
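The reliability figures above rest on Cohen's kappa, which corrects raw percent agreement for the agreement two raters would reach by chance. As a minimal illustrative sketch (not the study's actual analysis code; the observer scores below are hypothetical Load Sharing point totals on the 3-9 scale):

```python
# Minimal sketch of Cohen's kappa for two raters.
# Illustrative only; the observer data are made up, not from the study.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two equal-length lists of categorical ratings."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of cases where the raters match.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: product of each rater's marginal label frequencies.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical Load Sharing scores (3-9 points) from two observers.
obs1 = [7, 6, 8, 7, 5, 9, 6, 7]
obs2 = [7, 6, 8, 6, 5, 9, 6, 8]
print(round(cohens_kappa(obs1, obs2), 2))  # → 0.69
```

In the study's design, each of the 10 interobserver pairs (and each observer's two sessions, for the intraobserver comparisons) would yield one such kappa, which is then averaged and interpreted on the conventional scale (0.61-0.80 substantial, 0.81-1.00 almost perfect).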