Interobserver and intraobserver agreement of Lenke and King classifications for idiopathic scoliosis and the influence of level of professional training

Bibliographic Details
Published in: Spine (Philadelphia, Pa. 1976), 2006-08, Vol. 31 (18), p. 2103-2108
Main Authors: Niemeyer, Thomas; Wolf, Alexandra; Kluba, Susanne; Halm, Henry F.; Dietz, Klaus; Kluba, Torsten; Lenke, Lawrence G.
Format: Article
Language:English
Description
Summary: This is a blinded study of radiographs by observers with different levels of professional training. The objective was to determine whether the level of professional training would affect the reliability of the Lenke and King classifications for adolescent idiopathic scoliosis on nonmeasured and premeasured radiographs. Both classification systems have been studied for their reliability, mainly by observers with a high level of experience in orthopedics and scoliosis surgery using premeasured radiographs. Radiographs of 60 operative cases of adolescent idiopathic scoliosis were examined. On 5 occasions, 3 observers with markedly different levels of professional training measured and classified preoperative radiographs according to Lenke's or King's criteria. Interobserver and intraobserver agreement were determined and quantified using two-rater and multirater kappa statistics. The Lenke and King classifications demonstrated poor to fair interobserver and good intraobserver agreement on nonmeasured radiographs. Both classifications demonstrated good to excellent interobserver agreement on premeasured radiographs. The results confirm that both classifications have good reliability. On nonmeasured radiographs, the degree of professional training and the measurement process appear to influence the outcome. On premeasured radiographs, interobserver agreement does not appear to be influenced by the level of professional training.
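The agreement levels reported in the abstract are kappa statistics: for two raters, Cohen's kappa compares observed agreement against the agreement expected by chance from each rater's marginal label frequencies. A minimal sketch of the two-rater case (the observer labels below are hypothetical, not data from the study):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Two-rater Cohen's kappa for categorical classifications."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed agreement: fraction of cases where both raters assign the same class.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance-expected agreement from each rater's marginal class frequencies.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical curve-type labels from two observers on ten radiographs.
a = [1, 2, 2, 3, 1, 1, 2, 3, 3, 1]
b = [1, 2, 3, 3, 1, 2, 2, 3, 3, 1]
print(round(cohens_kappa(a, b), 3))  # 0.701
```

Kappa of 1 indicates perfect agreement and 0 indicates agreement no better than chance; the "poor to fair" versus "good to excellent" bands in the abstract correspond to conventional cutoffs on this scale. For the study's multirater comparisons, a generalization such as Fleiss' kappa would be used instead.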
ISSN:0362-2436
1528-1159
DOI:10.1097/01.brs.0000231434.93884.c9