
Multirater Agreement of Arthroscopic Meniscal Lesions

Bibliographic Details
Published in: The American Journal of Sports Medicine, 2004-12, Vol. 32 (8), p. 1937-1940
Main Authors: Dunn, Warren R., Wolf, Brian R., Amendola, Annunziato, Andrish, Jack T., Kaeding, Christopher, Marx, Robert G., McCarty, Eric C., Parker, Richard D., Wright, Rick W., Spindler, Kurt P.
Format: Article
Language: English
Summary:
Background: Establishing the validity of classification schemes is a crucial preparatory step that should precede multicenter studies. There are no studies investigating the reproducibility of arthroscopic classification of meniscal pathology among multiple surgeons at different institutions.
Hypothesis: Arthroscopic classification of meniscal pathology is reliable and reproducible and suitable for multicenter studies that involve multiple surgeons.
Study Design: Multirater agreement study.
Methods: Seven surgeons reviewed a video of 18 meniscal tears and completed a meniscal classification questionnaire. Multirater agreement was calculated based on the proportion of agreement, the kappa coefficient, and the intraclass correlation coefficient.
Results: There was a 46% agreement on the central/peripheral location of tears (κ = 0.30), an 80% agreement on the depth of tears (κ = 0.46), a 72% agreement on the presence of a degenerative component (κ = 0.44), a 71% agreement on whether lateral tears were central to the popliteal hiatus (κ = 0.42), a 73% agreement on the type of tear (κ = 0.63), an 87% agreement on the location of the tear (κ = 0.61), and an 84% agreement on the treatment of tears (κ = 0.66). There was considerable agreement among surgeons on length, with an intraclass correlation coefficient of 0.78 (95% confidence interval, 0.57 to 0.92; P < .001).
Conclusions: Arthroscopic grading of meniscal pathology is reliable and reproducible.
Clinical Relevance: Surgeons can reliably classify meniscal pathology and agree on treatment, which is important for multicenter trials.
Keywords: multicenter; meniscus; multirater agreement; Multicenter Orthopaedic Outcomes Network (MOON)
ISSN: 0363-5465, 1552-3365
DOI: 10.1177/0363546504264586
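
Note on the agreement statistics: the abstract reports the proportion of agreement and the kappa coefficient for the categorical items, and an intraclass correlation coefficient for tear length, but it does not state which multirater kappa variant or software was used. The Python sketch below is illustrative only; it assumes a Fleiss-style multirater kappa computed from a lesions-by-categories table of rating counts, and the data, function names, and use of NumPy are assumptions rather than details taken from the article.

import numpy as np

def pairwise_agreement(counts):
    """Mean proportion of agreeing rater pairs per lesion."""
    counts = np.asarray(counts, dtype=float)
    n = counts[0].sum()  # raters per lesion (assumed constant)
    return np.mean((np.sum(counts ** 2, axis=1) - n) / (n * (n - 1)))

def fleiss_kappa(counts):
    """Fleiss' kappa for a lesions-by-categories matrix of rating counts."""
    counts = np.asarray(counts, dtype=float)
    n_lesions = counts.shape[0]
    n_raters = counts[0].sum()
    p_bar = pairwise_agreement(counts)                 # observed agreement
    p_j = counts.sum(axis=0) / (n_lesions * n_raters)  # category prevalences
    p_e = np.sum(p_j ** 2)                             # chance-expected agreement
    return (p_bar - p_e) / (1.0 - p_e)

# Hypothetical counts: rows = lesions, columns = tear-type categories;
# 7 raters per lesion (mirroring the 7-surgeon design, data invented).
ratings = [
    [5, 2, 0],
    [7, 0, 0],
    [1, 4, 2],
    [0, 6, 1],
    [2, 2, 3],
]

print(f"proportion of agreement = {pairwise_agreement(ratings):.2f}")
print(f"Fleiss' kappa           = {fleiss_kappa(ratings):.2f}")

For the continuous length ratings, the reported intraclass correlation (0.78; 95% CI, 0.57 to 0.92) would come from a raters-by-lesions table of measured lengths; since the article does not specify the ICC model, any reproduction (for example, a two-way random-effects ICC) would likewise be an assumption.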