Interobserver agreement issues in radiology - 19/09/20
Highlights
• Intra- and interobserver agreement is a critical issue in imaging and must be assessed with the most appropriate test.
• Cohen's kappa test should be used to evaluate inter-rater consistency (i.e., inter-rater reliability) for qualitative (nominal or ordinal) variables.
• The intraclass correlation coefficient should be used to assess agreement for quantitative variables.
• The Bland-Altman method can be used to assess consistency and conformity, but its use should be restricted to the comparison of two raters.
Abstract
Agreement between observers (i.e., inter-rater agreement) can be quantified with various criteria, but the appropriate selection among them is critical. When the measure is qualitative (nominal or ordinal), the proportion of agreement or the kappa coefficient should be used to evaluate inter-rater consistency (i.e., inter-rater reliability). The kappa coefficient is more meaningful than the raw percentage of agreement, because the latter does not account for agreements due to chance alone. When the measures are quantitative, the intraclass correlation coefficient (ICC) should be used to assess agreement, but this should be done with care because there are different ICCs, so it is important to describe the model and type of ICC being used. The Bland-Altman method can be used to assess consistency and conformity, but its use should be restricted to the comparison of two raters.
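The chance correction described above can be illustrated with a minimal sketch of Cohen's kappa for two raters scoring a nominal variable. The ratings below are hypothetical (e.g., lesion present = 1 / absent = 0) and are not from the article; the formula is the standard one, kappa = (p_o − p_e) / (1 − p_e), where p_o is the observed proportion of agreement and p_e the agreement expected by chance from the raters' marginal frequencies.

```python
# Minimal sketch of Cohen's kappa for two raters on a nominal scale.
# The ratings are hypothetical illustration data, not from the article.
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    # Observed proportion of agreement
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance-expected agreement: product of the marginal
    # proportions of each category, summed over categories
    count_a = Counter(rater_a)
    count_b = Counter(rater_b)
    p_e = sum((count_a[c] / n) * (count_b[c] / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)

rater_a = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
rater_b = [1, 0, 0, 1, 0, 1, 1, 1, 0, 1]
# Raw agreement is 8/10 = 0.80, but kappa is lower (~0.583)
# because part of that agreement is expected by chance alone.
print(round(cohen_kappa(rater_a, rater_b), 3))
```

Note how the raw percentage of agreement (0.80) overstates reliability: with these marginal frequencies, two raters guessing independently would already agree about half the time, and kappa discounts exactly that portion.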
Keywords: Reproducibility of results, Interobserver agreement, Radiology, Kappa test, Intraclass correlation coefficient