J Hand Surg Am; 49(5): 482-485, 2024 May.
Article in English | MEDLINE | ID: mdl-38372689

ABSTRACT

Observer reliability studies for fracture classification systems evaluate agreement using Cohen's κ and absolute agreement as outcome measures. Cohen's κ is a chance-corrected measure of agreement and can range between 0 (agreement no better than chance) and 1 (perfect agreement). Absolute agreement is the percentage of cases on which observers assign the same rating. Some studies report high absolute agreement but a relatively low κ value, which is counterintuitive. This phenomenon is referred to as the Kappa Paradox. The objective of this article was to explain the statistical phenomenon of the Kappa Paradox and to help readers and researchers recognize and prevent it.
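The paradox is easiest to see with a small worked example. The sketch below is not from the article; the data are hypothetical. It computes absolute agreement and Cohen's κ for two observers classifying 100 fractures as "simple" or "complex" when one category strongly predominates: the observers agree on 90% of cases, yet κ is only about 0.23 because agreement expected by chance is already close to 0.87.

    from collections import Counter

    def cohens_kappa(ratings_a, ratings_b):
        """Absolute agreement, chance agreement, and Cohen's kappa for two raters."""
        assert len(ratings_a) == len(ratings_b)
        n = len(ratings_a)
        categories = set(ratings_a) | set(ratings_b)

        # Observed (absolute) agreement: proportion of cases rated identically.
        p_observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n

        # Chance agreement: product of the raters' marginal proportions per
        # category, summed over all categories.
        count_a = Counter(ratings_a)
        count_b = Counter(ratings_b)
        p_chance = sum((count_a[c] / n) * (count_b[c] / n) for c in categories)

        kappa = (p_observed - p_chance) / (1 - p_chance)
        return p_observed, p_chance, kappa

    # Hypothetical ratings of 100 fractures: 88 cases both call "simple",
    # 2 cases both call "complex", and 10 discordant cases (5 each way).
    ratings_a = ["simple"] * 88 + ["simple"] * 5 + ["complex"] * 5 + ["complex"] * 2
    ratings_b = ["simple"] * 88 + ["complex"] * 5 + ["simple"] * 5 + ["complex"] * 2

    p_obs, p_chance, kappa = cohens_kappa(ratings_a, ratings_b)
    print(f"Absolute agreement: {p_obs:.2f}")    # 0.90
    print(f"Chance agreement:   {p_chance:.2f}") # 0.87
    print(f"Cohen's kappa:      {kappa:.2f}")    # 0.23

The driver is the skewed marginal distribution: when nearly all cases fall into one category, expected chance agreement is high, so even near-perfect observed agreement leaves little room for κ to rise above zero.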


Subject(s)
Fractures, Bone; Humans; Fractures, Bone/classification; Observer Variation; Reproducibility of Results