J Orthop Sci; 2022 Nov 29.
Article in English | MEDLINE | ID: mdl-36460558

ABSTRACT

BACKGROUND: Several classification systems have been developed to support orthopedic surgeons in the diagnosis, treatment, and prognosis of distal radius fracture (DRF). However, the best classification system for this fracture remains controversial. We aimed to assess the reliability of three different DRF classifications among orthopedists in training (medical residents).

METHODS: Orthopedic residents (n = 22) evaluated thirty cases of DRF in anteroposterior and lateral projections at three different time points (0, 6, and 12 months). Each radiograph was classified with three different systems: Frykman, AO/OTA, and Jupiter-Fernandez. All assessments were blinded to the investigators. Inter- and intra-observer reliability was evaluated using Cohen's kappa coefficient. An additional analysis was performed for simpler sub-classifications of the AO/OTA system (27, 9, or 3 groups).

RESULTS: Inter-observer agreement for the AO/OTA, Frykman, and Jupiter-Fernandez classifications was slight (k = 0.15), fair (k = 0.31), and fair (k = 0.30), respectively. Intra-observer agreement showed similar results: AO/OTA, k = 0.14; Frykman, k = 0.28; and Jupiter-Fernandez, k = 0.28. When the AO/OTA classification was simplified (9 or 3 descriptions), inter-observer agreement improved from slight (k = 0.16) to fair (k = 0.21 and k = 0.30, respectively). A similar improvement from slight (k = 0.14) to fair (k = 0.32 and k = 0.21) was detected for intra-observer agreement.

CONCLUSIONS: The more complex the DRF classification system, the more difficult it is to reach reliable inter- and intra-observer agreement among orthopedic trainees. Senior residents did not necessarily show higher kappa values in DRF classifications.
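As a purely illustrative aside (not part of the study), the agreement statistic reported above, Cohen's kappa, is defined for pairs of raters; one common way to summarize inter-observer agreement across a group is to compute kappa for every pair of raters and average the values. The minimal sketch below assumes invented ratings and uses scikit-learn's cohen_kappa_score; the rater names, case counts, and category labels are hypothetical and are not taken from the study.

from itertools import combinations
from sklearn.metrics import cohen_kappa_score

# Hypothetical example: three raters classifying five radiographs with a
# simplified 3-group scheme (A, B, C). Data are illustrative only.
ratings = {
    "rater_1": ["A", "B", "C", "A", "B"],
    "rater_2": ["A", "B", "B", "A", "C"],
    "rater_3": ["A", "C", "C", "A", "B"],
}

# Cohen's kappa is a pairwise statistic, so compute it for each pair of
# raters and average to summarize inter-observer agreement.
pairwise = [
    cohen_kappa_score(ratings[a], ratings[b])
    for a, b in combinations(ratings, 2)
]
print(f"Mean pairwise kappa: {sum(pairwise) / len(pairwise):.2f}")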
