Clinical instrument to retrospectively capture levels of EDSS.
Ciotti, John Robert; Sanders, Noah; Salter, Amber; Berger, Joseph R; Cross, Anne Haney; Chahin, Salim.
Affiliation
  • Ciotti JR; Department of Neurology, Washington University in St. Louis; St. Louis, MO, USA.
  • Sanders N; University of Minnesota Medical School; Minneapolis, MN, USA.
  • Salter A; Division of Biostatistics, Washington University in St. Louis; St. Louis, MO, USA.
  • Berger JR; Department of Neurology, University of Pennsylvania; Philadelphia, PA, USA.
  • Cross AH; Department of Neurology, Washington University in St. Louis; St. Louis, MO, USA.
  • Chahin S; Department of Neurology, Washington University in St. Louis; St. Louis, MO, USA. Electronic address: chahins@wustl.edu.
Mult Scler Relat Disord; 39: 101884, 2020 Apr.
Article in English | MEDLINE | ID: mdl-31865272
ABSTRACT

BACKGROUND:

The Expanded Disability Status Scale (EDSS), a common outcome measure in multiple sclerosis (MS), is obtained prospectively through a direct, standardized evaluation. The objective of this study was to develop and validate an algorithm to derive EDSS scores from prior neurological clinical documentation.

METHODS:

The algorithm utilizes data from the history, review of systems, and physical examination. EDSS scores formally obtained from research patients were compared to captured EDSS (c-EDSS) scores. To test inter-rater reliability, a second investigator captured scores from a subset of patients. Agreement between formal and c-EDSS scores was assessed using a weighted kappa. Clinical concordance was defined as a difference of one step in the EDSS (0.5 points) and functional system (1.0 point) scores.
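A minimal sketch of the agreement analysis described above, not the authors' code: it assumes linear kappa weighting, uses scikit-learn's cohen_kappa_score, and the paired scores shown are hypothetical illustrations rather than study data.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Hypothetical paired scores: formally obtained EDSS vs. captured EDSS (c-EDSS).
formal_edss = np.array([2.0, 3.5, 6.0, 1.5, 4.0])
captured_edss = np.array([2.0, 3.0, 6.5, 1.5, 4.5])

# EDSS moves in 0.5-point steps; doubling maps each score to an integer
# category so the scores can be treated as ordinal labels for the kappa.
kappa = cohen_kappa_score(
    (formal_edss * 2).astype(int),
    (captured_edss * 2).astype(int),
    weights="linear",  # assumption: the weighting scheme is not stated in the abstract
)

# Clinical concordance as defined above: scores within one EDSS step (0.5 points).
concordance = np.mean(np.abs(formal_edss - captured_edss) <= 0.5)

print(f"weighted kappa: {kappa:.2f}")
print(f"mean absolute difference: {np.mean(np.abs(formal_edss - captured_edss)):.2f}")
print(f"clinical concordance: {concordance:.0%}")
```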

RESULTS:

Clinical documentation from 92 patients (EDSS range 0.0-8.5) was assessed. Substantial agreement was observed between the c-EDSS and formal EDSS scores (kappa 0.80; 95% CI 0.74-0.86), with a mean difference between scores of 0.16. Clinical concordance was 78%. Near-perfect agreement was found between the two raters (kappa 0.89; 95% CI 0.84-0.95), with a mean inter-rater difference in c-EDSS of 0.23.

CONCLUSIONS:

This algorithm reliably captures EDSS scores retrospectively, with substantial agreement with formally obtained EDSS scores and high inter-rater reliability. It may have practical implications in the clinic, in MS research, and in clinical trials.
Full text: 1 | Database: MEDLINE | Language: English | Year: 2020 | Type: Article