Automated Quantification of Eye Tics Using Computer Vision and Deep Learning Techniques.
Conelea, Christine; Liang, Hengyue; DuBois, Megan; Raab, Brittany; Kellman, Mia; Wellen, Brianna; Jacob, Suma; Wang, Sonya; Sun, Ju; Lim, Kelvin.
Affiliations
  • Conelea C; Department of Psychiatry & Behavioral Sciences, University of Minnesota, Minneapolis, Minnesota, USA.
  • Liang H; Department of Electrical & Computer Engineering, University of Minnesota, Minneapolis, Minnesota, USA.
  • DuBois M; Department of Psychiatry & Behavioral Sciences, University of Minnesota, Minneapolis, Minnesota, USA.
  • Raab B; Department of Psychiatry & Behavioral Sciences, University of Minnesota, Minneapolis, Minnesota, USA.
  • Kellman M; Department of Psychiatry & Behavioral Sciences, University of Minnesota, Minneapolis, Minnesota, USA.
  • Wellen B; Department of Psychiatry & Behavioral Sciences, University of Minnesota, Minneapolis, Minnesota, USA.
  • Jacob S; Department of Psychiatry & Behavioral Sciences, University of Minnesota, Minneapolis, Minnesota, USA.
  • Wang S; Department of Neurology, University of Minnesota, Minneapolis, Minnesota, USA.
  • Sun J; Department of Computer Science & Engineering, University of Minnesota, Minneapolis, Minnesota, USA.
  • Lim K; Department of Psychiatry & Behavioral Sciences, University of Minnesota, Minneapolis, Minnesota, USA.
Mov Disord ; 39(1): 183-191, 2024 Jan.
Article in English | MEDLINE | ID: mdl-38146055
ABSTRACT

BACKGROUND:

Tourette syndrome (TS) tics are typically quantified using "paper and pencil" rating scales that are susceptible to factors that adversely impact validity. Video-based methods to more objectively quantify tics have been developed but are challenged by reliance on human raters and procedures that are resource intensive. Computer vision approaches that automate detection of atypical movements may be useful to apply to tic quantification.

OBJECTIVE:

The current proof-of-concept study applied a computer vision approach to train a supervised deep learning algorithm to detect eye tics in video, the most common tic type in patients with TS.

METHODS:

Videos (N = 54) of 11 adolescent patients with TS were rigorously coded by trained human raters to identify 1.5-second clips depicting "eye tic events" (N = 1775) and "non-tic events" (N = 3680). Clips were encoded into three-dimensional facial landmarks. Supervised deep learning was applied to processed data using random split and disjoint split regimens to simulate model validity under different conditions.
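A minimal illustrative sketch (not the authors' code) contrasting the two validation regimens described above: a "random split" that mixes clips from all patients across training and validation, and a "disjoint split" that holds out entire patients so validation subjects are unseen. The clip shape, landmark count, and frame rate below are assumptions for illustration only, and the features are placeholder arrays rather than real 3D facial landmarks.

import numpy as np
from sklearn.model_selection import train_test_split, GroupShuffleSplit

rng = np.random.default_rng(0)

# Placeholder clips: 1.5-second windows of 3D facial landmarks (values assumed).
n_clips, n_frames, n_landmarks = 200, 45, 68               # e.g., 1.5 s at 30 fps
X = rng.normal(size=(n_clips, n_frames, n_landmarks, 3))    # (x, y, z) per landmark
y = rng.integers(0, 2, size=n_clips)                        # 1 = eye tic, 0 = non-tic
patient_id = rng.integers(0, 11, size=n_clips)              # 11 patients, as in the study

# Random split: clips from the same patient may appear in both sets.
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.2, random_state=0)

# Disjoint split: whole patients are held out, so validation patients are unseen.
gss = GroupShuffleSplit(n_splits=1, test_size=0.2, random_state=0)
train_idx, val_idx = next(gss.split(X, y, groups=patient_id))
X_tr_d, X_va_d, y_tr_d, y_va_d = X[train_idx], X[val_idx], y[train_idx], y[val_idx]

print("random split:", X_tr.shape, X_va.shape)
print("disjoint split:", X_tr_d.shape, X_va_d.shape,
      "held-out patients:", sorted(set(patient_id[val_idx])))

The disjoint (subject-wise) split is the stricter test of generalization, since the model cannot rely on patient-specific facial characteristics seen during training.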

RESULTS:

Area under receiver operating characteristic curve was 0.89 for the random split regimen, indicating high accuracy in the algorithm's ability to properly classify eye tic vs. non-eye tic movements. Area under receiver operating characteristic curve was 0.74 for the disjoint split regimen, suggesting that algorithm generalizability is more limited when trained on a small patient sample.
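For clarity, a brief sketch of how an area under the receiver operating characteristic curve (AUROC) of the kind reported above is computed from a classifier's validation scores; the labels and scores here are synthetic placeholders, not study data.

import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
y_true = rng.integers(0, 2, size=500)                              # 1 = eye tic clip
scores = np.clip(y_true * 0.6 + rng.normal(0.2, 0.3, size=500), 0, 1)  # model outputs

print(f"AUROC = {roc_auc_score(y_true, scores):.2f}")              # 0.5 = chance, 1.0 = perfect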

CONCLUSIONS:

The algorithm was successful in detecting eye tics in unseen validation sets. Automated tic detection from video is a promising approach for tic quantification that may have future utility in TS screening, diagnostics, and treatment outcome measurement. © 2023 The Authors. Movement Disorders published by Wiley Periodicals LLC on behalf of International Parkinson and Movement Disorder Society.
Full text: 1 Collections: 01-international Database: MEDLINE Main subjects: Tic Disorders / Tourette Syndrome / Tics / Deep Learning / Movement Disorders Limits: Adolescent / Humans Language: English Journal: Mov Disord Journal subject: Neurology Year of publication: 2024 Document type: Article Country of affiliation: United States Country of publication: United States